48
votes

I'm trying to upload an image to Windows Azure Blob storage and I'm getting the following error, which I can't get past.

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

The error occurs when I try to create a container.

   container.CreateIfNotExists()

Here is my code

try
{
    Microsoft.WindowsAzure.Storage.CloudStorageAccount storageAccount = Microsoft.WindowsAzure.Storage.CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient(); 

    // Retrieve a reference to a container. 
    CloudBlobContainer container = blobClient.GetContainerReference("samples");

    // Create the container if it doesn't already exist.
    // here is the error
    if (container.CreateIfNotExists())
    {
        container.SetPermissions(
            new BlobContainerPermissions
            {
                PublicAccess = BlobContainerPublicAccessType.Blob
            });
    }
    
    CloudBlockBlob blockBlob = container.GetBlockBlobReference("Image1");
    using (var fileStream = System.IO.File.OpenRead(@"Path"))
    {
        blockBlob.UploadFromStream(fileStream);
    }
}
catch (StorageException)
{
    // Rethrow without resetting the stack trace ("throw ex1;" would lose it)
    throw;
}

I have tried a lot of options in my code but still get the error.

16
Which version of the storage client library are you using? Are you getting this error when trying to create a container in the cloud or in the local storage emulator? If it is the local storage emulator, which version of the emulator are you using? – Gaurav Mantri
OK. Please check two things: 1) your account key is correct, and 2) the clock on your computer is correct. These are the two reasons that could result in this error. – Gaurav Mantri
I see ... I don't think this matters. What you need to check is whether your computer's clock is slower than GMT time. Please check the GMT time on your computer (DateTime.UtcNow) and compare it with the actual GMT time (you would need to find a site which tells you the correct GMT time). If the difference is more than 15 minutes, you will get this error. – Gaurav Mantri
I don't think time is the problem; my current UTC time is {30/06/2014 15:23:57}, and I checked the GMT time on this site wwp.greenwichmeantime.com/info/current-time and it seems to be the same. :/ – Fábio Henrique
Then please check the account key. – Gaurav Mantri

16 Answers

33
votes

My PC's clock was off by 1 hour, as suggested by others in the comments. Correcting it solved the problem.
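The tolerance mentioned in the comments above is about 15 minutes of skew between your clock and the server's. A minimal Python sketch of that check (the helper name is my own, not part of any SDK):

```python
from datetime import datetime, timedelta, timezone

# Azure Storage rejects requests whose timestamp differs from server
# time by more than roughly 15 minutes, so a skewed clock causes this error.
MAX_SKEW = timedelta(minutes=15)

def clock_within_tolerance(local_utc: datetime, reference_utc: datetime) -> bool:
    """Return True if the local clock is close enough to the reference time."""
    return abs(local_utc - reference_utc) <= MAX_SKEW

# Example: a clock that is one hour fast would be rejected.
now = datetime(2014, 6, 30, 15, 23, 57, tzinfo=timezone.utc)
print(clock_within_tolerance(now + timedelta(hours=1), now))   # False
print(clock_within_tolerance(now + timedelta(minutes=5), now)) # True
```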

10
votes

I am using the .NET SDK to upload files to Azure Blob storage with metadata. I got the error "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature." while uploading, but only for a few of the files, not all of them.

The issue: if the file has metadata, the metadata values must not contain special characters (�) or extra spaces at the start or end of the value.

Once the metadata values are corrected, the files upload successfully.
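A small Python sketch of the kind of cleanup described above (the function name and the exact cleaning rules are my own; adjust to whatever characters your signature rejects):

```python
def sanitize_metadata(metadata: dict) -> dict:
    """Strip leading/trailing whitespace and drop non-ASCII characters
    from metadata values, which can otherwise break the request signature."""
    cleaned = {}
    for key, value in metadata.items():
        value = value.strip()                                   # remove edge spaces
        value = value.encode('ascii', errors='ignore').decode() # drop non-ASCII chars
        cleaned[key] = value
    return cleaned

meta = {"author": " Fábio ", "category": "images "}
print(sanitize_metadata(meta))  # {'author': 'Fbio', 'category': 'images'}
```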

9
votes

I got this message when trying to access Blob Storage through the REST API endpoint.

Below is the response I got when I invoked the List Containers operation with an Authorization header:

<?xml version="1.0" encoding="utf-8"?>
<Error>
    <Code>AuthenticationFailed</Code>
    <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:096c6d73-f01e-0054-6816-e8eaed000000
Time:2019-03-31T23:08:43.6593937Z</Message>
    <AuthenticationErrorDetail>Authentication scheme Bearer is not supported in this version.</AuthenticationErrorDetail>
</Error>

The solution was to include the header below:

x-ms-version: 2017-11-09

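For illustration, a hand-built List Containers request in Python (the account name and token are placeholders; the point is the version header):

```python
account = "mystorageaccount"  # hypothetical account name
url = f"https://{account}.blob.core.windows.net/?comp=list"
headers = {
    "Authorization": "Bearer <access-token>",  # placeholder AAD token
    # Without pinning a recent service version, the request fails with
    # "Authentication scheme Bearer is not supported in this version."
    "x-ms-version": "2017-11-09",
}
# e.g. with the requests library: requests.get(url, headers=headers)
print(url)  # https://mystorageaccount.blob.core.windows.net/?comp=list
```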
6
votes

In my case I was passing the storage connection string, including the shared access signature (SAS), as an argument to a console application. '%' is a special character in command-line parameters, and '%' appears in the SAS. You have to escape each percent sign by doubling it: %%.
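A tiny Python helper illustrating the doubling (the helper name is hypothetical; the same rule applies when embedding a SAS in a .bat file or cmd.exe argument):

```python
def escape_for_cmd(connection_string: str) -> str:
    """Double each % so cmd.exe does not treat it as a variable reference."""
    return connection_string.replace("%", "%%")

sas = "sig=abc%2Bdef%3D"
print(escape_for_cmd(sas))  # sig=abc%%2Bdef%%3D
```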

5
votes

In my case it was the shared access signature (SAS) that had expired. Creating a new SAS in portal.azure.com with an end date a year (or more) in the future fixed the problem.

4
votes

ERROR MESSAGE


Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

SOLUTION


I was facing the same issue in my application, and I resolved it by generating a shared access signature (in the Azure portal) for key2 instead of key1 (Settings > Shared access signature). Changing the key fixed the error. Also keep in mind that the connection string, if used, should be updated as well (Settings > Access keys).

2
votes

My web app didn't get access to Table Storage when running as an Azure App Service, even though the connection string was exactly the same one I used during development on my local machine.

To solve the problem, I added a system-assigned identity to my Azure App Service and gave it the Storage Account Contributor role on the storage account.

The application was .NET Core 3.1 using the WindowsAzure.Storage .NET library, version 9.3.2.

In short: add a system-assigned identity to the Azure App Service, then assign the Storage Account Contributor role on the storage account.

1
votes

Check the timezone of your computer or mobile phone.

1
votes

In my case (Python), I had to encode the upload content from string to bytes. I'm not sure why the error message says this, though:

azure.core.exceptions.ClientAuthenticationError: 
Server failed to authenticate the request. 
Make sure the value of Authorization header is formed correctly including the signature.

Here is what worked.

Versions:

$ python -V
Python 3.7.7
$ pip list | grep azure
azure-core               1.8.1
azure-storage-blob       12.4.0

and the Python function to upload:

import logging
import traceback

logger = logging.getLogger(__name__)

def upload_blob(blob_service_client, content: str, container_name, blob_name):
    blob_client = blob_service_client.get_blob_client(container_name, blob_name)
    try:
        # upload_blob accepts bytes; encoding the str here is what fixed the error
        content_bytes = content.encode('utf-8')
        blob_client.upload_blob(content_bytes)
        return True
    except Exception:
        logger.error(traceback.format_exc())
        return False
1
votes

We ran into the same error when we tried to connect to a storage account from a simple Azure App Service. After a lot of investigation we asked official Microsoft Support for help. They checked our resources and infrastructure in Azure and pointed out that the problem was related to Application Insights. Here is the official answer:

there is an additional header 'x-ms-request-root-id' in the request which gets added after the authorization header is signed for the storage client request. As a result, when the request is authenticated by Azure Storage it results in a 403. This seems to be getting added by Application Insights if dependency tracking is enabled.

So after disabling dependency tracking in /dev/wwwroot/ApplicationInsights.config, the error disappeared and the App Service can connect to the storage account without any problem.
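For illustration, dependency tracking is typically registered as a telemetry module in ApplicationInsights.config; commenting out the entry below is one way to disable it. (This is a sketch: the exact type and assembly names may differ by SDK version.)

```xml
<!-- ApplicationInsights.config (fragment): commenting out the dependency
     collector stops request headers from being injected after signing. -->
<TelemetryModules>
  <!--
  <Add Type="Microsoft.ApplicationInsights.DependencyCollector.DependencyTrackingTelemetryModule, Microsoft.AI.DependencyCollector"/>
  -->
</TelemetryModules>
```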

1
votes

(Python) I had the same error on a resource GET request with a SAS token, but when trying the GET from my local machine (Python, browser, etc.) it always worked fine. I saw @gmaklari's and @user2243747's comments, which led me to use the requests library instead of urlretrieve (my initial intent was to add headers to the request). It works fine now; I didn't have to add any headers.

0
votes

I migrated an app from one machine to another, carrying the connection string across. The public IP was unchanged and the URL was not expired, yet I got this error.

It appears that, once used, a connection string is tied to that machine; perhaps other fingerprints are added on first use.

I have no evidence for this, but it worked after generating a new connection string.

0
votes

I was getting the same error, but what is really strange is that I was getting it in 2 of the 3 storage accounts I was running the code against. What fixed it for me was updating the Azure.Storage.Files.DataLake library to preview version 12.2.2. I tried all the other suggestions (time sync, etc.); none of them worked. Really weird issue.

0
votes

If you're debugging locally but connecting to remote Azure storage and get this error, check that your AzureWebJobsStorage string is up to date in your local.settings.json. There are multiple reasons why this can happen, and this is one of them.

0
votes

I had the same error. The storage connection worked when debugging locally but didn't work when deployed to an Azure App Service. I had deleted a storage account with the same name a bit earlier and recreated it in a different Azure subscription. Could there be a bug in Azure such that, even though the resource name was made available again, it was not "freed correctly" inside Azure, whatever that means?

When I created a storage account with a different name, I didn't get the error anymore.

0
votes

If you see this error on a PUT request, make sure the new blob name is provided in the URL:

https://{storageAccount}.blob.core.windows.net/{containerName}/{**NEW BLOB NAME TO CREATE**}?{SAS Token}
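A quick Python sketch of building that URL (names are placeholders; the key point is that the blob name segment must be present):

```python
def blob_put_url(account: str, container: str, blob_name: str, sas_token: str) -> str:
    """Build the PUT URL; omitting the blob name segment yields the auth error."""
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"

print(blob_put_url("myaccount", "samples", "Image1.png", "sv=...&sig=..."))
# https://myaccount.blob.core.windows.net/samples/Image1.png?sv=...&sig=...
```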