5 votes

For a project I'm using Azure blob storage to store uploaded image files. I'm also displaying the uploaded images on the website - however, that's where things go wrong: every other request to an image in blob storage results in a 400 - Multiple condition headers not supported.

Reading up on this error eventually leads me to the following documentation about specifying conditional request headers: http://msdn.microsoft.com/en-us/library/windowsazure/dd179371.aspx

That page says the following about specifying multiple conditional headers:

If a request specifies both the If-None-Match and If-Modified-Since headers, the request is evaluated based on the criteria specified in If-None-Match.

If a request specifies both the If-Match and If-Unmodified-Since headers, the request is evaluated based on the criteria specified in If-Match.

With the exception of the two combinations of conditional headers listed above, a request may specify only a single conditional header. Specifying more than one conditional header results in status code 400 (Bad Request).
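
For instance, my understanding is that a conditional GET like the following sketch should be accepted (the URL and ETag value here are placeholders for illustration, not taken from my actual traffic):

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    class ConditionalGetSketch
    {
        static async Task Main()
        {
            // Placeholder blob URL and ETag.
            var url = "https://myaccount.blob.core.windows.net/images/photo.jpg";
            using var client = new HttpClient();

            var request = new HttpRequestMessage(HttpMethod.Get, url);
            // Allowed combination: If-None-Match together with If-Modified-Since.
            // Per the documentation, only If-None-Match is evaluated.
            request.Headers.IfNoneMatch.Add(new EntityTagHeaderValue("\"0x8D0ABCDEF\""));
            request.Headers.IfModifiedSince = DateTimeOffset.UtcNow.AddDays(-1);

            var response = await client.SendAsync(request);
            // Any other combination of two conditional headers should come back
            // as 400 (Bad Request) from the blob service.
            Console.WriteLine(response.StatusCode);
        }
    }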

I believe the requests sent by Chrome meet all the requirements outlined by this documentation, and yet I still receive that error.

Does anyone have experience with Azure blob storage that might help overcome this issue? I'd be most grateful!

The request as sent by Chrome: [screenshot]

The XML response as returned by the blob storage service: [screenshot]

Could there be a proxy/cache server between you and Azure that is adding its own conditional headers? What happens when you do the same test from a different network (i.e. from an Azure VM)? - kwill
Also, can you tell us how big are the files stored in blob storage? - Gaurav Mantri
The files are 2 MB at most - just normal image uploads, really. The proxy/cache server was an interesting suggestion: I tested from an Azure VM and could not reproduce the error, so I assume the shared office space I'm in must add its own conditional headers. That creates a problem of its own, however: the website we're working on is meant to be used by employees in medium to large companies - the kind of companies that might also employ proxy/cache servers for their office WiFi. Is there something we could do to work around this (short of routing requests through our web role)? - Rogier Pennink
Have you tried using HTTPS? - viperguynaz

1 Answer

2 votes

Yes, it is because of a cache - at least it was in my case. I am using SAS, so my solution was to add an extra query-string parameter to avoid caching.

    // token = SharedAccessSignature (the SAS query string, starting with '?')
    string tick = $"&{DateTimeOffset.UtcNow.Ticks}";
    Uri url = new Uri(file.StorageUri.PrimaryUri.ToString() + token + tick);

The extra parameter should be ignored by the web application; its only purpose is to make each URL unique, so an intermediate proxy or cache can never serve a previously cached copy.
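
For completeness, here is a minimal, self-contained sketch of the same cache-busting idea using the classic WindowsAzure.Storage SDK. It assumes the file is a CloudBlockBlob; the read-only permission and one-hour expiry are assumptions for illustration, not part of the original snippet:

    using System;
    using Microsoft.WindowsAzure.Storage.Blob;

    static class CacheBustingSas
    {
        public static Uri GetImageUri(CloudBlockBlob blob)
        {
            // Generate a read-only SAS token valid for one hour (assumed policy).
            var policy = new SharedAccessBlobPolicy
            {
                Permissions = SharedAccessBlobPermissions.Read,
                SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
            };
            string token = blob.GetSharedAccessSignature(policy); // starts with '?'

            // Append the current tick count so every generated URL is unique;
            // an intermediate proxy/cache then has nothing cached to serve.
            string tick = $"&{DateTimeOffset.UtcNow.Ticks}";
            return new Uri(blob.StorageUri.PrimaryUri.ToString() + token + tick);
        }
    }

Because the tick value changes on every render, the browser and any intermediate proxy treat each request as a brand-new resource and go straight to blob storage.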