1 vote

I am facing an issue where the remote server returns error 400 (Bad Request) while uploading files to Azure as block blobs. The strange thing is that sometimes the code works for a particular file, and sometimes it fails for the same file.

My code looks like this:

        List<string> blockIdList = new List<string>();

        using (var file = new FileStream(_path, FileMode.Open, FileAccess.Read))
        {
            int blockId = 0;

            int blockSize = 4096;
            // open file
            while (file.Position < file.Length)
            {
                // calculate buffer size (blockSize is in KB)
                long bufferSize = blockSize * 1024 < file.Length - file.Position ? blockSize * 1024 : file.Length - file.Position;
                byte[] buffer = new byte[bufferSize];
                // read data to buffer
                file.Read(buffer, 0, buffer.Length);
                // save data to memory stream and put to storage
                using (var stream = new MemoryStream(buffer))
                {
                    // set stream position to start
                    stream.Position = 0;
                    // convert block id to Base64 encoded string
                    var blockIdBase64 = Convert.ToBase64String(System.BitConverter.GetBytes(blockId));

                    blockBlob.PutBlock(blockIdBase64, stream, null);
                    blockIdList.Add(blockIdBase64);
                    // increase block id
                    blockId++;
                }
            }

            blockBlob.PutBlockList(blockIdList);

            file.Close();
        } 

I don't know why this error is thrown, and I am looking for a possible solution.

Thanks


2 Answers

5 votes

One thing I noticed is that you're using an integer value as the block ID. This could be one reason why your upload is failing, because the lengths of all block IDs must be the same. Your upload code would work if the file is split into 10 blocks (blockId = 0 - 9). However, if the file is split into more than 10 blocks, the upload would fail.

My recommendation would be to pad the string with zeros so that all the block IDs have the same length. Since you can split a blob into a maximum of 50,000 blocks, blockId.ToString("d6") should do the trick.
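To illustrate, here is a minimal sketch of a helper that generates fixed-length block IDs the way described above (the class and method names are my own, not from the original code): the counter is zero-padded to six digits before being Base64-encoded, so every ID comes out the same length.

```csharp
using System;
using System.Text;

public class BlockIdDemo
{
    // Pad the block counter to 6 digits (a blob can have at most
    // 50,000 blocks), then Base64-encode the padded string.
    public static string GetBlockId(int blockId)
    {
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId.ToString("d6")));
    }

    public static void Main()
    {
        // Every generated ID has the same length, regardless of the
        // counter value, which is what the blob service requires.
        Console.WriteLine(BlockIdDemo.GetBlockId(0));     // MDAwMDAw
        Console.WriteLine(BlockIdDemo.GetBlockId(49999)); // MDQ5OTk5
    }
}
```

You would then pass the result of `GetBlockId(blockId)` to `PutBlock` instead of `Convert.ToBase64String(System.BitConverter.GetBytes(blockId))`.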

You may also find this blog post useful: http://gauravmantri.com/2013/05/18/windows-azure-blob-storage-dealing-with-the-specified-blob-or-block-content-is-invalid-error/.

0 votes

I too was facing this problem. I gave a few incorrect parameters to the AzCopy command, and that was it: every new AzCopy command I issued started giving that frustrating error. I looked up a bunch of material on the internet, including Gaurav Mantri's blog post, which talks about 'committing' uncommitted blocks.

One easy way I found to purge every block from a container was to use a tool called "Azure Storage Explorer". It displayed all the blocks; I just selected them and deleted them all. After this delete, my AzCopy worked peacefully!

(Note that these invalid or uncommitted blocks do not show up in the Azure management portal. I wonder why the Azure team doesn't support that directly; it is quite a PITA :-/ )