0 votes

I'm converting a website from a standard ASP.NET website to Azure. The site previously took an Excel file uploaded by an administrative user and saved it on the file system. As part of the migration, I'm saving this file to Azure Storage instead. It works fine when running against local development storage through the Azure SDK. (I'm using version 1.3, since I didn't want to upgrade mid-development.)

When I point the code at Azure Storage itself, though, the process usually fails. The error I get is:

System.IO.IOException occurred
  Message=Unable to read data from the transport connection: The connection was closed.
  Source=Microsoft.WindowsAzure.StorageClient
  StackTrace:
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.get_Result()
       at Microsoft.WindowsAzure.StorageClient.Tasks.Task`1.ExecuteAndWait()
       at Microsoft.WindowsAzure.StorageClient.CloudBlob.UploadFromStream(Stream source, BlobRequestOptions options)
       at Framework.Common.AzureBlobInteraction.UploadToBlob(Stream stream, String BlobContainerName, String fileName, String contentType) in C:\Development\RateSolution2010\Framework.Common\AzureBlobInteraction.cs:line 95
  InnerException: 

The code is as follows:

public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Set up the connection to Windows Azure Storage
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());

    DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
    dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
    dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
    DiagnosticMonitor.Start(storageAccount, dmc);

    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();

    // For large file copies you need to set up a custom timeout period,
    // and using parallel settings appears to spread the copy across multiple threads.
    // If you have big bandwidth you can increase the thread count below,
    // because Azure accepts blobs broken into blocks in any order of arrival.
    BlobClient.Timeout = new System.TimeSpan(1, 0, 0);
    Role serviceRole = RoleEnvironment.Roles.Where(s => s.Value.Name == "OnlineRates.Web").First().Value;
    BlobClient.ParallelOperationThreadCount = serviceRole.Instances.Count;

    // Get the container, creating it if it doesn't exist
    CloudBlobContainer BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    BlobContainer.CreateIfNotExist();

    // Delete the prior version if one exists
    BlobRequestOptions options = new BlobRequestOptions();
    options.DeleteSnapshotsOption = DeleteSnapshotsOption.None;
    CloudBlob blobToDelete = BlobContainer.GetBlobReference(fileName);
    blobToDelete.DeleteIfExists(options);
    Trace.WriteLine("Blob " + fileName + " deleted to be replaced by newer version.");

    // Set the stream to its starting position
    stream.Position = 0;
    long totalBytes = 0;

    // Open the stream and read it back
    using (stream)
    {
        // Create the blob and upload the file
        CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
        try
        {
            BlobClient.ResponseReceived += new EventHandler<ResponseReceivedEventArgs>((obj, responseReceivedEventArgs) =>
            {
                if (responseReceivedEventArgs.RequestUri.ToString().Contains("comp=block&blockid"))
                {
                    totalBytes += Int64.Parse(responseReceivedEventArgs.RequestHeaders["Content-Length"]);
                }
            });
            blob.UploadFromStream(stream);

            // Set the metadata on the blob
            blob.Metadata["FileName"] = fileName;
            blob.SetMetadata();

            // Set the properties
            blob.Properties.ContentType = contentType;
            blob.SetProperties();
        }
        catch (Exception exc)
        {
            Logging.ExceptionLogger.LogEx(exc);
        }
    }
}

I've tried a number of different alterations to the code: deleting the blob before replacing it (although the problem occurs on new blobs as well), setting container permissions, not setting permissions, and so on.
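For what it's worth, the container-permissions variant looked roughly like this (a sketch against the 1.x StorageClient API, not my exact code; Off is the private default):

// Sketch: making the container private with the 1.x StorageClient library.
BlobContainerPermissions permissions = new BlobContainerPermissions();
permissions.PublicAccess = BlobContainerPublicAccessType.Off;
BlobContainer.SetPermissions(permissions);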

There are some changes I would make to the code, but they shouldn't cause the problem you're seeing. Do you actually need totalBytes? Have you tried without the ResponseReceived event handler? – knightpfhor

The event handler was thrown in just to see if anything at all was happening with the transfer. It doesn't seem to make a difference in performance. – Paul Smith Jr

Also, I should ask: this code runs in a separate project that gets compiled into the web role as a DLL. (The web role is just a web solution project.) Does that make a difference? Would the code need to live in the role itself? (I doubt it, because it does work occasionally, but thought I should ask.) – Paul Smith Jr

Can you show a bit of how your Azure blob connection is configured? E.g., have you definitely got the http/https setup configured correctly within the project files? What does GetConnStr() return? But please don't post your actual access key :) – Stuart

Also, what is BlobClient.ParallelOperationThreadCount = serviceRole.Instances.Count; for? – Stuart

2 Answers

0 votes

Your code looks like it should work, but it has a lot of extra functionality that isn't strictly required. I would cut it down to an absolute minimum and go from there. It's really only a gut feeling, but I think it might be the using statement giving you grief. This entire function could be written (presuming the container already exists) as:

public void UploadToBlob(Stream stream, string BlobContainerName, string fileName,
    string contentType)
{
    // Set up the connection to Windows Azure Storage
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());
    CloudBlobClient BlobClient = storageAccount.CreateCloudBlobClient();
    CloudBlobContainer BlobContainer = BlobClient.GetContainerReference(BlobContainerName);
    CloudBlockBlob blob = BlobContainer.GetBlockBlobReference(fileName);
    stream.Position = 0;
    blob.UploadFromStream(stream);
}

Notes on the stuff that I've removed:

  • You should set up diagnostics just once, when your app starts, not every time a method is called; the usual place is RoleEntryPoint.OnStart() (see the sketch after this list).
  • I'm not sure why you're setting ParallelOperationThreadCount higher when you have more instances. Those two settings seem unrelated: the thread count controls parallel block uploads within a single instance, not across instances.
  • It's not good form to check for the existence of a container/table every time you save something to it. It's more usual to do that check once when your app starts, or to have a process external to the website make sure all the required containers/tables/queues exist. (Of course, if you're dynamically creating containers this doesn't apply.)
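A minimal sketch of that startup pattern, reusing the question's GetConnStr() helper (the "uploads" container name is just an example, not from the original code):

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.StorageClient;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // GetConnStr() is the question's own configuration helper,
        // assumed to be reachable from here.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnStr());

        // Configure diagnostics once, at startup, instead of on every upload.
        DiagnosticMonitorConfiguration dmc = DiagnosticMonitor.GetDefaultInitialConfiguration();
        dmc.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        dmc.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        DiagnosticMonitor.Start(storageAccount, dmc);

        // Create the required containers once, rather than checking on every save.
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        blobClient.GetContainerReference("uploads").CreateIfNotExist();

        return base.OnStart();
    }
}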
0 votes

The problem turned out to be the firewall settings on my laptop. It's my personal laptop, originally set up at home, so the firewall rules weren't configured for a corporate environment, resulting in slow performance on uploads and downloads.