
I am trying to upload a file (~250 MB) to Azure File Shares (not Blob Storage), and the upload fails after multiple retry attempts with an exception saying the request failed after 6 retries. I faced this exact same problem when uploading files to Azure Blob Storage, and found that I needed to reduce the number of concurrent threads via BlobUploadOptions, since my network speed could not handle a large number of parallel upload threads. For Azure File Shares, however, I cannot find the property for setting the maximum upload concurrency. Any ideas on how I can set that? Or any alternate solutions?

P.S. I'm using the .NET Azure SDK v12.

The code I'm using:

    string shareName = "test-share";
    string dirName = "sample-dir";
    string fileName = Path.GetFileName(localFilePath);

    ShareClient share = new ShareClient(ConnectionString, shareName);
    await share.CreateAsync();

    ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
    await directory.CreateAsync();

    ShareFileClient fileClient = directory.GetFileClient(fileName);
    using (FileStream stream = File.OpenRead(localFilePath))
    {
        await fileClient.CreateAsync(stream.Length);
        await fileClient.UploadRangeAsync(
            new HttpRange(0, stream.Length),
            stream);
    }

I solved the problem for Blob Storage like this:

    BlobUploadOptions uploadOptions = new BlobUploadOptions()
    {
        TransferOptions = new Azure.Storage.StorageTransferOptions()
        {
            MaximumConcurrency = 2,
            InitialTransferSize = 100 * 1024 * 1024
        }
    };

    using (FileStream uploadFileStream = File.OpenRead(filePath))
    {
        await blobClient.UploadAsync(uploadFileStream, uploadOptions);
    }

1 Answer


After going through the source code of the .NET Azure SDK v12, I found that no such setting is exposed for file shares.

As a workaround, you can split the file into chunks and upload them one by one; that way there is no concurrency at all. Note that a single UploadRange call accepts at most 4 MiB, so keep the chunk size at or below that limit. The sample code is as below:

    //other code

    ShareFileClient fileClient = directory.GetFileClient(fileName);
    using (FileStream stream = File.OpenRead(localFilePath))
    {
        await fileClient.CreateAsync(stream.Length);

        int blockSize = 1 * 1024 * 1024; // 1 MiB per range, well under the 4 MiB UploadRange limit
        long offset = 0; // HTTP range offset: current write position in the remote file

        using (BinaryReader reader = new BinaryReader(stream))
        {
            while (true)
            {
                // Read the next chunk; the last chunk may be shorter than blockSize
                byte[] buffer = reader.ReadBytes(blockSize);
                if (buffer.Length == 0)
                    break;

                using (MemoryStream uploadChunk = new MemoryStream(buffer))
                {
                    HttpRange httpRange = new HttpRange(offset, buffer.Length);
                    await fileClient.UploadRangeAsync(httpRange, uploadChunk);
                }

                offset += buffer.Length; // shift the offset by the bytes just written
            }
        }
    }
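With a 1 MiB blockSize, a ~250 MB file takes roughly 250 sequential requests; raising blockSize toward the 4 MiB maximum cuts the number of round trips at the cost of larger individual uploads.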

Btw, there is also the Azure Storage Data Movement Library; if you choose to use it for file uploading, you do get control over concurrency.
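
A minimal sketch, assuming the Microsoft.Azure.Storage.DataMovement NuGet package (which builds on the older Microsoft.Azure.Storage.File client rather than the v12 SDK) and reusing the connection string and share/directory/file names from the question:

    using Microsoft.Azure.Storage;
    using Microsoft.Azure.Storage.DataMovement;
    using Microsoft.Azure.Storage.File;

    // Cap how many operations the library runs in parallel
    TransferManager.Configurations.ParallelOperations = 2;

    CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
    CloudFileShare share = account.CreateCloudFileClient().GetShareReference(shareName);
    await share.CreateIfNotExistsAsync();

    CloudFileDirectory directory = share.GetRootDirectoryReference().GetDirectoryReference(dirName);
    await directory.CreateIfNotExistsAsync();

    // The library handles chunking and scheduling internally
    CloudFile destFile = directory.GetFileReference(fileName);
    await TransferManager.UploadAsync(localFilePath, destFile);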