I have a server-side part on ASP.NET Core which receives a file as a Content-Type: multipart/form-data request and sends it to Azure Blob Storage as a stream. But when I send a file of about 200 MB or more, I get the error

“The request body is too large and exceeds the maximum permissible limit”

From what I've found, this could happen with old versions of WindowsAzure.Storage, but I have version 9.1.1. And as I looked deeper, the UploadFromStreamAsync method chunks the blob into 4 MB blocks by default. So I don't know what to do other than ask for your help. My controller:

    public async Task<IActionResult> Post(string folder)
    {
        string azureBlobConnectionString = _configuration.GetConnectionString("BlobConnection");
        // Retrieve storage account from connection string.
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(azureBlobConnectionString);

        HttpResponseUploadClass responseUploadClass = await Request.StreamFile(folder, storageAccount);
        FormValueProvider formModel = responseUploadClass.FormValueProvider;

        var viewModel = new MyViewModel();

        var bindingSuccessful = await TryUpdateModelAsync(viewModel, prefix: "",
            valueProvider: formModel);

        if (!bindingSuccessful)
        {
            if (!ModelState.IsValid)
            {
                return BadRequest(ModelState);
            }
        }

        return Ok(responseUploadClass.Url);
    }

And the class where I send the file stream to Azure Blob Storage:

    public static async Task<HttpResponseUploadClass> StreamFile(this HttpRequest request, string folder, CloudStorageAccount blobAccount)
    {
        CloudBlobClient blobClient = blobAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference(folder);
        CloudBlockBlob blockBlob = null;

        if (!MultipartRequestHelper.IsMultipartContentType(request.ContentType))
        {
            throw new Exception($"Expected a multipart request, but got {request.ContentType}");
        }
        var formAccumulator = new KeyValueAccumulator();

        var boundary = MultipartRequestHelper.GetBoundary(
            MediaTypeHeaderValue.Parse(request.ContentType),
            DefaultFormOptions.MultipartBoundaryLengthLimit);
        var reader = new MultipartReader(boundary, request.Body);

        var section = await reader.ReadNextSectionAsync();
        while (section != null)
        {
            ContentDispositionHeaderValue contentDisposition;
            var hasContentDispositionHeader = ContentDispositionHeaderValue.TryParse(section.ContentDisposition, out contentDisposition);
            if (hasContentDispositionHeader)
            {
                if (MultipartRequestHelper.HasFileContentDisposition(contentDisposition))
                {
                    try
                    {
                        string fileName = HttpUtility.UrlEncode(contentDisposition.FileName.Value.Replace("\"", ""), Encoding.UTF8);
                        blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString());
                        blockBlob.Properties.ContentType = GetMimeTypeByWindowsRegistry(fileName);
                        blockBlob.Properties.ContentDisposition = "attachment; filename*=UTF-8''" + fileName;
                        await blockBlob.UploadFromStreamAsync(section.Body);
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(e);
                        throw;
                    }
                }
                else if (MultipartRequestHelper.HasFormDataContentDisposition(contentDisposition))
                {
                    var key = HeaderUtilities.RemoveQuotes(contentDisposition.Name);
                    var encoding = GetEncoding(section);
                    using (var streamReader = new StreamReader(
                        section.Body,
                        encoding,
                        detectEncodingFromByteOrderMarks: true,
                        bufferSize: 1024,
                        leaveOpen: true))
                    {
                        var value = await streamReader.ReadToEndAsync();
                        if (String.Equals(value, "undefined", StringComparison.OrdinalIgnoreCase))
                        {
                            value = String.Empty;
                        }
                        formAccumulator.Append(key.Value, value);

                        if (formAccumulator.ValueCount > DefaultFormOptions.ValueCountLimit)
                        {
                            throw new InvalidDataException($"Form key count limit {DefaultFormOptions.ValueCountLimit} exceeded.");
                        }
                    }
                }
            }
            section = await reader.ReadNextSectionAsync();
        }
        var formValueProvider = new FormValueProvider(
            BindingSource.Form,
            new FormCollection(formAccumulator.GetResults()),
            CultureInfo.CurrentCulture);

        return new HttpResponseUploadClass{FormValueProvider = formValueProvider, Url = blockBlob?.Uri.ToString()};
    }

1 Answer

As you have said, each block in a block blob can be a different size, up to a maximum of 100 MB (4 MB for requests using REST versions before 2016-05-31), and a block blob can include up to 50,000 blocks.

If you are writing a block blob that is no more than 256 MB (64 MB for requests using REST versions before 2016-05-31) in size, you can upload it in its entirety with a single write operation, see Put Blob.

Storage clients default to a 32 MB maximum single blob upload, settable using the SingleBlobUploadThresholdInBytes property. When a block blob upload is larger than the value in this property, storage clients break the file into blocks. You can set the number of threads used to upload the blocks in parallel using the ParallelOperationThreadCount property.

BlobRequestOptions requestoptions = new BlobRequestOptions()
{
    SingleBlobUploadThresholdInBytes = 1024 * 1024 * 50, //50MB
    ParallelOperationThreadCount = 12,
};

CloudStorageAccount account = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting("StorageConnectionString"));
CloudBlobClient blobclient = account.CreateCloudBlobClient();
blobclient.DefaultRequestOptions = requestoptions;
CloudBlobContainer blobcontainer = blobclient.GetContainerReference("uploadfiles");
blobcontainer.CreateIfNotExists();
CloudBlockBlob blockblob = blobcontainer.GetBlockBlobReference("bigfiles");
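
For example, the same options can also be passed per call through the UploadFromStreamAsync overload that accepts a BlobRequestOptions. The sketch below is only an illustration of how that could look inside the question's StreamFile method (container, section and requestoptions are the names already used above); StreamWriteSizeInBytes additionally controls the block size the client uses when it splits the stream:

// Sketch: wiring the options into the upload call from the question's StreamFile method.
// The 50 MB threshold and thread count mirror the requestoptions defined above.
blockBlob = container.GetBlockBlobReference(Guid.NewGuid().ToString());
blockBlob.StreamWriteSizeInBytes = 4 * 1024 * 1024; // block size used when the stream is split into blocks

await blockBlob.UploadFromStreamAsync(
    section.Body,
    accessCondition: null,
    options: requestoptions,
    operationContext: null);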

For more details, you could refer to this thread.
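
As an aside, if you want full control over the block size (and the 50,000-block limit mentioned above), the blocks can also be staged manually with PutBlockAsync and committed with PutBlockListAsync. The helper below is only a sketch; the method name and default block size are illustrative:

using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task UploadInBlocksAsync(CloudBlockBlob blockBlob, Stream source, int blockSizeBytes = 4 * 1024 * 1024)
{
    var blockIds = new List<string>();
    var buffer = new byte[blockSizeBytes];
    int blockNumber = 0;
    int bytesRead;

    // Read the source block by block; this also works for non-seekable streams such as
    // a multipart section body. ReadAsync may return fewer bytes than requested, which
    // simply produces a smaller block (blocks may have different sizes).
    while ((bytesRead = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be Base64 strings of equal length within one blob.
        string blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
        using (var blockData = new MemoryStream(buffer, 0, bytesRead, writable: false))
        {
            await blockBlob.PutBlockAsync(blockId, blockData, null);
        }
        blockIds.Add(blockId);
    }

    // Commit the staged blocks in order; a block blob can hold up to 50,000 blocks.
    await blockBlob.PutBlockListAsync(blockIds);
}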