8
votes

I have an intranet application that needs to upload (.iso) files that exceed 2GB. It appears there are several factors that limit file size to 2GB.

  1. There is a browser limitation in IE: only IE 9/10 can exceed 2GB, according to Eric Law.
  2. The maxRequestLength attribute of httpRuntime is measured in kilobytes, with a maximum value of 2097151 KB, which is approximately 2GB.

It appears you can set yet another file size limit with maxAllowedContentLength, which is of type uint and therefore allows values up to approximately 4GB, but what good does that do when we are still limited to 2GB by maxRequestLength?

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="4294967295" />
    </requestFiltering>
  </security>
</system.webServer>
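For reference, the two settings use different units: maxRequestLength is specified in kilobytes, while maxAllowedContentLength is specified in bytes. A quick sanity check of the arithmetic (plain JavaScript, purely illustrative):

```javascript
// maxRequestLength is given in kilobytes; its cap of 2097151 KB works out
// to just under 2^31 bytes, i.e. roughly 2GB.
const maxRequestLengthBytes = 2097151 * 1024;
console.log(maxRequestLengthBytes); // 2147482624

// maxAllowedContentLength is given in bytes as an unsigned 32-bit value,
// so its ceiling is 2^32 - 1 bytes, i.e. roughly 4GB.
const maxAllowedContentLength = 4294967295;
console.log(maxAllowedContentLength === 2 ** 32 - 1); // true
```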

Does anyone have any solutions for uploading files past the 2GB limit?


3 Answers

4
votes

Are you open to a JavaScript solution? If so, try this jQuery plugin, which lets you upload massive amounts of data (many GB). It uploads files using the HTML5 FileReader API, falling back to Silverlight when the browser lacks support, with a mechanism inspired by TCP/IP: send and receive packets with corresponding ACKs. Files are uploaded in chunks of a configurable size (defaulting to 4 MB).

Plus: It also comes with a file queue mode.
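The chunking idea itself is simple to sketch. The snippet below is not the plugin's actual code, just an illustration of how a file of a given size splits into fixed-size packets (the 4 MB default matches the plugin's):

```javascript
// Compute [start, end) byte ranges for splitting a file into fixed-size
// chunks; the last chunk may be smaller than chunkSize.
function chunkRanges(fileSize, chunkSize = 4 * 1024 * 1024) {
  const ranges = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, fileSize)]);
  }
  return ranges;
}

// In the browser each range would become a Blob slice that is POSTed
// separately, waiting for the server's ACK before sending the next:
//   const piece = file.slice(start, end);
```

Because each request carries only one chunk, no single request ever approaches the 2GB limit.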

Here is a sample of how you may use it in a Razor View:

$(function () {

    var file = $("#file").createUploaderHtml5({
        postDataUrl: "@Url.Action("Upload", "Home")",
        packetSize: 4 * 1024 * 1024,
        onPreparingUpload: function (plugin, ufname, umime, usize) {
            plugin.settings.logger("ufname = [" + ufname + "] umime = [" + umime + "] usize = [" + usize + "]");
            return true;
        },
        onInitPacketArrived: function (plugin, guid) {
            plugin.settings.logger("guid = [" + guid + "]");
        },
        onDataPacketArrived: function (plugin, ack, total) {
            //plugin.settings.logger("ACK [" + ack.Guid + "] packet = [" + ack.Packet + "] total = [" + total + "]");
            var percent = Math.round(ack.Packet / total * 100);
            $("#progressbar").attr("value", percent);
            $("#percent").html(percent + " %");
        },
        onFileUploaded: function (pl) {
            pl.settings.logger("File finished!!!");
        },
        logger: function(msg) {
            var lg = $("#logger");
            lg.html(lg.html() + msg + "<br />");
        }
    });

    $("#start").click(function () {
        file.startUpload();
    });

    $("#stop").click(function () {
        file.cancelUpload();
    });

});

Here's the code for the Upload Action:

[HttpPost]
public ActionResult Upload(FormCollection collection)
{
    var packetSize = 4 * 1024 * 1024; // default to 4 MB
    var filePath = Server.MapPath("~/_temp_upload/");

    var result = UploadHelper.ProcessRequest(Request, filePath, packetSize);

    if (result != null)
    {
        var metadata = UploadHelper.GetMetadataInfo(filePath, result.Guid);
        // do anything with the metadata
        return Json(result);
    }

    return Content("");
}
2
votes

I was able to upload files of up to 4 GB using Web API and IIS by making the changes below. In the Web API project, two changes in web.config set the maximum lengths:

  <requestFiltering>
    <requestLimits maxAllowedContentLength="4294967295"/>
  </requestFiltering>
  <httpRuntime targetFramework="4.5.2" executionTimeout="2400" maxRequestLength="2147483647"/>

Add the chunked header on the client side when calling the Web API, as shown below; streaming the file this way stops IIS from rejecting files above 2 GB:

// httpClient is an HttpClient instance (DefaultRequestHeaders is an instance property)
httpClient.DefaultRequestHeaders.Add("Transfer-Encoding", "chunked");

Add the code below on the server side (in the Web API controller) before reading the stream; the GetBufferlessInputStream overload with disableMaxRequestLength set to true ignores the 2 GB maximum request length:

var content = new StreamContent(HttpContext.Current.Request.GetBufferlessInputStream(disableMaxRequestLength: true));
foreach (var header in Request.Content.Headers)
{
    content.Headers.TryAddWithoutValidation(header.Key, header.Value);
}
await content.ReadAsMultipartAsync(streamProvider);

Change the buffer policy selector on the server side so that buffering is disabled and the file is streamed instead. Add the class below, which overrides WebHostBufferPolicySelector for your controller (e.g. the "File" controller in this snippet):

public class NoBufferPolicySelector : WebHostBufferPolicySelector
{
    public override bool UseBufferedInputStream(object hostContext)
    {
        var context = hostContext as HttpContextBase;

        if (context != null && context.Request.RequestContext.RouteData.Values["controller"] != null)
        {
            if (string.Equals(context.Request.RequestContext.RouteData.Values["controller"].ToString(), "File", StringComparison.InvariantCultureIgnoreCase))
                return false;
        }

        return true;
    }

    public override bool UseBufferedOutputStream(HttpResponseMessage response)
    {
        return base.UseBufferedOutputStream(response);
    }
}

Add the line below to the Register method:

GlobalConfiguration.Configuration.Services.Replace(typeof(IHostBufferPolicySelector), new NoBufferPolicySelector());

Hope this helps anyone out there.

1
votes

I fought a lot this year with large file uploads from various browsers to an IIS server. Here is what I found:

ASP.NET has supported uploads over 2Gb since .NET 4.5 (it probably supports files up to long.MaxValue). But IIS itself does not support uploads over 2Gb, so any server hosted in IIS does not support uploads over 2Gb.

To my understanding, setting maxAllowedContentLength or maxRequestLength to values over 2Gb does not help, because these settings apply to ASP.NET while the core issue is in IIS.