0
votes

I am uploading files to SQL Server.

  1. I am streaming the files from the UI layer.
  2. When it reaches the business layer, this is the implementation:

    HttpWebRequest request = WebRequest.Create(postUrl) as HttpWebRequest;

    // Dispose the request stream when done; flush once after the loop
    // rather than after every 8 KB write.
    using (Stream requestStream = request.GetRequestStream())
    {
        byte[] buffer = new byte[8 * 1024];
        int len;

        while ((len = fileToUpload.File.Read(buffer, 0, buffer.Length)) > 0)
        {
            requestStream.Write(buffer, 0, len);
        }

        requestStream.Flush();
    }

    return request.GetResponse() as HttpWebResponse;

I am reading the input stream in 8 KB chunks and writing them to the request stream.

My question is: if the length of fileToUpload.File in the business layer is 1 GB, will the whole file be held in memory and hurt performance?

Or should I chunk the file in the UI layer into 8 KB pieces, so that the business layer function is called roughly 131,000 times (1 GB / 8 KB)?

2 Answers

1
votes

I have faced a similar problem in an enterprise document management system. Sending large files (> 5 MB) via a web service caused trouble because the encoding bloats the payload, leading to heavy memory consumption on the server side.

The solution was to drop the file on a network drive/share and have the server poll that path and pick the file up. Of course, in my case the application was an intranet app, not an internet one.
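
A minimal sketch of that pick-up loop. The UNC path is hypothetical, and the body of the loop is a placeholder for whatever code actually streams the file into SQL Server:

    using System;
    using System.IO;
    using System.Threading;

    class DropFolderPoller
    {
        // Hypothetical UNC path where the UI layer drops uploaded files.
        const string DropFolder = @"\\fileserver\uploads";

        static void Main()
        {
            while (true)
            {
                foreach (string path in Directory.GetFiles(DropFolder))
                {
                    // Placeholder: stream the file into SQL Server here,
                    // then remove it so it is not picked up again.
                    Console.WriteLine("Processing " + path);
                    File.Delete(path);
                }

                Thread.Sleep(TimeSpan.FromSeconds(30)); // poll interval
            }
        }
    }

In a real system you would also guard against picking up files that are still being written (for example, by having the uploader write to a temporary name and rename when complete).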

1
votes

If it were me I'd do multiple calls; however, 8 KB is mighty small these days, so try chunks in the megabyte range instead.

The problem with a single large POST is that if it fails mid-upload you have to resend the entire file. With multiple calls you can have the server acknowledge each chunk before moving on to the next, so a failure only costs you the current chunk.
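
A rough sketch of that approach, assuming a hypothetical endpoint that accepts one POST per chunk identified by an index; the URL scheme and the 1 MB chunk size are illustrative, not part of the original code:

    using System;
    using System.IO;
    using System.Net;

    class ChunkedUploader
    {
        const int ChunkSize = 1024 * 1024; // 1 MB per call, per the suggestion above

        static void Upload(string filePath, string postUrl)
        {
            byte[] buffer = new byte[ChunkSize];

            using (FileStream file = File.OpenRead(filePath))
            {
                int len;
                int index = 0;

                while ((len = file.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Hypothetical endpoint: one POST per chunk, identified by index.
                    HttpWebRequest request =
                        (HttpWebRequest)WebRequest.Create(postUrl + "?chunk=" + index);
                    request.Method = "POST";
                    request.ContentLength = len;

                    using (Stream requestStream = request.GetRequestStream())
                    {
                        requestStream.Write(buffer, 0, len);
                    }

                    // Treat a 200 response as the acknowledgement; on failure,
                    // only this chunk needs to be resent, not the whole file.
                    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
                    {
                        if (response.StatusCode != HttpStatusCode.OK)
                            throw new IOException("Chunk " + index + " not acknowledged");
                    }

                    index++;
                }
            }
        }
    }

To resume after a failure, the client can ask the server for the last acknowledged index and restart the loop from there instead of from zero.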