1 vote

I have an API in an Azure Function that uses an HTTP trigger to receive data and forward it to an on-prem application. We have a UI front-end where users can upload files of any size (no limit), and it sends the data in chunks to the API.

I am aware that the Function App has a 100 MB request limit, and I also see that the recommendation for handling large files is to use Blob Storage. However, since this is a synchronous process, we wanted to achieve it through the API (avoiding storing data in Blob Storage as an intermediate step).

Is there a way I can receive data in chunks via the HTTP trigger request? E.g.: UI -> sends data in chunks -> API (HttpTrigger) reads data in chunks and forwards it in chunks -> on-prem.

E.g.:

public static async Task Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
    ILogger log)

Here, how do I read the data from req in chunks if the sender is already streaming it in chunks? If I read it as below, it works fine as long as the payload is <= 100 MB, but fails with the error "data too large" if it is over 100 MB.

_bufferSize = 1048576;
byte[] dataPayload = new byte[_bufferSize]; // allocate the read buffer up front
int dataRead;
while ((dataRead = await req.Body.ReadAsync(dataPayload, 0, _bufferSize)) > 0)
{
    // process the dataRead bytes just read into dataPayload
    ....
}
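For context, the pass-through I am aiming for looks roughly like the sketch below: hand the incoming request body to HttpClient as a stream so the payload is forwarded without being buffered in full. The function name, the on-prem URL, and the client setup are placeholders, and as noted above the Functions host still enforces its request-size limit:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ForwardUpload
{
    // Reuse a single HttpClient across invocations
    private static readonly HttpClient _client = new HttpClient();

    [FunctionName("ForwardUpload")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        // StreamContent lets HttpClient read the body as it arrives
        using (var content = new StreamContent(req.Body))
        {
            content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
            // "https://onprem.example.com/upload" is a placeholder endpoint
            HttpResponseMessage response = await _client.PostAsync("https://onprem.example.com/upload", content);
            return new StatusCodeResult((int)response.StatusCode);
        }
    }
}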

Appreciate your help in this regard.

2
Have you found a solution? I have the same requirement. – jamir

2 Answers

0 votes

I think one solution could be to place Azure Event Hubs between your UI/app and your Azure Function API. That way each chunk of data is published as an event, and your function is triggered to process each chunk as it arrives.
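A minimal sketch of the consuming side under that design; the hub name "file-chunks" and the connection setting "EventHubConnection" are assumptions, and forwarding to on-prem is left as a comment:

using Azure.Messaging.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessChunk
{
    [FunctionName("ProcessChunk")]
    public static void Run(
        // "file-chunks" and "EventHubConnection" are placeholder names
        [EventHubTrigger("file-chunks", Connection = "EventHubConnection")] EventData[] events,
        ILogger log)
    {
        foreach (EventData e in events)
        {
            byte[] chunk = e.EventBody.ToArray(); // one uploaded chunk per event
            log.LogInformation($"Received chunk of {chunk.Length} bytes");
            // forward the chunk to the on-prem application here
        }
    }
}

Keep in mind that individual Event Hubs events are capped at about 1 MB on the standard tier, so the UI would need to keep chunks under that size, and reassembling them in order would have to be handled downstream.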

0 votes

Is it required to read the data in chunks? If the sender can write the file to Blob Storage (in chunks or all at once) and publish an event once the file is completely written, that would be ideal. Your function can trigger on that event and read the file from Blob Storage directly.

Explicit chunked reading could introduce unwanted complexity and may not be robust in the long run. Just a suggestion.
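A minimal sketch of that approach using a blob trigger (an Event Grid trigger is another option); the container name "uploads", the buffer size, and the forwarding logic are assumptions:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessUploadedFile
{
    [FunctionName("ProcessUploadedFile")]
    public static async Task Run(
        // "uploads/{name}" is a placeholder path; the trigger fires once the blob is written
        [BlobTrigger("uploads/{name}", Connection = "AzureWebJobsStorage")] Stream blob,
        string name,
        ILogger log)
    {
        log.LogInformation($"Processing uploaded file {name}");
        // Read the blob in buffered chunks and forward each one to the on-prem application
        byte[] buffer = new byte[1048576];
        int bytesRead;
        while ((bytesRead = await blob.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // send buffer[0..bytesRead] to on-prem here
        }
    }
}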