1 vote

I have an event-driven Logic App (blob event) that reads a block blob by path and uploads the content to Azure Data Lake. I noticed the Logic App fails with 413 (RequestEntityTooLarge) when reading a large file (~6 GB). I understand that the Logic Apps Blob connector has a 1024 MB limit - https://docs.microsoft.com/en-us/connectors/azureblob/ - but is there any workaround to handle this type of situation? The alternative solution I am working on is moving this step to an Azure Function and getting the content from the blob there. Thanks for your suggestions!

3
Are you using Event Grid or a blob trigger in your Logic App? – Thomas
Event Grid, which monitors for new blob creation. Thanks! – Satya Azure

3 Answers

1 vote

If you want to use an Azure Function, I would suggest having a look at this article:

There is a standalone version of the AdlCopy tool that you can deploy to your Azure function.

So your Logic App will call this function, which will run a command to copy the file from Blob storage to your Data Lake Store. I would suggest using a PowerShell function.
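Purely as a sketch of that idea (written in C# to match the language used elsewhere in this thread, although the answer suggests PowerShell), the function could shell out to a copy of the standalone AdlCopy.exe deployed alongside it. The exe path, URLs and key below are placeholders; /Source, /Dest and /SourceKey are the tool's documented switches, and the destination is typically an ADL URI of the form swebhdfs://<account>.azuredatalakestore.net/<folder>/.

```csharp
// Sketch only: run the standalone AdlCopy tool from a C# Azure Function.
// Paths, URLs and the storage key are placeholders supplied by the caller
// (e.g. from app settings), not values taken from the original answer.
using System.Diagnostics;
using System.Threading.Tasks;

public static class AdlCopyRunner
{
    public static async Task<int> RunAsync(string adlCopyExePath, string sourceBlobUrl,
                                           string adlsDestination, string storageAccountKey)
    {
        var startInfo = new ProcessStartInfo
        {
            // A copy of AdlCopy.exe deployed with the function app (hypothetical location).
            FileName = adlCopyExePath,
            Arguments = $"/Source {sourceBlobUrl} /Dest {adlsDestination} /SourceKey {storageAccountKey}",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        using (var process = Process.Start(startInfo))
        {
            // Capture AdlCopy's console output so it can be logged for diagnostics.
            string output = await process.StandardOutput.ReadToEndAsync();
            process.WaitForExit();
            return process.ExitCode;
        }
    }
}
```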

Another option would be to use Azure Data Factory to copy the file to Data Lake:

You can create a copy job that reads the file from Blob storage.

There is a connector to trigger a Data Factory run from a Logic App, so you may not need an Azure Function, but it seems there are still some limitations.

0 votes

You should consider using the Azure Files connector: https://docs.microsoft.com/en-us/connectors/azurefile/

It is currently in preview; the advantage it has over the Blob connector is that it doesn't have a size limit. The link above includes more information.

0 votes

For the benefit of others who might be looking for a solution of this sort: I ended up creating an Azure Function in C#, since my design dynamically parses the blob name and creates the ADL folder structure from it. I used chunked streaming to read the blob and write it to ADL, with multi-threading to address the Azure Functions timeout of 10 minutes.
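A minimal sketch of that chunked copy, assuming the WindowsAzure.Storage and Microsoft.Azure.DataLake.Store SDKs; the 8 MB buffer size, the class and parameter names, and the way the CloudBlockBlob and AdlsClient are obtained are illustrative, and the multi-threading used to stay under the 10-minute timeout is omitted for brevity:

```csharp
// Sketch only: copy a block blob to Azure Data Lake Store in fixed-size chunks.
// Assumes the WindowsAzure.Storage and Microsoft.Azure.DataLake.Store packages;
// how the CloudBlockBlob and AdlsClient are created is left out here.
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.DataLake.Store;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobToAdlCopier
{
    public static async Task CopyAsync(CloudBlockBlob blob, AdlsClient adlsClient, string adlPath)
    {
        const int bufferSize = 8 * 1024 * 1024; // 8 MB chunks keep memory use bounded for multi-GB blobs
        var buffer = new byte[bufferSize];

        using (Stream blobStream = await blob.OpenReadAsync())
        using (Stream adlStream = adlsClient.CreateFile(adlPath, IfExists.Overwrite))
        {
            int bytesRead;
            while ((bytesRead = await blobStream.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                // Write each chunk straight to the ADL stream instead of buffering the whole blob.
                await adlStream.WriteAsync(buffer, 0, bytesRead);
            }
        }
    }
}
```

The multi-threaded variant described above would read and upload several ranges of the blob in parallel to finish a ~6 GB file within the timeout; that part is not shown here.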