
I'm currently working on a project where we are replacing an old legacy application with new, fancy Azure things :-) In short:

  • We read an on-premises file filled with batches of transaction data.

  • My Logic App is triggered by the file creation, and the data is sent to my Azure Function for processing.

  • The file is split into separate transactions that we store in
    Dynamics 365.

I've managed to trigger on new file creation on-premises, but since each batch of transaction data is quite large, I have a hard time testing by sending it to my Azure Function with a POST request from Postman. Postman complains that the buffer is too large to send :-(
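One way around the Postman limit would be to stream the file from a small script instead. A minimal sketch, assuming Python with the `requests` package; the function URL, route and key here are placeholders, not my real endpoint:

```python
import requests

# Placeholders: substitute your real Function URL and function key.
FUNCTION_URL = "https://myfuncapp.azurewebsites.net/api/ProcessBatch"
FUNCTION_KEY = "<function-key>"

with open("batchfile.txt", "rb") as f:
    # Passing a file object as `data` streams the body with chunked
    # transfer encoding, so the whole file never sits in memory at once.
    resp = requests.post(
        FUNCTION_URL,
        params={"code": FUNCTION_KEY},
        data=f,
        headers={"Content-Type": "application/octet-stream"},
    )

print(resp.status_code, resp.text[:200])
```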

I've tested it with a smaller batch and I can see that the Azure Function does what it is supposed to do.

My next thought was that, instead of sending the batch(es) to my function, I could send just the filename and let my function read the batches from the file itself. BUT: how do I access a locally created file from my Function on Azure through my on-premises data gateway?

In my Logic App I get the file name of the local file... How do I access that local file from my Azure Function?

The other solution I considered was to use a Logic App to just copy the on-premises file to a temporary file in an Azure File Storage area and then continue the processing from there, but it feels like one step too many.
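If I went that route, the function would only receive the file name and stream the contents from the file share. A rough sketch, assuming the `azure-storage-file-share` SDK; the share name and connection string are placeholders:

```python
from azure.storage.fileshare import ShareFileClient

def read_batches(file_name: str):
    # Placeholders: substitute the real connection string and share name.
    client = ShareFileClient.from_connection_string(
        conn_str="<storage-connection-string>",
        share_name="incoming-batches",
        file_path=file_name,
    )
    downloader = client.download_file()
    # Stream in chunks so a large batch file never has to fit in memory.
    for chunk in downloader.chunks():
        yield chunk
```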

Any other, better solutions?


1 Answer


I would break your problem into two pieces: (1) get your file(s) to Azure, and (2) do the processing you require.

For (1), look into using Azure Data Factory (v2) and an Integration Runtime (IR). An IR will allow you to securely access your on-premises data (including files). If an Integration Runtime seems like overkill for your use case, then look at the command-line tool AzCopy. The goal would be to move your file to an Azure Storage account as soon as it is created on-premises. Your Logic App can then act on a trigger when the file is created in Azure Storage.
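For example, the move itself could be as simple as an upload from the machine where the file lands. A minimal sketch using the `azure-storage-blob` Python SDK in place of AzCopy; the connection string, container and paths are placeholders:

```python
from azure.storage.blob import BlobServiceClient

# Placeholders: substitute your real connection string and container name.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("incoming-batches")

with open(r"C:\exports\batchfile.txt", "rb") as f:
    # Uploading the file as a blob fires any blob-created trigger
    # (Logic App or Event Grid) watching this container.
    container.upload_blob(name="batchfile.txt", data=f, overwrite=True)
```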

For (2), your Azure Function can do the relevant processing you require. Keep in mind that Azure Functions on the Consumption plan cannot run for longer than a maximum of 10 minutes (the default timeout is 5 minutes).
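As an illustration, the processing side might look roughly like this blob-triggered Python function (v2 programming model). The container name, line-based parsing and the save_to_dynamics helper are assumptions for the sketch, not a real Dynamics 365 integration; each run should finish well inside the Consumption-plan timeout, or the work should be fanned out per transaction:

```python
import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="batch",
                  path="incoming-batches/{name}",
                  connection="AzureWebJobsStorage")
def process_batch(batch: func.InputStream):
    # Assumes one transaction per line; adjust to the real batch format.
    for line in batch.read().decode("utf-8").splitlines():
        if line.strip():
            save_to_dynamics(line)

def save_to_dynamics(transaction: str) -> None:
    # Hypothetical helper; the real call would go through the
    # Dynamics 365 Web API or a dedicated connector.
    print(f"storing transaction: {transaction[:40]}")
```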