I'm currently working on a project where we're replacing an old legacy application with some fancy new Azure services :-) In short:
- We will read an on-premises file filled with batches of transaction data.
- My Logic App is triggered by the file creation, and the data is sent to my Azure Function for processing.
- The function splits the file into separate transactions that we store in Dynamics 365 (rough sketch of that step below).
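For context, here is a minimal sketch of how I picture the splitting step (Python just for illustration; the `split_batches` helper and the one-transaction-per-line format are assumptions, not our actual code):

```python
import azure.functions as func


def split_batches(payload: str) -> list:
    # Assumption: one transaction per non-empty line; the real format differs.
    return [line for line in payload.splitlines() if line.strip()]


def main(req: func.HttpRequest) -> func.HttpResponse:
    # The Logic App POSTs the raw batch file content in the request body.
    payload = req.get_body().decode("utf-8")
    transactions = split_batches(payload)
    # ... each transaction would then be pushed to Dynamics 365 ...
    return func.HttpResponse(f"Processed {len(transactions)} transactions")
```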
I've managed to trigger on new file creation on-premises, but since each batch of transaction data is quite large, I'm having a hard time testing it by sending it to my Azure Function with Postman (as a POST request). Postman complains that the buffer is too large to send :-(
I've tested it with a smaller batch and I can see that the Azure Function does what it is supposed to do.
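One testing workaround I've considered is to skip Postman and stream the file straight from disk; a quick sketch with Python's `requests` (the URL, function key, and file path are placeholders):

```python
import requests

# Placeholders: substitute the real function URL (with its key) and file path.
url = "https://myfuncapp.azurewebsites.net/api/ProcessBatch?code=<function-key>"

with open("batch_sample.txt", "rb") as f:
    # Passing a file object streams the request body instead of buffering it all in memory.
    resp = requests.post(url, data=f, headers={"Content-Type": "text/plain"})

print(resp.status_code, resp.text)
```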
My next thought was that instead of sending the batch(es) to my function, I could send just the filename and let my function read the batches from the file itself. BUT: how do I access a locally created file from my Azure Function through my on-premises data gateway?
In my Logic App I get the file name of the local file... How do I access that local file from my Azure Function?
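To make the idea concrete, the function would receive only a name, roughly like this (hypothetical sketch; the gap is exactly the commented-out `read_from_onprem` call, since I don't see how the function can reach through the gateway):

```python
import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    filename = req.params.get("filename")
    if not filename:
        return func.HttpResponse("Missing 'filename' parameter", status_code=400)
    # ??? This is the gap: I know of no way for the function itself to read
    # an on-premises file behind the data gateway.
    # content = read_from_onprem(filename)  # <- does not exist, hence my question
    return func.HttpResponse(f"Would process file: {filename}")
```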
The other solution I considered was to have the Logic App copy the on-premises file to a temporary file in Azure File Storage and then continue the processing from there, but that feels like one step too many.
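If I did go that route, reading the copied file from the function looks straightforward; a sketch with the `azure-storage-file-share` SDK (the share name, file path, and connection-string app setting are placeholders):

```python
import os
from azure.storage.fileshare import ShareFileClient

# Placeholders: connection string from app settings; share/path are examples.
client = ShareFileClient.from_connection_string(
    conn_str=os.environ["AzureWebJobsStorage"],
    share_name="incoming-batches",
    file_path="transactions/batch-001.txt",
)

# Download the temporary file that the Logic App copied from on-premises.
content = client.download_file().readall().decode("utf-8")
print(f"Read {len(content)} characters")
```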
Any other, better solutions?