I am working on an Azure Logic App based solution to export data from a database to a CSV file.
So far, I am able to do the following:
- Use the SQL database connector to execute a stored procedure and retrieve the data
- Create a CSV file
- Use the FTP connector to upload the file to an FTP server
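For context, this is roughly the logic my current Logic App implements, written as a plain Python sketch rather than as the workflow definition. The connection string, stored procedure name, and FTP host/credentials below are placeholders, not my real values:

```python
import csv
import io
from ftplib import FTP

import pyodbc

# Placeholder connection string -- substitute real server/database details.
CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;"

def export_to_csv() -> None:
    # 1. Execute the stored procedure (hypothetical name).
    conn = pyodbc.connect(CONN_STR)
    cursor = conn.cursor()
    cursor.execute("EXEC dbo.usp_ExportData")
    columns = [col[0] for col in cursor.description]
    rows = cursor.fetchall()
    conn.close()

    # 2. Build the CSV file in memory.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)
    writer.writerows(rows)

    # 3. Upload the CSV to the FTP server (placeholder host and credentials).
    ftp = FTP("ftp.example.com")
    ftp.login("user", "password")
    ftp.storbinary("STOR export.csv", io.BytesIO(buf.getvalue().encode("utf-8")))
    ftp.quit()
```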
With this approach, the issue is going to be the data size. I am trying to figure out ways to handle large data exports. Here are the approaches I can think of:
Option 1: Use pagination in the stored procedure and collect the data iteratively (see the sketch after Question 1.2).
Question 1.1: Each iteration of the loop in the Logic App will generate a separate CSV file. How do I combine the data from all iterations into a single CSV file?
Question 1.2: I am thinking of using an array variable to collect the data from each iteration and then creating the CSV file from that array. Will I run into any issues if the array becomes too large?
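To make these two questions concrete, here is the pagination-and-accumulate pattern as a plain Python sketch. It assumes a hypothetical stored procedure `dbo.usp_ExportDataPaged` that takes `@PageNumber` and `@PageSize` parameters (e.g. paging internally with OFFSET/FETCH); in the Logic App this would be an Until loop appending to an array variable:

```python
import csv
import io

import pyodbc

CONN_STR = "Driver={ODBC Driver 18 for SQL Server};Server=...;Database=...;"
PAGE_SIZE = 5000  # rows fetched per iteration

def export_paginated() -> str:
    """Collect all pages into one in-memory list, then emit a single CSV."""
    conn = pyodbc.connect(CONN_STR)
    cursor = conn.cursor()

    all_rows = []  # grows with every page -- the Question 1.2 concern
    columns = None
    page = 1
    while True:
        # Hypothetical SP signature; it pages internally, e.g. with OFFSET/FETCH.
        cursor.execute(
            "EXEC dbo.usp_ExportDataPaged @PageNumber = ?, @PageSize = ?",
            page, PAGE_SIZE,
        )
        if columns is None:
            columns = [col[0] for col in cursor.description]
        rows = cursor.fetchall()
        if not rows:
            break              # no more pages
        all_rows.extend(rows)  # memory grows in proportion to total row count
        page += 1
    conn.close()

    # One CSV at the end, regardless of how many pages were fetched.
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(columns)
    writer.writerows(all_rows)
    return buf.getvalue()
```

Writing each page straight to a file (or to blob storage) instead of extending `all_rows` would keep memory flat. As far as I understand, the analogous concern in a Logic App is that an array variable is held in the run's state, so a very large array can run into the platform's message size limits.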
Option 2: I read somewhere that you can overcome the data size issue with chunking.
Question: I am not even sure whether the database connector supports chunking, and will I be able to export the data into a single CSV file?
Option 3: Create multiple CSV files and then merge them into a single CSV file.
Question: Is it possible to do this natively in Logic Apps, or will I need to implement an Azure Function to handle the merging of the CSV files? (A minimal merge sketch follows.)
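If the answer turns out to be an Azure Function, the merge logic itself is small. Here is a minimal sketch of what the function body could do, assuming every part file shares the same header row; the function name and file names are hypothetical:

```python
import glob

def merge_csv_files(pattern: str, out_path: str) -> None:
    """Concatenate CSV part files, keeping only the first file's header."""
    with open(out_path, "w", encoding="utf-8", newline="") as out:
        for i, path in enumerate(sorted(glob.glob(pattern))):
            with open(path, encoding="utf-8") as part:
                header = part.readline()
                if i == 0:
                    out.write(header)  # keep the header from the first part only
                for line in part:
                    out.write(line)

# Hypothetical usage: part files produced by each loop iteration.
merge_csv_files("export_part_*.csv", "export_merged.csv")
```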