I'm looking to use Azure Data Factory to run a large set of SQL queries. The scenario I'm targeting is this:
- A preprocessor generates many (100+) SQL files containing HANA queries for extraction. Each can be run against SAP HANA, producing output that is saved. I have an application that does this today, and it works (a minimal sketch of what it does follows this list).
- I'm looking to move this to Azure Data Factory, since the traffic needs to go through a self-hosted integration runtime when running from Azure. In other words, when running in the cloud, the only option is the ADF SAP HANA connector (which supports the Copy Data activity).
- Ideally, I'd be able to point something at a list of SQL files, execute each one, and store the results separately. If I could somehow pass the connection to a notebook or a custom activity, that would be sufficient. Is this possible?
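For context, the existing application is essentially the following loop. This is a minimal sketch, assuming the `hdbcli` SAP HANA Python client; the host, credentials, and `queries`/`output` folder names are placeholders:

```python
import csv
from pathlib import Path
from hdbcli import dbapi  # official SAP HANA Python client

# Hypothetical connection details -- replace with real values.
conn = dbapi.connect(address="hana-host", port=30015,
                     user="EXTRACT_USER", password="...")

Path("output").mkdir(exist_ok=True)

# Run every generated SQL file and save each result set separately.
for sql_file in sorted(Path("queries").glob("*.sql")):
    cursor = conn.cursor()
    cursor.execute(sql_file.read_text())
    with open(f"output/{sql_file.stem}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())
    cursor.close()

conn.close()
```

This is the behaviour I'd like to reproduce in ADF: the set of files drives the work, so adding a file later requires no other change.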
What would be the best way to achieve this? I suppose I could programmatically generate 100+ Copy Data activities (roughly as sketched below), but that seems heavy-handed, and if more files are required later, I'd have to change the pipeline definition each time.
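To illustrate what I mean by generating the activities programmatically, something like this could emit a pipeline JSON definition with one Copy activity per SQL file. This is a rough sketch only; the pipeline name and the `HanaDataset`/`BlobDataset` dataset references are hypothetical placeholders for whatever exists in the factory:

```python
import json
from pathlib import Path

def copy_activity(name: str, sql_text: str) -> dict:
    """Build one ADF Copy activity that runs a HANA query into a blob sink."""
    return {
        "name": f"Copy_{name}",
        "type": "Copy",
        "inputs": [{"referenceName": "HanaDataset", "type": "DatasetReference"}],
        "outputs": [{"referenceName": "BlobDataset", "type": "DatasetReference"}],
        "typeProperties": {
            "source": {"type": "SapHanaSource", "query": sql_text},
            "sink": {"type": "DelimitedTextSink"},
        },
    }

# One activity per generated SQL file.
pipeline = {
    "name": "ExtractHana",
    "properties": {
        "activities": [
            copy_activity(p.stem, p.read_text())
            for p in sorted(Path("queries").glob("*.sql"))
        ]
    },
}

Path("pipeline.json").write_text(json.dumps(pipeline, indent=2))
```

The drawback is exactly what I describe above: the pipeline definition has to be regenerated and redeployed whenever the set of files changes, which is what I'm hoping to avoid.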