I need to sync a moderate amount of data regularly from a service into a local database in Azure.
I could write the code for that myself, and I'd have been done ten times over by now if I had, but where's the fun in that, right?
So I thought: let's use some fancy Azure services and see how that goes. A workflow- and node-based setup might also improve maintainability.
So I made an ADF pipeline with a copy job into an Azure Table, which works fine: the JSON from the service is parsed correctly, and I can insert the JSON fields as table columns.
As a next step I want to copy the data on to an Azure SQL Database and convert some types properly:
- One field is either missing or holds one of two fixed strings (not "true" or "false", though), and it should become a bit in the database.
- Another field uses a very opinionated date format and needs to be parsed properly.
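For context, here's roughly what the two conversions look like in plain code, which is what I'd have written had I skipped the Azure services. The field values, the missing-means-false rule, and the date format are made up for illustration; only the shape of the problem matches my actual data:

```python
from datetime import datetime

# Hypothetical mapping: the two fixed strings the service actually sends
# (names invented here) onto the SQL bit values.
STATUS_TO_BIT = {"enabled": 1, "disabled": 0}

def to_bit(value):
    """Map the optional status string to a bit; treat missing as 0 (an assumption)."""
    if value is None:
        return 0
    return STATUS_TO_BIT[value]

def parse_date(value):
    """Parse the service's date string; 'dd.mm.yyyy hh:mm' is an assumed format."""
    return datetime.strptime(value, "%d.%m.%Y %H:%M")
```

A couple of lines each, which is exactly why doing this in a copy job's mapping tab feels like it should be possible.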
I don't think this works in the copy job. The mapping tab seems limited, and the dynamic expressions ("Add dynamic content" in that tab) don't appear to be able to reference the fields I need to transform.
Am I correct in assuming that I now need to use either a data flow (executed on a managed Spark cluster, I believe) or an SSIS package?
I tried creating a data flow, but it appears it can't use Azure Tables as a source - at least Azure Table storage isn't offered in the source selector.
Since I'm not even sure that a data flow is necessary, I'm asking for help at this point.