Consider a data processing pipeline as follows:
1. Fetch a large amount of data from a REST API hosted somewhere on the internet and persist it to a data store.
2. Perform some complex data transformations on the persisted data.
3. Persist the results of the transformations to a data store.
I'm aiming to implement such a pipeline in Azure. Steps 2 and 3 seem like a good fit for implementation as Azure Data Factory activities.
My question is: does it also make sense to implement step 1 as an Azure Data Factory activity?
Technically it should be possible to code a custom .NET activity that performs the data download and persistence, along the lines of the sketch below.
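For illustration, this is roughly what I have in mind. It's a minimal sketch assuming an ADF v1 custom .NET activity; the `FetchDataActivity` class name, the API endpoint, the container name, and the hard-coded connection string are all placeholders, and I've picked blob storage as an example of a data store:

```csharp
using System.Collections.Generic;
using System.Net.Http;
using Microsoft.Azure.Management.DataFactories.Models;
using Microsoft.Azure.Management.DataFactories.Runtime;
using Microsoft.WindowsAzure.Storage;

// Hypothetical custom activity: downloads data from a REST API and
// persists the raw payload to blob storage for the next pipeline step.
// Names, the endpoint, and the connection string are placeholders.
public class FetchDataActivity : IDotNetActivity
{
    public IDictionary<string, string> Execute(
        IEnumerable<LinkedService> linkedServices,
        IEnumerable<Dataset> datasets,
        Activity activity,
        IActivityLogger logger)
    {
        using (var client = new HttpClient())
        {
            // Download the payload from the (placeholder) REST endpoint.
            string payload = client
                .GetStringAsync("https://api.example.com/data")
                .GetAwaiter()
                .GetResult();

            // Persist the raw payload to a blob. In a real activity the
            // connection string would come from the output dataset's
            // linked service rather than being hard-coded.
            var account = CloudStorageAccount.Parse("<storage-connection-string>");
            var container = account
                .CreateCloudBlobClient()
                .GetContainerReference("raw-data");
            container.CreateIfNotExists();
            container
                .GetBlockBlobReference("payload.json")
                .UploadText(payload);

            logger.Write("Downloaded {0} characters and persisted them to blob storage.",
                payload.Length);
        }

        return new Dictionary<string, string>();
    }
}
```

But I'm unsure whether a long-running download like this belongs in a Data Factory activity in the first place, or whether step 1 is better handled outside the pipeline.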