How can I store data as either Parquet or JSON files in Azure Data Lake Storage Gen2 before purging the data from an Azure SQL table? What steps and services are required to achieve this?
1 Answer
You can use the Copy activity in Azure Data Factory (ADF) to export data from a SQL table as Parquet (or JSON) files. Somewhat simplified, here are the steps just to give you an idea:
- If the source SQL table is on-premises, install a Self-hosted Integration Runtime (IR) on an on-premises server. (For an Azure SQL Database source, the default Azure IR is sufficient.)
- Connect to the source table with a linked service and dataset, using the IR from the step above.
- Connect to Azure Data Lake Storage Gen2 from ADF, again using a linked service.
- Configure a Copy activity with the source and sink datasets.
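As a rough sketch, the Copy activity in the pipeline JSON could look like the following. The activity and dataset names here are hypothetical placeholders; for an Azure SQL Database source you would use `AzureSqlSource` instead of `SqlServerSource`, and a `JsonSink` instead of `ParquetSink` if you prefer JSON output:

```json
{
  "name": "CopySqlToParquet",
  "type": "Copy",
  "inputs": [
    { "referenceName": "SqlSourceDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "AdlsGen2ParquetDataset", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": { "type": "SqlServerSource" },
    "sink": {
      "type": "ParquetSink",
      "storeSettings": { "type": "AzureBlobFSWriteSettings" }
    }
  }
}
```

The sink dataset would point at the ADLS Gen2 linked service with Parquet as its format, which is how ADF decides the output file type.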
Read more details here