1 vote

I want to copy data from Azure Blob Storage to an Azure SQL Database. The destination data is divided among different tables.

So is there any way to send the blob data directly to different SQL tables using a single pipeline with one Copy activity?

Since this should be a trigger-based pipeline, it is a continuous process; I created a trigger that fires every hour. Right now I can only send the blob data to one table, and then divide it into different tables by invoking another pipeline where both the source and sink datasets are the SQL database.

I'm still looking for a solution to this.


2 Answers

0 votes

In my experience, and according to the Azure Data Factory documentation, you cannot send blob data directly to different SQL tables using a single Copy activity in one pipeline.

This is because in the table mapping settings, one Copy activity only allows you to select a single corresponding table in the destination data store, or to specify a stored procedure to run at the destination.


You don't need to create a new pipeline; just add another Copy activity to the same pipeline and have each Copy activity call a different stored procedure.
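For example, here is a minimal sketch of what those stored procedures might look like (all table, type, and procedure names are hypothetical): each Copy activity passes its rows in as a table-valued parameter, and each procedure inserts into its own destination table.

```sql
-- Hypothetical table type describing the shape of the copied rows;
-- the Copy activity sink passes the rows in as this type.
CREATE TYPE dbo.BlobRowType AS TABLE
(
    Id      INT,
    Name    NVARCHAR(100),
    Payload NVARCHAR(MAX)
);
GO

-- One stored procedure per Copy activity, each writing to its own table.
CREATE PROCEDURE dbo.usp_LoadOrders
    @rows dbo.BlobRowType READONLY
AS
BEGIN
    INSERT INTO dbo.Orders (Id, Name, Payload)
    SELECT Id, Name, Payload FROM @rows;
END
GO

CREATE PROCEDURE dbo.usp_LoadCustomers
    @rows dbo.BlobRowType READONLY
AS
BEGIN
    INSERT INTO dbo.Customers (Id, Name, Payload)
    SELECT Id, Name, Payload FROM @rows;
END
GO
```

In each Copy activity's sink settings you would then reference the matching procedure and table type (via the stored procedure name and table type properties of the SQL sink).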

Hope this helps.

1 vote

You could use a stored procedure in your database as the sink of the copy activity. That way, you can define the logic in the stored procedure to write the data to your destination tables. You can find the description of the stored procedure sink here.
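As a rough illustration (the type, column, and table names below are made up), a single sink procedure can receive all rows as a table-valued parameter and fan them out to different tables based on a discriminator column:

```sql
-- Hypothetical table type matching the columns coming from the blob.
CREATE TYPE dbo.StagingRowType AS TABLE
(
    Id         INT,
    RecordKind NVARCHAR(20),  -- discriminator deciding the target table
    Payload    NVARCHAR(MAX)
);
GO

-- Single sink procedure: routes each row to its destination table.
CREATE PROCEDURE dbo.usp_RouteBlobRows
    @rows dbo.StagingRowType READONLY
AS
BEGIN
    INSERT INTO dbo.Orders (Id, Payload)
    SELECT Id, Payload FROM @rows WHERE RecordKind = 'order';

    INSERT INTO dbo.Customers (Id, Payload)
    SELECT Id, Payload FROM @rows WHERE RecordKind = 'customer';
END
GO
```

This way, one Copy activity in one pipeline can populate several tables, which is what the question asks for.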

You'll have to use a user-defined table type for this solution, and maintaining those can be difficult. If you run into issues, have a look at my and BioEcoSS' answers in this thread.
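The maintenance pain comes from the fact that T-SQL has no ALTER TYPE: changing a user-defined table type means dropping every object that references it, then recreating the type and those objects. Roughly (names hypothetical, SQL Server 2016+ syntax):

```sql
-- No ALTER TYPE exists, so a schema change is drop-and-recreate:
DROP PROCEDURE IF EXISTS dbo.usp_RouteBlobRows;  -- dependents first
DROP TYPE IF EXISTS dbo.StagingRowType;
-- ...then re-run CREATE TYPE with the new column list,
-- followed by the CREATE PROCEDURE that depends on it.
```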