
The Copy Data activity in Azure Data Factory appears to be limited to copying to only a single destination table. I have a spreadsheet containing rows that should be expanded out to multiple tables which reference each other - what would be the most appropriate way to achieve that in Data Factory?

Would multiple copy tasks running sequentially be able to perform this task, or does it require calling a custom stored procedure that would perform the inserts? Are there other options in Data Factory for transforming the data as described above?

Just so we're on the same page: do you mean something like multicasting? In other words, having logical copies of a single source (the Excel file) written into multiple tables (multiple sinks)?

1 Answer


If the column mappings between your source and sink datasets don't run into any of the error conditions mentioned in this link,

1. The source data store query result does not have a column name that is specified in the input dataset "structure" section.

2. The sink data store (if it has a pre-defined schema) does not have a column name that is specified in the output dataset "structure" section.

3. The "structure" of the sink dataset has either fewer or more columns than specified in the mapping.

4. Duplicate mapping.

you could connect the copy activities in series and execute them sequentially.
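To sketch what chaining looks like, here is a minimal ADF v2 pipeline fragment with two Copy activities, where the second runs only after the first succeeds via `dependsOn`. All activity and dataset names (`ExcelSource`, `DimUserTable`, `FactOrderTable`) are hypothetical placeholders:

```json
{
  "name": "CopyParentThenChild",
  "properties": {
    "activities": [
      {
        "name": "CopyToDimUser",
        "type": "Copy",
        "inputs": [ { "referenceName": "ExcelSource", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "DimUserTable", "type": "DatasetReference" } ]
      },
      {
        "name": "CopyToFactOrder",
        "type": "Copy",
        "dependsOn": [
          { "activity": "CopyToDimUser", "dependencyConditions": [ "Succeeded" ] }
        ],
        "inputs": [ { "referenceName": "ExcelSource", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "FactOrderTable", "type": "DatasetReference" } ]
      }
    ]
  }
}
```

Ordering the copies so the referenced (parent) table is loaded first keeps foreign-key constraints satisfied in the sink.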

Another solution is a stored procedure sink, which can meet custom requirements like this. For configuration details, please refer to my previous detailed case: Azure Data Factory mapping 2 columns in one column
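To make the stored-procedure route concrete, the sketch below shows the kind of parent-then-child insert logic such a procedure would perform server-side, using Python's sqlite3 as a stand-in for the actual SQL sink. The table and column names (`dim_user`, `fact_order`) are hypothetical:

```python
import sqlite3

# A stored procedure sink would run this logic inside the database;
# sqlite3 is used here only to demonstrate the fan-out pattern.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute(
    "CREATE TABLE dim_user (user_id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
conn.execute("""CREATE TABLE fact_order (
    order_id INTEGER PRIMARY KEY,
    user_id INTEGER REFERENCES dim_user(user_id),
    amount REAL)""")

# Each spreadsheet row fans out into two related tables.
rows = [("alice", 9.99), ("bob", 4.50), ("alice", 12.00)]

for name, amount in rows:
    # Upsert the parent (dimension) row first ...
    conn.execute("INSERT OR IGNORE INTO dim_user (name) VALUES (?)", (name,))
    user_id = conn.execute(
        "SELECT user_id FROM dim_user WHERE name = ?", (name,)).fetchone()[0]
    # ... then insert the child (fact) row that references it.
    conn.execute(
        "INSERT INTO fact_order (user_id, amount) VALUES (?, ?)",
        (user_id, amount))

users = conn.execute("SELECT COUNT(*) FROM dim_user").fetchone()[0]
orders = conn.execute("SELECT COUNT(*) FROM fact_order").fetchone()[0]
```

Doing the parent upsert and child insert in one procedure keeps the referential integrity between the tables intact on every row, which a plain Copy activity cannot guarantee on its own.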