
Azure Data Factory V2 - Copy Activity - copy data with changing column names and column counts to a destination. I have to copy data from a flat file where the number of columns, and even the column names, change from file to file. How do I dynamically map them in a Copy Activity in Azure Data Factory V2 to load the data into the destination?

Suppose my destination has 20 columns, but the source sometimes arrives with 10 columns, sometimes 15, and sometimes 20. If the source has fewer columns than the destination, the remaining destination columns should be loaded as NULL.
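To make the requirement concrete, here is a minimal sketch (a hypothetical helper, not part of ADF) of the padding behavior described above: every source row is expanded to the full destination schema, and columns missing from the source become None, which loads as NULL in Azure SQL Database. The column names are illustrative assumptions.

```python
# Destination schema (abbreviated for illustration; the real table has 20 columns).
DEST_COLUMNS = ["id", "name", "city", "country"]

def pad_row(row: dict) -> dict:
    """Return a row containing every destination column;
    columns absent from the source row become None (NULL)."""
    return {col: row.get(col) for col in DEST_COLUMNS}

source_row = {"id": 1, "name": "Alice"}  # a file that arrived with only 2 columns
print(pad_row(source_row))
# {'id': 1, 'name': 'Alice', 'city': None, 'country': None}
```

Whatever tool performs the copy (Data Flow, Azure Function, or Databricks) ultimately has to implement this same fill-with-NULL rule.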

If you need to be completely flexible, the easiest way is to use an activity that triggers an external tool such as an Azure Function or a Databricks notebook. I use Databricks for large files with dynamic content. For smaller files, an Azure Function would be the better choice. What kind of sink are you writing to? – Martin
Sink: Azure SQL Database. – Puskar
I agree with Mark Kromer. This might be the easiest way for Azure SQL Database sinks. – Martin

1 Answer


Use Data Flows in ADF. Data Flow sinks can generate the table schema on the fly if you wish, or you can simply "auto-map" any changing schema to your target. If your source schema changes often, enable "schema drift" and leave the schema undefined in your dataset.
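If you instead stay with the Copy Activity route suggested in the comments, the mapping can be built at runtime and passed to the pipeline as a parameter. The sketch below (for example, inside an Azure Function) assembles a `TabularTranslator` payload in the shape ADF's schema-mapping documentation describes; the column lists are illustrative assumptions, and sink columns with no matching source column are omitted so the database fills them with NULL.

```python
import json

def build_translator(source_cols, sink_cols):
    """Map each source column to the sink column of the same name.
    Sink columns absent from the source are left unmapped, so ADF
    inserts NULL into them (assuming the columns are nullable)."""
    return {
        "type": "TabularTranslator",
        "mappings": [
            {"source": {"name": c}, "sink": {"name": c}}
            for c in source_cols
            if c in sink_cols
        ],
    }

translator = build_translator(
    source_cols=["id", "name"],                   # a 2-column file
    sink_cols=["id", "name", "city", "country"],  # destination table, abbreviated
)
print(json.dumps(translator, indent=2))
```

The resulting JSON can be wired into the Copy Activity's `translator` property via dynamic content, so each file's actual columns drive the mapping instead of a fixed schema.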