
I want to update a target CSV file (located in Azure Data Lake Store) with delta records that arrive every day (the delta file sits in blob storage). If a record already exists in the target, I want to update it; if a delta record is new, I want to append it to the target CSV file in Azure Data Lake Store. I want to implement this with Azure Data Factory, preferably using an ADF Data Flow.
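For clarity, the merge behaviour I am after is equivalent to this rough pandas sketch (the key column "id" and the local file paths are just placeholders for illustration; the real files live in ADLS and blob storage):

```python
import pandas as pd

target = pd.read_csv("target.csv")  # existing records (Azure Data Lake Store)
delta = pd.read_csv("delta.csv")    # daily delta records (blob storage)

# Stack delta after target, then keep the last occurrence of each key:
# existing rows are replaced by their delta versions, new keys are appended.
merged = pd.concat([target, delta]).drop_duplicates(subset="id", keep="last")

merged.to_csv("target.csv", index=False)
```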

I am trying to do this with an Azure Data Factory Data Flow task, but I observed that while it can create a new target file after the merge, it cannot update the existing file.

Please let me know if there is any way, PowerShell or otherwise, to update the target file.


1 Answer


We have a sample template that shows you how to update an existing file from a new file using ADF Data Flows. The template uses Parquet, but the same approach works for CSV as well.

Go to New > Pipeline from Template and look for "Parquet CRUD Operations". You can open up that Data Flow to see how it's done.
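If you also need the merged output to replace the original target file in place (rather than landing as a new output file), one option outside of Data Flows is a short script against the Data Lake SDK. A minimal sketch, assuming ADLS Gen2 and the azure-storage-file-datalake package; the account, container, and file paths below are hypothetical placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://<account>.dfs.core.windows.net",  # hypothetical account
    credential=DefaultAzureCredential(),
)
file_client = service.get_file_system_client("mycontainer").get_file_client(
    "path/to/target.csv"  # hypothetical path to the target CSV
)

# Overwrite the existing target file with the merged CSV produced upstream.
with open("merged.csv", "rb") as f:
    file_client.upload_data(f, overwrite=True)
```

This could run as a post-processing step (for example, from an Azure Function or a custom activity) after the Data Flow writes its merged output.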