2 votes

I'm new to Azure Data Factory. I'm trying to solve the following problem:

  1. Read a CSV file from Azure Blob Storage
  2. Parse it row by row and dump each row into an existing Cosmos DB

I am currently looking into a solution that does:

  1. A Copy activity from the source (CSV) to a sink (Azure Storage Table)
  2. A ForEach activity that parses the table and copies the rows into the database

Is this a correct approach, and if it is, how should I set up the dynamic content of the ForEach activity?


Note:

I've tried this solution (link), but I get an error message saying:

Reading or replacing offers is not supported for serverless accounts

which means that

CosmosDB Serverless is not currently supported as Sink for the Data flow in Azure Data Factory.

Did you try using a Lookup activity to get the content of the table and then a ForEach over the rows? The expression should look like this: @activity('Lookup1').output.value – Leon Yue

1 Answer

1 vote

If you use Lookup + ForEach activities, the ForEach Items should be:

@activity('Lookup1').output.value
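
For example, a minimal pipeline sketch of that pattern could look like the following. The names (Lookup1, StagingTableDataset, currentRow) and the SetVariable placeholder inside the ForEach are only assumptions for illustration; replace the placeholder with whatever per-row write step you actually need:

    {
        "name": "CsvRowsToCosmosDbPipeline",
        "properties": {
            "variables": {
                "currentRow": { "type": "String" }
            },
            "activities": [
                {
                    "name": "Lookup1",
                    "type": "Lookup",
                    "typeProperties": {
                        "source": { "type": "AzureTableSource" },
                        "dataset": {
                            "referenceName": "StagingTableDataset",
                            "type": "DatasetReference"
                        },
                        "firstRowOnly": false
                    }
                },
                {
                    "name": "ForEach1",
                    "type": "ForEach",
                    "dependsOn": [
                        { "activity": "Lookup1", "dependencyConditions": [ "Succeeded" ] }
                    ],
                    "typeProperties": {
                        "isSequential": true,
                        "items": {
                            "value": "@activity('Lookup1').output.value",
                            "type": "Expression"
                        },
                        "activities": [
                            {
                                "name": "SetCurrentRow",
                                "type": "SetVariable",
                                "typeProperties": {
                                    "variableName": "currentRow",
                                    "value": {
                                        "value": "@string(item())",
                                        "type": "Expression"
                                    }
                                }
                            }
                        ]
                    }
                }
            ]
        }
    }

Inside the ForEach, each row is available as @item(). Keep in mind that the Lookup activity output is limited to 5,000 rows and 4 MB, so this pattern only suits small files.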

Your proposed solution may be hard to achieve.

Since you have found that Data Flow doesn't support Cosmos DB Serverless as a sink, I think you can refer to this tutorial: Copy Data From Blob Storage To Cosmos DB Using Azure Data Factory

It uses a Copy activity to copy data from a CSV file in Blob Storage to Azure Cosmos DB directly.
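
As a rough sketch of that approach (the dataset names CsvBlobDataset and CosmosDbCollectionDataset are placeholders, not taken from the tutorial), the Copy activity could look like this:

    {
        "name": "CopyCsvToCosmosDb",
        "type": "Copy",
        "typeProperties": {
            "source": {
                "type": "DelimitedTextSource",
                "storeSettings": {
                    "type": "AzureBlobStorageReadSettings",
                    "recursive": false
                },
                "formatSettings": {
                    "type": "DelimitedTextReadSettings"
                }
            },
            "sink": {
                "type": "CosmosDbSqlApiSink",
                "writeBehavior": "insert"
            }
        },
        "inputs": [
            { "referenceName": "CsvBlobDataset", "type": "DatasetReference" }
        ],
        "outputs": [
            { "referenceName": "CosmosDbCollectionDataset", "type": "DatasetReference" }
        ]
    }

The source would be a DelimitedText dataset pointing at your blob (with "First row as header" enabled so the column names become document properties), and the sink an Azure Cosmos DB (SQL API) dataset pointing at your target container. Unlike Data Flow, the Copy activity is not affected by the serverless limitation mentioned above.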