
I have an Azure Data Factory Copy Activity within a pipeline that copies data from a Blob container (JSON files spread across multiple virtual folders) to Cosmos DB. However, there are unavoidable edge cases where files larger than 2 MB end up in the Blob storage. When the Copy Activity picks them up, the transfer (and the subsequent pipeline activities) fails because I hit the 2 MB hard limit of Cosmos DB. I have tried setting up a Lookup / Get Metadata activity, but I can't seem to reference the relevant size property correctly or produce the output needed for a Delete activity.
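
For reference, the Copy Activity is shaped roughly like the sketch below (the activity, dataset, and type names here are illustrative placeholders rather than my real ones):

```json
{
    "name": "CopyJsonsToCosmos",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobJsonFiles", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "CosmosDbCollection", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource", "recursive": true },
        "sink": { "type": "DocumentDbCollectionSink", "writeBehavior": "insert" }
    }
}
```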

Can anyone advise on an approach to handle this?

Thank you.


1 Answer


It should be possible to get the size of a file with the Get Metadata activity, but please note that the size is returned in bytes and can only be retrieved for a single file, not for a folder.

[Screenshot: Get Metadata activity field list with the Size property, returned in bytes]
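
For example, a Get Metadata activity pointed at a single-file dataset could be shaped like this (the activity and dataset names are placeholders):

```json
{
    "name": "GetFileSize",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": { "referenceName": "SourceJsonFile", "type": "DatasetReference" },
        "fieldList": [ "size" ]
    }
}
```

The value is then available to downstream activities as `@activity('GetFileSize').output.size`.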

As far as I know, there is no way to avoid the 2 MB limit on a Cosmos DB document. You could refer to this question: What is the size limit of a single document stored in Azure Cosmos DB.
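
One way to use that size is to branch before the copy: run Get Metadata per file (for example inside a ForEach over the blob list), then wrap the Copy and Delete activities in an If Condition that compares the size against 2,097,152 bytes. A rough sketch, again with placeholder activity and dataset names (GetFileSize, SourceJsonFile, CosmosDbCollection) and the classic BlobSource / DocumentDbCollectionSink type names:

```json
{
    "name": "CheckFileSize",
    "type": "IfCondition",
    "dependsOn": [
        { "activity": "GetFileSize", "dependencyConditions": [ "Succeeded" ] }
    ],
    "typeProperties": {
        "expression": {
            "value": "@less(activity('GetFileSize').output.size, 2097152)",
            "type": "Expression"
        },
        "ifTrueActivities": [
            {
                "name": "CopyJsonToCosmos",
                "type": "Copy",
                "inputs": [ { "referenceName": "SourceJsonFile", "type": "DatasetReference" } ],
                "outputs": [ { "referenceName": "CosmosDbCollection", "type": "DatasetReference" } ],
                "typeProperties": {
                    "source": { "type": "BlobSource" },
                    "sink": { "type": "DocumentDbCollectionSink" }
                }
            }
        ],
        "ifFalseActivities": [
            {
                "name": "DeleteOversizedFile",
                "type": "Delete",
                "typeProperties": {
                    "dataset": { "referenceName": "SourceJsonFile", "type": "DatasetReference" },
                    "enableLogging": false
                }
            }
        ]
    }
}
```

Files under the limit go to the true branch and get copied; anything at or above it falls into the false branch and is deleted (or could be logged/moved instead) rather than failing the whole pipeline.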