I have an Azure Data Factory Copy Activity within a pipeline, copying data from a Blob container (JSON files spread across multiple virtual folders) to Cosmos DB. However, there are unavoidable fringe cases where files larger than 2MB land in the Blob storage. When the Copy Activity picks them up, the transfer (and the subsequent pipeline activities) fails because I hit the 2MB hard limit for a Cosmos DB document. I have tried setting up a Lookup / Get Metadata activity, but I can't seem to address the relevant size property properly or produce the output that the Delete activity needs.
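For reference, this is roughly the shape of what I have been attempting (a sketch only; names like GetFileSize, CheckSize and BlobJsonFile are placeholders for my actual activity and dataset names):

```json
[
    {
        "name": "GetFileSize",
        "type": "GetMetadata",
        "typeProperties": {
            "dataset": {
                "referenceName": "BlobJsonFile",
                "type": "DatasetReference"
            },
            "fieldList": [ "size" ]
        }
    },
    {
        "name": "CheckSize",
        "type": "IfCondition",
        "dependsOn": [
            { "activity": "GetFileSize", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
            "expression": {
                "value": "@less(activity('GetFileSize').output.size, 2097152)",
                "type": "Expression"
            },
            "ifTrueActivities": [],
            "ifFalseActivities": []
        }
    }
]
```

The idea was to put the Copy Activity inside ifTrueActivities and a Delete activity for the oversized blob inside ifFalseActivities, but I'm not sure this is the right way to wire a per-file size check when the source is a whole set of folders rather than a single file.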
Can anyone advise on an approach to handle this?
Thank you.