I have a Stream Analytics job that continuously dumps data into Cosmos DB. Each document has an integer property "Type", with a value of either 1 or 2, which determines the shape of the payload, i.e. which columns the document contains.
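For illustration, the two document shapes look roughly like this (the non-"Type" property names here are made up; only "Type" is the real discriminator):

```json
[
  { "id": "doc-1", "Type": 1, "deviceId": "sensor-01", "temperature": 21.5 },
  { "id": "doc-2", "Type": 2, "deviceId": "sensor-01", "status": "ok" }
]
```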
I'm using Azure Data Factory V2 to copy data from Cosmos DB to Data Lake. I've created a pipeline with a copy activity that does this job, and I'm setting the output folder path using:
@concat('datafactoryingress/rawdata/',dataset().productFilter,'/',formatDateTime(utcnow(),'yyyy'),'/')
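To show where that expression sits, here is roughly what my sink dataset definition looks like (the dataset name is a placeholder and the structure is approximate, but the `folderPath` expression is the one above):

```json
{
  "name": "DataLakeSinkDataset",
  "properties": {
    "type": "AzureDataLakeStoreFile",
    "typeProperties": {
      "folderPath": {
        "value": "@concat('datafactoryingress/rawdata/', dataset().productFilter, '/', formatDateTime(utcnow(), 'yyyy'), '/')",
        "type": "Expression"
      }
    },
    "parameters": {
      "productFilter": { "type": "String" }
    }
  }
}
```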
What I want is for Data Factory to inspect each document, determine whether its Type is 1 or 2, and write it to folder 1 or folder 2 accordingly. In other words, I want to iterate over the data from Cosmos DB, segregate it by message type, and set the output folder path dynamically based on that type.
Is there a way to do that? Can I inspect a Cosmos DB document to find out its message type, and if so, how do I set the folder path dynamically based on it?
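To make the segregation I'm after concrete, here is a plain-Python sketch of the logic I'd like the pipeline to perform (the sample documents and folder naming scheme are just illustrations, not my real data):

```python
# Sketch of the desired routing: pick a target folder from the
# document's "Type" property. In reality the documents come from
# the Cosmos DB collection fed by the Stream Analytics job.

def folder_for(doc):
    """Return the Data Lake folder for a document based on its "Type"."""
    msg_type = doc["Type"]
    if msg_type not in (1, 2):
        raise ValueError(f"Unexpected message type: {msg_type}")
    # Hypothetical folder scheme: one subfolder per message type.
    return f"datafactoryingress/rawdata/type{msg_type}/"

# Hypothetical sample documents standing in for Cosmos DB output.
docs = [
    {"id": "doc-1", "Type": 1, "temperature": 21.5},
    {"id": "doc-2", "Type": 2, "status": "ok"},
]

# Map each document id to its target folder.
routed = {doc["id"]: folder_for(doc) for doc in docs}
```

This is the behavior I'd like to express in the pipeline itself rather than in custom code.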