We have a number of files in Azure Blob Storage in TSV format, and we want to move them to ADLS Gen2 as Parquet. This should run daily, so the ADF pipeline will write a set of Parquet files into folders named by date, for example:
../../YYYYMMDD/*.parquet
On the other side, an API will consume this data. How can the API tell whether the migration for a particular day has completed?
Basically, is there a built-in ADF feature to write a done file or _SUCCESS file that the API can rely on?
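For context, the pattern I have in mind is the one below, sketched against a local filesystem just to illustrate (the folder paths and function names are made up): the producer writes an empty `_SUCCESS` marker only after all data files for the day have landed, and the consumer treats the day's folder as ready only when the marker exists.

```python
from pathlib import Path

def write_success_marker(day_folder: str) -> Path:
    """Producer side: write an empty _SUCCESS marker after all
    data files for the day have been written."""
    marker = Path(day_folder) / "_SUCCESS"
    marker.touch()
    return marker

def is_day_complete(day_folder: str) -> bool:
    """Consumer side: treat the day's data as ready only if the
    _SUCCESS marker is present."""
    return (Path(day_folder) / "_SUCCESS").exists()
```

In ADF the equivalent would presumably be some final activity in the pipeline, chained on success of the copy, that writes this zero-byte marker, but I'm not sure whether anything like that exists out of the box.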
Thanks