1 vote

I am trying to create an Event Trigger in Azure Data Factory that fires on Blob Created events in my Azure Storage container. The challenge I am facing is what happens if I receive multiple files in a single batch, say 10 files.

What happens now is that the trigger fires 10 times, and across those 10 runs the 10 files end up being processed at least 100 times by the Data Factory.
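A likely explanation (an assumption, not confirmed in this thread): a Blob Created trigger fires once per blob, so if the pipeline's source dataset reads the whole folder (e.g. a `*.xml` wildcard), each of the 10 runs reprocesses all 10 files, giving roughly 10 × 10 = 100 executions. A common way to scope each run to its triggering blob is to pass the trigger's file name into the pipeline as a parameter. A sketch of such a trigger definition (all resource and pipeline names below are placeholders):

```json
{
  "name": "BlobCreatedTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/input/blobs/",
      "blobPathEndsWith": ".xml",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "XmlToJsonPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "sourceFolder": "@triggerBody().folderPath",
          "sourceFile": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

The pipeline can then point its source at `@pipeline().parameters.sourceFile` instead of a wildcard, so each run touches only the file that raised the event.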

Edit -- My pipeline performs an XML-to-JSON transformation.
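For reference, that kind of transformation can be sketched with the Python standard library. This is only an illustration, not the actual pipeline logic, and the element-to-dict mapping below is a simplifying assumption (attributes on leaf elements are dropped):

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an ElementTree element into a plain dict.

    Leaf elements collapse to their text; child elements are grouped
    into lists under their tag name (a simplifying assumption).
    """
    node = dict(elem.attrib)
    children = list(elem)
    if children:
        for child in children:
            node.setdefault(child.tag, []).append(element_to_dict(child))
    elif elem.text and elem.text.strip():
        return elem.text.strip()
    return node

def xml_to_json(xml_text):
    """Parse an XML string and return a JSON string."""
    root = ET.fromstring(xml_text)
    return json.dumps({root.tag: element_to_dict(root)}, indent=2)

sample = "<order id='1'><item>book</item><item>pen</item></order>"
print(xml_to_json(sample))
```

In ADF itself this step would typically be a Copy activity or Data Flow rather than custom code; the sketch just shows the shape of the conversion.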


The error I get:

[error screenshots, not transcribed]

The issue I am facing is similar to the question "Azure Data Factory - Event based triggers on multiple files/blobs".

Hey @amitagarwal, can you please clarify your ask here? Also, is there a fixed number of files you would receive, or fixed timings? - Nandan
Can you not just capture the metadata of the files within the container and then execute whatever process is required on each file via a foreach loop? - iamdave
In theory, creating a blob will execute a pipeline run. Can you show us your pipeline logic? - Joseph Xu
@Josephxu -- please check my edited question. - amit agarwal
Hi @amit agarwal, so when an XML file is uploaded to the container, it fires a pipeline run to convert the XML file into a JSON file? - Joseph Xu

1 Answer

0
votes

I've created a simple test. When I uploaded 10 XML files, the trigger fired once per file:

  1. Set the Wildcard file path as *.xml.

  2. Then I created an event trigger.

  3. When I uploaded 10 XML files at one time via Azure Storage Explorer, I could see 10 pipeline runs, and 10 blobs in the output container.
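The bulk upload in step 3 can be reproduced from a shell. The script below generates 10 sample XML files locally; the commented `az storage blob upload-batch` call (account and container names are placeholders) shows the kind of one-shot upload used in the test:

```shell
# Generate 10 small XML files locally.
mkdir -p ./xmlfiles
for i in $(seq 1 10); do
  printf '<?xml version="1.0"?><doc><id>%d</id></doc>\n' "$i" > "./xmlfiles/file$i.xml"
done
ls ./xmlfiles | wc -l   # 10 files ready to upload

# Hypothetical upload step (requires an authenticated Azure CLI session;
# replace the placeholders with your own storage account and container):
# az storage blob upload-batch \
#   --account-name <storage-account> \
#   --destination <container> \
#   --source ./xmlfiles
```

Uploading the batch this way should raise one Blob Created event per file, so 10 trigger firings and 10 pipeline runs are the expected behavior.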

So what are the differences between yours and mine?