2
votes

I've set up an Azure Data Factory pipeline containing a copy activity. For testing purposes, both source and sink are Azure Blob Storage accounts.

I want to execute the pipeline as soon as a new file is created in the source Azure Blob Storage.

I've created a trigger of type BlobEventsTrigger. Blob path begins with has been set to //

I use Cloud Storage Explorer to upload files, but it doesn't trigger my pipeline. To narrow down the problem, how can I check whether the event is fired? Any idea what could be wrong?

Thanks


5 Answers

2
votes

Reiterating what others have stated:

  • Must be using a V2 Storage Account
  • Trigger name must only contain letters, numbers and the '-' character (this restriction will soon be removed)
  • Must have registered subscription with Event Grid resource provider (this will be done for you via the UX soon)
  • The trigger makes the following properties available: @triggerBody().folderPath and @triggerBody().fileName. To use these in your pipeline, you must map them to pipeline parameters and reference them as @pipeline().parameters.parameterName.

Finally, based on your configuration: setting Blob path begins with to // will not match any blob event. The UX will actually show you an error message saying that value is not valid. Please refer to the Event-Based Trigger documentation for examples of valid configuration.
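To make the two points above concrete, here is a minimal sketch of what a BlobEventsTrigger definition can look like in JSON, showing a valid blobPathBeginsWith value and the mapping of trigger properties to pipeline parameters. All names (NewBlobTrigger, CopyPipeline, mycontainer, sourceFolder, sourceFile, and the placeholder IDs in scope) are hypothetical; substitute your own.

```json
{
  "name": "NewBlobTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/mycontainer/blobs/input/",
      "events": [ "Microsoft.Storage.BlobCreated" ],
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyPipeline",
          "type": "PipelineReference"
        },
        "parameters": {
          "sourceFolder": "@triggerBody().folderPath",
          "sourceFile": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```

The pipeline referenced here would need to declare sourceFolder and sourceFile as parameters, which the copy activity can then consume via @pipeline().parameters.sourceFolder and @pipeline().parameters.sourceFile.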

1
votes

There seems to be a bug with the Blob storage trigger: if more than one trigger is allocated to the same blob container, none of the triggers will fire.

For some reason (another bug, but this time in Data Factory?), if you edit your trigger several times in the Data Factory window, Data Factory seems to lose track of the triggers it creates, and your single trigger may end up creating multiple duplicate triggers on the blob storage. This condition activates the first bug discussed above: the blob storage trigger no longer fires.

To fix this, delete the duplicate triggers. Navigate to your blob storage resource in the Azure portal and go to the Events blade. From there you'll see all the event subscriptions that Data Factory added to your blob storage. Delete the duplicates.

0
votes

If you're creating your trigger via ARM template, make sure you're aware of this bug: the "runtimeState" (aka "Activated") property of the trigger can only be set to "Stopped" via ARM template. The trigger will need to be activated via PowerShell or the ADF portal.
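For the PowerShell route, the Az.DataFactory module provides Start-AzDataFactoryV2Trigger to activate a stopped trigger after deployment. A minimal sketch, assuming hypothetical resource names (MyResourceGroup, MyDataFactory, NewBlobTrigger):

```powershell
# Activate the trigger that the ARM template deployed in the "Stopped" state
Start-AzDataFactoryV2Trigger `
    -ResourceGroupName "MyResourceGroup" `
    -DataFactoryName "MyDataFactory" `
    -Name "NewBlobTrigger" `
    -Force
```

This is typically run as a post-deployment step in the same release pipeline that applies the ARM template.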

0
votes

And now, on 20.06.2021, the same happens for me: the event trigger is not working, even though when I edit its definition in Data Factory, the preview shows all the matching files in the folder. But when I add a new file to that folder, nothing happens!