2
votes

I have an Azure Data Factory pipeline in which I have to pass a parameter to a Databricks notebook activity. I have multiple event-based triggers (fired on updates to a blob folder) added for that pipeline. When a specific trigger fires, it should pass a parameter to the Databricks activity, and the notebook should run based on that parameter. Is there any way to pass a parameter from an event-based trigger to a Databricks notebook activity?


1 Answer

2
votes

The event-based trigger exposes two parameters:

  • @triggerBody().fileName

  • @triggerBody().folderPath

You will have to add this to the JSON code of the trigger:

        "parameters": {
            "FPath": "@triggerBody().folderPath"
        }
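
For context, a minimal sketch of where that block sits in the event trigger's JSON definition; the trigger name, pipeline name, container path, and parameter name here are placeholders, not values from the question:

    {
        "name": "BlobEventTrigger",
        "properties": {
            "type": "BlobEventsTrigger",
            "typeProperties": {
                "blobPathBeginsWith": "/mycontainer/blobs/input/",
                "events": [ "Microsoft.Storage.BlobCreated" ]
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "MyDatabricksPipeline",
                        "type": "PipelineReference"
                    },
                    "parameters": {
                        "FPath": "@triggerBody().folderPath"
                    }
                }
            ]
        }
    }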

This maps the trigger output to a pipeline parameter named FPath. Inside the pipeline, reference it as @pipeline().parameters.FPath and pass it on to the Databricks notebook activity through its base parameters (see the sketch after the link). Please refer to the link below for a detailed explanation:

https://www.mssqltips.com/sqlservertip/6063/create-event-based-trigger-in-azure-data-factory/
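
On the pipeline side, a minimal sketch of how the Databricks notebook activity can forward that value through its base parameters, assuming the pipeline declares a parameter named FPath; the activity name, notebook path, and linked service name are placeholders:

    {
        "name": "RunNotebook",
        "type": "DatabricksNotebook",
        "typeProperties": {
            "notebookPath": "/Shared/process_folder",
            "baseParameters": {
                "FPath": {
                    "value": "@pipeline().parameters.FPath",
                    "type": "Expression"
                }
            }
        },
        "linkedServiceName": {
            "referenceName": "AzureDatabricksLinkedService",
            "type": "LinkedServiceReference"
        }
    }

Inside the notebook, the value can then be read with dbutils.widgets.get("FPath").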