2 votes

In Azure Data Factory, how do I trigger a pipeline after other pipelines have completed successfully?

In detail:

I want to trigger an SSIS package after other pipelines have completed successfully. I already know I can save my SSIS package as a pipeline and run it with a trigger like the other pipelines. But how do I make sure the SSIS package pipeline starts only after the other pipelines have finished? Is there a feature for this in Azure, or do I need some kind of workaround?

Thanks in advance~


2 Answers

3 votes

You could always create a parent pipeline that uses Execute Pipeline and Execute SSIS Package activities. ADF V2 has the concept of activity dependencies, so add a Succeeded dependency between the Execute Pipeline activity and the Execute SSIS Package activity. Make sure to check the Wait on Completion box on the Execute Pipeline activity so that they run in sequence rather than in parallel. An activity can have multiple dependencies, so if the SSIS package needs to wait on three pipelines instead of just one, that still works.


Then, instead of triggering the other pipeline(s) and the SSIS package separately, you can just trigger the parent pipeline.
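
If you prefer to define the parent pipeline in code rather than in the authoring UI, below is a rough sketch using the ADF .NET management SDK (Microsoft.Azure.Management.DataFactory). It assumes an authenticated DataFactoryManagementClient named client, and the pipeline, package path, and integration runtime names are placeholders; the exact model members can differ between SDK versions.

using System.Collections.Generic;
using Microsoft.Azure.Management.DataFactory;
using Microsoft.Azure.Management.DataFactory.Models;

// The child pipeline runs first; WaitOnCompletion makes the parent wait for it.
var runChild = new ExecutePipelineActivity
{
    Name = "RunChildPipeline",
    Pipeline = new PipelineReference { ReferenceName = "ChildPipeline" },
    WaitOnCompletion = true
};

// The SSIS package only starts once the child pipeline activity succeeded.
var runSsis = new ExecuteSSISPackageActivity
{
    Name = "RunSsisPackage",
    PackageLocation = new SSISPackageLocation { PackagePath = "Folder/Project/Package.dtsx" },
    ConnectVia = new IntegrationRuntimeReference { ReferenceName = "MySsisIntegrationRuntime" },
    DependsOn = new List<ActivityDependency>
    {
        new ActivityDependency
        {
            Activity = "RunChildPipeline",
            DependencyConditions = new List<string> { "Succeeded" }
        }
    }
};

var parent = new PipelineResource
{
    Activities = new List<Activity> { runChild, runSsis }
};

client.Pipelines.CreateOrUpdate(resourceGroup, dataFactoryName, "ParentPipeline", parent);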

1 vote

Based on your description, I think you could monitor the Azure Data Factory pipeline execution status programmatically.
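
For context, the RunId used below is returned when you start the upstream pipeline run from code. A minimal sketch, assuming an authenticated DataFactoryManagementClient named client and a placeholder pipeline name:

// Kick off the upstream pipeline and capture its run ID for monitoring.
CreateRunResponse runResponse = client.Pipelines.CreateRun(
    resourceGroup, dataFactoryName, "MyUpstreamPipeline");
Console.WriteLine("Pipeline run ID: " + runResponse.RunId);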

Please add the following code to continuously check the status of the pipeline run, by its RunId, until it finishes.

PipelineRun pipelineRun;
while (true)
{
    pipelineRun = client.PipelineRuns.Get(resourceGroup, dataFactoryName, runResponse.RunId);
    Console.WriteLine("Status: " + pipelineRun.Status);
    if (pipelineRun.Status == "InProgress" || pipelineRun.Status == "Queued")
        System.Threading.Thread.Sleep(15000);
    else
        break;
}

Then start your SSIS package if the pipeline run succeeded.

if (pipelineRun.Status == "Succeeded")
   //..do your business
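
For example, if your SSIS package is wrapped in its own pipeline (the name "RunSsisPackagePipeline" below is a placeholder), the "do your business" part could simply start that pipeline:

if (pipelineRun.Status == "Succeeded")
{
    // Trigger the pipeline that wraps the Execute SSIS Package activity.
    CreateRunResponse ssisRun = client.Pipelines.CreateRun(
        resourceGroup, dataFactoryName, "RunSsisPackagePipeline");
    Console.WriteLine("Started SSIS package pipeline run: " + ssisRun.RunId);
}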

For more details, please refer to this document.