1 vote

Is it possible to create a pipeline in Azure DevOps that simply runs two or more independent pipelines (in parallel or in series) and does nothing else?

I have an application consisting of five independent microservices, each of which can be built and deployed separately via its individual build pipeline (YAML). However, it is often convenient for us to build and deploy them all together; e.g., when tagging the resulting artefacts as part of a new release. I've scoured the official docs but found nothing on how to do this. Pipeline triggers don't help me because I need to keep these pipelines decoupled, as they are frequently run individually. What I need is something like this (wishful pseudo-code):

trigger: none
pr: none

stages:
- stage: Pipeline1
  displayName: 'Pipeline 1'
  jobs:
  - job: Pipeline1
    displayName: 'Pipeline 1'
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
    # custom script to run Pipeline 1
    - script: dotnet run pipeline1
      displayName: 'Run Pipeline 1 through a script'
    # or, alternatively, an in-built task to do it
    - task: RunPipeline@4
      displayName: 'Run Pipeline 1 through a task'
      inputs:
        PipelineName: 'Pipeline 1'
        etc.: ...

- stage: Pipeline2
  jobs:
  - job: Pipeline2
  ...

Or it could be separate jobs within a single stage, or even separate steps within a single job - it doesn't matter. I just want an overarching "master pipeline" that will fire off all the individual pipelines and build all the services at the touch of a button.

I don't understand why you say that pipeline triggers don't help. With pipeline triggers you have a parent pipeline referencing child ones, so why do you think those pipelines become coupled? – Loul G.

3 Answers

1 vote

Azure pipeline to run two or more independent pipelines

To achieve this, we could use the Builds - Queue REST API to queue the other build pipelines from the master pipeline:

POST https://dev.azure.com/{organization}/{project}/_apis/build/builds?api-version=5.0

Here is the YAML file I tested:

pool:
  name: MyPrivateAgent

steps:
- task: PowerShell@2
  displayName: QueueBuildForPipelineA
  inputs:
    targetType: 'inline'
    script: |
      # Request body: the ID of the build definition to queue
      $body = '
      {
              "definition": {
                  "id": DefinitionIdHere
              }
      }'

      $bodyJson = $body | ConvertFrom-Json
      Write-Output $bodyJson
      $bodyString = $bodyJson | ConvertTo-Json -Depth 100
      Write-Output $bodyString

      # Authenticate with a personal access token (PAT)
      $user = "test"
      $token = "PAT"
      $base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))

      $Uri = "https://dev.azure.com/YourOrganization/YourProjectName/_apis/build/builds?api-version=5.0"
      $buildresponse = Invoke-RestMethod -Method Post -UseDefaultCredentials -ContentType application/json -Uri $Uri -Body $bodyString -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
  condition: and(always(), eq(variables['TriggerPipelineA'], 'True'))

- task: PowerShell@2
  displayName: QueueBuildForPipelineB
  ...

Besides, I also add a condition to each task, condition: and(always(), eq(variables['TriggerPipelineA'], 'True')), and define the variable TriggerPipelineA in the pipeline's Variables. That way we can freely choose which pipelines to trigger by overriding the value of that variable to False when queueing the master pipeline.
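
If you queue the master pipeline from the command line instead of the web UI, the same override can be passed there. A minimal sketch, assuming the Azure DevOps CLI extension is installed, TriggerPipelineA is marked as settable at queue time, and the master pipeline name below is illustrative:

az pipelines run --name "MasterPipeline" --org https://dev.azure.com/YourOrganization --project YourProjectName --variables TriggerPipelineA=False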


Hope this helps.

1 vote

I ended up achieving the result I wanted using Azure Pipeline templates. It's not exactly what I wanted, as I first had to rewrite each individual pipeline to run from a template: the stage definitions moved into a separate template file, and the variables are defined in the calling pipeline (which is now nearly empty). After that, I was able to write my master pipeline to call the individual templates one after another, and it worked seamlessly. They ran in series (one after the other); there might be a way to make them run in parallel, but I didn't need that as I don't mind how long it takes.

Top tip for anyone who tries this: you can define variables at the top level (i.e., in the calling pipeline) and they will be available within the template files. The template files are simply stapled into the pipeline just before it's run, much like server-side includes (or #includes for anyone as old as me). You only need to use template parameters if your individual pipelines share a variable name that needs to be set differently for each one (e.g., "service-name"). Be aware that template parameters (${{}}) are evaluated before the pipeline is run (much like #define constants in C) whereas variables ($()) are evaluated at run-time; so if one or more of your pipelines aren't working as expected, you might need to turn a variable into a template parameter.
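
For illustration, here is a minimal sketch of the master pipeline this answer describes, assuming each microservice's stages have been moved into its own template file (the file names and the serviceName parameter are illustrative, not the actual ones I used):

trigger: none
pr: none

variables:
  buildConfiguration: 'Release'   # top-level variables are visible inside the templates

stages:
# each template file contains the stage definitions that used to live in the individual pipeline
- template: templates/service1-stages.yml
  parameters:
    serviceName: 'service1'   # use a template parameter where each pipeline needs a different compile-time value
- template: templates/service2-stages.yml
  parameters:
    serviceName: 'service2'

Each template file then declares its parameters block at the top and contains the stage list it previously defined inline.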

0 votes

Assuming your pipelines are all in the same project, and assuming you're using Windows-based build agents whose services are running under a Windows account that has permission to access your project in Azure DevOps (as opposed to LOCAL SYSTEM et cetera), you can run the script below as a build task for the purpose of triggering other builds, which could then trigger releases.

param(
    [string[]] $namesOfBuildsToTrigger,
    [string] $azDoUrl,
    [string] $projectCollection,
    [string] $project
)

Set-StrictMode -Version Latest
$ErrorActionPreference = "Stop";

$azDoUrl = "$azDoUrl/$projectCollection/$project"
$buildsUrl = $azDoUrl + '/_apis/build/builds?api-version=2.0'
$buildDefsUrl = $azDoUrl + '/_apis/build/definitions?api-version=2.0'

foreach ($nameOfBuildToStart in $namesOfBuildsToTrigger) {
    try {
        $buildDefinitions = (Invoke-RestMethod -Uri ($buildDefsUrl) -Method GET -UseDefaultCredentials).Value
        $buildDefinitions | Where-Object { $_.name -eq $nameOfBuildToStart } | ForEach-Object {
            $body = '{ "definition": { "id": ' + $_.id + ' }, "reason": "Manual", "priority": "Normal" }'

            Write-Host "Queueing $($_.name)" 

            # Trigger new build 
            $result = Invoke-RestMethod -Method Post -Uri $buildsUrl -ContentType 'application/json' -Body $body -Verbose -UseDefaultCredentials
            $result
        }
    }
    catch {
        $_ | Out-File "$PSScriptRoot\ErrorLog_$(get-date -f yyyy-MM-dd-ss).log"
    }
}
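
For reference, calling the script from a PowerShell build task might look something like this (the script path, server URL, collection, and build definition names are illustrative):

.\Queue-Builds.ps1 -namesOfBuildsToTrigger 'Service1-CI','Service2-CI' -azDoUrl 'https://tfs.example.com' -projectCollection 'DefaultCollection' -project 'MyProject'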