
I am trying to build CI/CD for Azure Data Factory using Azure DevOps. I am able to set up the Pipeline and Release; however, I have a problem:

Dataset

I have 2 environments, DEV and PROD. How can I pass a parameter in the CD pipeline to change the file path (e.g. dev and prod) in each deployment stage (DEV and PROD environments) in the Sink and Source?


Thank you for your help!

In my opinion, the path should be configured correctly once in the dataset in each environment and shouldn't ever be deployed again. However, the MS-recommended ADF CI/CD deploys everything, including config, every time. If you want to proceed with the official process, you can force the ARM template in adf_publish to generate parameters, using this process: docs.microsoft.com/en-us/azure/data-factory/… – Nick.McDermaid
You can also parameterise them so they are populated at run time. So have a think about when you would prefer to configure the environmental parameters: 1. Once, upon initial deployment; 2. Every time you migrate code; 3. Every time you run. – Nick.McDermaid
Thank you Nick for your reply! Suppose I go with the approach of overriding the defaults by using the custom parameterization template, arm-template-parameters-definition.json. Since I have 2 environments, Dev and Prod, and in each environment the pipeline's dataset points to a different root folder, will the template be able to take care of both environments, or do I have to handle it every time it runs? Thank you!!! – burberry398
Since your objective is to follow the Azure DevOps CI/CD, you should follow the link I posted, parameterise your dataset, and set the value at deployment time in your deployment pipeline. – Nick.McDermaid
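
A minimal sketch of the custom parameterization file mentioned above (arm-template-parameters-definition.json), assuming a file-based dataset whose folder path and file name should be surfaced as ARM template parameters. Per the linked docs, "=" keeps the current value as the parameter's default, which the release pipeline can then override per stage:

{
    "Microsoft.DataFactory/factories/datasets": {
        "properties": {
            "typeProperties": {
                "folderPath": "=",
                "fileName": "="
            }
        }
    }
}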

1 Answer


There is another approach: publish ADF from the master (collaboration) branch. You can define (replace) the value of every single node (property) in a JSON file (ADF object). This solves your problem, as you can provide a separate CSV config file per environment (stage).

Example of a CSV config file (config-stage-UAT.csv):

type,name,path,value
pipeline,PL_CopyMovies,activities[0].outputs[0].parameters.BlobContainer,UAT
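
For a PROD stage, you would keep a parallel config file with production values (hypothetical file name config-stage-PROD.csv), e.g.:

type,name,path,value
pipeline,PL_CopyMovies,activities[0].outputs[0].parameters.BlobContainer,PROD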

Then just run this cmdlet in PowerShell:

Publish-AdfV2FromJson -RootFolder "$RootFolder" -ResourceGroupName "$ResourceGroupName" -DataFactoryName "$DataFactoryName" -Location "$Location" -Stage "stage-UAT"
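
In an Azure DevOps release, this can run from a PowerShell task. A minimal sketch, assuming the module is installed from the PowerShell Gallery and using hypothetical PROD resource names:

# Install the module (PowerShell Gallery)
Install-Module -Name azure.datafactory.tools -Scope CurrentUser -Force

# Same publish step, pointed at the PROD config (config-stage-PROD.csv)
Publish-AdfV2FromJson -RootFolder "$RootFolder" -ResourceGroupName "rg-adf-prod" -DataFactoryName "adf-myfactory-prod" -Location "westeurope" -Stage "stage-PROD"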

Check this out: azure.datafactory.tools (PowerShell module)