
I have a Data Factory pipeline which takes the following parameters (see the sketch after the list):

  • Param1
  • Database Server 1 Name
  • Database Server 2 Name
  • Database Server 1 Username
  • Database Server 2 Username
  • etc
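
In pipeline JSON terms, a rough sketch of what that parameter block looks like today (pipeline and parameter names are illustrative):

```json
{
  "name": "ImportPipeline",
  "properties": {
    "parameters": {
      "Param1":            { "type": "String" },
      "DbServer1Name":     { "type": "String" },
      "DbServer2Name":     { "type": "String" },
      "DbServer1Username": { "type": "String" },
      "DbServer2Username": { "type": "String" }
    },
    "activities": []
  }
}
```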

My pipeline decides, via some logic, which database server to import from.

Essentially I want to deploy two versions of my pipeline: one runs in dev and the other in prod.

I want to release a dev and a prod version of my pipeline via Azure DevOps. Each environment's release should provide (via Key Vault) the values of:

  • Database Server 1 Name
  • Database Server 2 Name
  • Database Server 1 Username
  • Database Server 2 Username

First prize would be if those values no longer showed up as parameters in the pipeline at all, so that triggers would only have to provide Param1. Likewise, if I manually run the pipeline, I want to provide just Param1.

EDIT: Note that I eventually use the parameters in a parameterized linked service, if that makes a difference (https://docs.microsoft.com/en-us/azure/data-factory/parameterize-linked-services).
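
For reference, a minimal sketch of that kind of parameterized linked service, following the pattern in the doc above (the linked service name, database name, and parameter names are illustrative):

```json
{
  "name": "ImportSqlDatabase",
  "properties": {
    "type": "AzureSqlDatabase",
    "parameters": {
      "serverName": { "type": "String" },
      "userName":   { "type": "String" }
    },
    "typeProperties": {
      "connectionString": "Data Source=@{linkedService().serverName};Initial Catalog=ImportDb;User ID=@{linkedService().userName};"
    }
  }
}
```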


1 Answer


I think the key to resolving your problem is to use two separate Data Factory instances.

In the DEV environment you keep your parameterized connection as you stated above. When promoting the code to PROD, you export the ARM template and import it into the other instance. There you have an additional config file that supplies the values needed to set up the connection properly.
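
A sketch of such a config file, i.e. an ARM template parameter file for the PROD release (the exact parameter names depend on what the exported template generates; the ones below are illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": { "value": "adf-import-prod" },
    "ImportSqlDatabase_connectionString": {
      "value": "Data Source=prod-server-1.database.windows.net;Initial Catalog=ImportDb;User ID=prodUser;"
    }
  }
}
```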

If you want to avoid having the credentials stored in the config file, add an Azure Key Vault linked service and parameterize the secret identifier accordingly. When you import the template into PROD, you then don't need to provide any parameter beyond the identifier of the secret to grab from Key Vault.
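
A sketch of what that could look like. First the Key Vault linked service itself (vault name illustrative):

```json
{
  "name": "ImportKeyVault",
  "properties": {
    "type": "AzureKeyVault",
    "typeProperties": {
      "baseUrl": "https://my-import-kv.vault.azure.net/"
    }
  }
}
```

Then the database linked service can pull its connection string from a secret whose name is passed in as a parameter (assuming, per the parameterization doc, that an expression is accepted for the secret name):

```json
{
  "name": "ImportSqlDatabase",
  "properties": {
    "type": "AzureSqlDatabase",
    "parameters": {
      "secretName": { "type": "String" }
    },
    "typeProperties": {
      "connectionString": {
        "type": "AzureKeyVaultSecret",
        "store": {
          "referenceName": "ImportKeyVault",
          "type": "LinkedServiceReference"
        },
        "secretName": "@{linkedService().secretName}"
      }
    }
  }
}
```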

See here for more info:

  • DevOps integration
  • Key Vault integration