I am trying to implement DevOps for Azure Data Factory (ADF), and it worked for pipelines whose activities fetch data from an ADLS location and from SQL.
But now I have a pipeline in which one of the activities runs a jar file from a DBFS location, as shown below.
This pipeline runs a jar file that sits in the DBFS location and then proceeds.
The connection parameters for the cluster are as shown below.
While deploying the ARM template from the dev ADF to the UAT instance, which points to the UAT instance of Databricks, I was not able to override any of the cluster connection details from the arm_template_parameter.json file.
How can I configure the workspace URL and cluster ID for the UAT/PROD environments at the time of ARM deployment? There is no entry for any of the cluster details in the arm_template_parameter.json file.
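From what I understand of the ADF docs, linked-service properties like the Databricks workspace URL and existing cluster ID are not exported as ARM parameters by default; they only become parameters if they are listed in the parameterization template (arm-template-parameters-definition.json, editable via Manage > ARM template > Edit parameter configuration). Below is a minimal sketch of what I assume the linked-service section would need, where the property names domain and existingClusterId are my assumptions based on the Databricks linked service JSON:

```json
{
    "Microsoft.DataFactory/factories/linkedServices": {
        "AzureDatabricks": {
            "properties": {
                "typeProperties": {
                    "domain": "=",
                    "existingClusterId": "="
                }
            }
        }
    }
}
```

My understanding is that "=" keeps the current value as the parameter's default, so the UAT/PROD workspace URL and cluster ID could then be overridden at deployment time. Is this the right way to do it, or is there something else I am missing?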
As shown in the first picture, there is an activity that picks up the jar file from the DEV instance's DBFS location, with a system-generated jar file name. Will it fail when the ARM template for this pipeline is deployed to other environments? If so, how can I deploy the same jar file with the same name to the DEV/PROD Databricks DBFS locations?
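One idea I am considering is to have the release pipeline upload the jar to a fixed, non-system-generated path in each workspace, so the activity always references the same DBFS path in every environment. A rough sketch using the Databricks CLI (the local path ./target/etl-job.jar, the target dbfs:/FileStore/jars/etl-job.jar, and the host/token placeholders are all made up for illustration):

```bash
# Point the CLI at the target workspace (UAT/PROD) using that environment's URL and token
export DATABRICKS_HOST="https://<uat-workspace-url>"
export DATABRICKS_TOKEN="<uat-access-token>"

# Copy the locally built jar to the same DBFS path in every environment
databricks fs cp ./target/etl-job.jar dbfs:/FileStore/jars/etl-job.jar --overwrite
```

Would something like this be the recommended way to keep the jar name and path consistent across environments, or is there a better pattern?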
Any leads appreciated!