I am trying to implement DevOps for Azure Data Factory (ADF) and Azure Databricks.
I have completed the DevOps implementation for the ADF pipelines and the Databricks notebook files.
After deployment, there are issues with the ADF pipelines that reference jar files stored in a DBFS location.
One of the pipelines is shown below. The path to the jar file is:
dbfs:/FileStore/jars/xxx............xxx1_0_SNAPSHOT-c073a.jar
After deploying the ADF pipelines to the PROD environment, where they point to the PROD Databricks workspace, no jar file with the same name is available at that DBFS path. This is causing the pipelines to fail in PROD.
How can I fetch the jars from the DEV DBFS location and deploy them to the PROD DBFS location under the same name so that the ADF pipelines run?
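For reference, this is roughly the kind of DEV-to-PROD copy I have in mind (only a sketch, not working code): it reads the jar from the DEV workspace over the DBFS REST API (/api/2.0/dbfs/read) and streams it into the PROD workspace (/api/2.0/dbfs/create, add-block, close). The workspace URLs, tokens, and jar name are placeholders, not my actual values.

```python
import requests

# Placeholder values; the real URLs/tokens would come from pipeline variables,
# and the real jar name is redacted in this question.
DEV_HOST = "https://adb-dev-workspace.azuredatabricks.net"
PROD_HOST = "https://adb-prod-workspace.azuredatabricks.net"
DEV_TOKEN = "<dev-pat>"
PROD_TOKEN = "<prod-pat>"
# The DBFS REST API takes the path without the dbfs: scheme prefix.
JAR_PATH = "/FileStore/jars/my_app-1_0_SNAPSHOT.jar"  # hypothetical name

CHUNK = 1024 * 1024  # dbfs/read and dbfs/add-block are limited to 1 MB per call


def hdrs(token):
    return {"Authorization": f"Bearer {token}"}


def copy_dbfs_file(src_host, src_token, dst_host, dst_token, path):
    # Open a streaming upload handle on the destination (PROD) workspace,
    # overwriting any existing file at the same path.
    handle = requests.post(
        f"{dst_host}/api/2.0/dbfs/create",
        headers=hdrs(dst_token),
        json={"path": path, "overwrite": True},
    ).json()["handle"]

    # Read the source (DEV) file in 1 MB chunks; the API returns base64 data,
    # which add-block also expects, so each chunk can be forwarded as-is.
    offset = 0
    while True:
        resp = requests.get(
            f"{src_host}/api/2.0/dbfs/read",
            headers=hdrs(src_token),
            params={"path": path, "offset": offset, "length": CHUNK},
        ).json()
        if resp["bytes_read"] == 0:
            break
        requests.post(
            f"{dst_host}/api/2.0/dbfs/add-block",
            headers=hdrs(dst_token),
            json={"handle": handle, "data": resp["data"]},
        )
        offset += resp["bytes_read"]

    # Close the handle so the jar becomes visible at the same path in PROD.
    requests.post(
        f"{dst_host}/api/2.0/dbfs/close",
        headers=hdrs(dst_token),
        json={"handle": handle},
    )


copy_dbfs_file(DEV_HOST, DEV_TOKEN, PROD_HOST, PROD_TOKEN, JAR_PATH)
```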
The method I am currently following to implement DevOps is given below.
- I have created a build pipeline that points to the Git repository and builds the jar files by executing the pom.xml file.
- Created a release pipeline that copies the jar files from the build artifact to FileStore/jars/ (a rough sketch of this copy step is shown after the list).
- The ADF pipeline then points to the jar available in FileStore/jars/.
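For clarity, the release-stage copy I am describing would look roughly like the sketch below: it pushes the jar produced by the Maven build into the target workspace's /FileStore/jars/ under a fixed name via the DBFS REST API. The host, token, and file names are placeholders (set per stage), not my actual values.

```python
import base64
import requests

HOST = "https://adb-prod-workspace.azuredatabricks.net"  # per-stage variable (placeholder)
TOKEN = "<pat-for-this-stage>"                           # placeholder secret
LOCAL_JAR = "drop/target/my_app-1_0_SNAPSHOT.jar"        # hypothetical artifact path
DBFS_PATH = "/FileStore/jars/my_app-1_0_SNAPSHOT.jar"    # the name the ADF pipeline expects

HDRS = {"Authorization": f"Bearer {TOKEN}"}
CHUNK = 1024 * 1024  # dbfs/add-block accepts at most 1 MB per call

# Open a streaming handle, overwriting any previous version of the jar.
handle = requests.post(
    f"{HOST}/api/2.0/dbfs/create",
    headers=HDRS,
    json={"path": DBFS_PATH, "overwrite": True},
).json()["handle"]

# Send the jar in base64-encoded 1 MB blocks.
with open(LOCAL_JAR, "rb") as f:
    while True:
        block = f.read(CHUNK)
        if not block:
            break
        requests.post(
            f"{HOST}/api/2.0/dbfs/add-block",
            headers=HDRS,
            json={"handle": handle, "data": base64.b64encode(block).decode()},
        )

# Close the handle so the file becomes visible at DBFS_PATH.
requests.post(f"{HOST}/api/2.0/dbfs/close", headers=HDRS, json={"handle": handle})
```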
Is there an alternative method to resolve this, or is this the proper approach?