Is there a way to access a jar residing on a Gen2 Data Lake storage account and
do a spark-submit from a Databricks workspace, or even from Azure ADF?
(The communication between the workspace and Gen2 storage is
secured with the "fs.azure.account.key" setting.) Unfortunately, you cannot reference a
jar residing on Azure Storage, such as an ADLS Gen2/Gen1 account, directly.
Note: The --jars, --py-files, --files arguments support DBFS and S3 paths.
Typically, the Jar libraries are stored under dbfs:/FileStore/jars.
You need to upload the libraries to DBFS and pass them as parameters in the Jar activity.
For more details, refer to "Transform data by running a Jar activity in Azure Databricks using ADF".
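To make this concrete, here is a minimal sketch of what the ADF Databricks Jar activity looks like once the jar has been uploaded to DBFS. The class name `com.example.Main`, the jar path `dbfs:/FileStore/jars/app.jar`, and the linked service name are placeholders you would replace with your own:

```json
{
  "name": "RunJarActivity",
  "type": "DatabricksSparkJar",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "mainClassName": "com.example.Main",
    "parameters": [ "arg1" ],
    "libraries": [
      { "jar": "dbfs:/FileStore/jars/app.jar" }
    ]
  }
}
```

Note that the `libraries` entry points at the DBFS copy of the jar, not at the ADLS Gen2 location.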
Is there a way to do a spark-submit from a Databricks notebook?
To answer the second question: you cannot run spark-submit from within a notebook, but you can create a job that uses the Spark Submit task type.
Reference: "SparkSubmit" and "Create a job".
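As a sketch, a job with a Spark Submit task can be defined with a JSON spec like the one below (for example, via the Jobs API). The cluster settings, class name, and jar path are placeholder values, not something from your workspace:

```json
{
  "name": "spark-submit-job",
  "new_cluster": {
    "spark_version": "7.3.x-scala2.12",
    "node_type_id": "Standard_DS3_v2",
    "num_workers": 2
  },
  "spark_submit_task": {
    "parameters": [
      "--class", "com.example.Main",
      "dbfs:/FileStore/jars/app.jar",
      "arg1"
    ]
  }
}
```

As with the ADF activity, the jar path in `parameters` must be a DBFS path, since spark-submit arguments support DBFS and S3 paths only.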
Hope this helps.
If this answers your query, please click "Mark as Answer" and "Up-Vote". And if you have any further questions, do let us know.