I want to use a Maven package in a Databricks job that runs on a new automated cluster. Regular interactive clusters offer an option to install a Maven package, and that installation resolves all of the package's dependencies. On an automated cluster, you can only attach downloaded JARs to be installed when the cluster starts.
My problem is that the dependencies of this JAR are then missing. Of course, I could download them and add them to the cluster as well, but the dependency tree seems to be quite large. Can I just download a single JAR with all dependencies included (I did not find one)? Or can I install the package some other way?
The package I need is azure-eventhubs-spark.
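For reference, this is roughly what my current job configuration looks like (a sketch only; the DBFS paths and file names are placeholders, and the listed dependency JARs are just examples of what I would have to track down by hand):

```json
{
  "new_cluster": {
    "spark_version": "...",
    "node_type_id": "...",
    "num_workers": 2
  },
  "libraries": [
    { "jar": "dbfs:/FileStore/jars/azure-eventhubs-spark_2.11-x.y.z.jar" },
    { "jar": "dbfs:/FileStore/jars/azure-eventhubs-x.y.z.jar" }
  ]
}
```

What I would like instead is a way to declare the Maven coordinates (something like `com.microsoft.azure:azure-eventhubs-spark_2.11:x.y.z`) so that transitive dependencies are resolved automatically, as they are on interactive clusters.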