
My Spark application depends on adam_2.11-0.20.0.jar, so every time I submit to Spark I have to package my application together with adam_2.11-0.20.0.jar as a fat jar.

For example, my fat jar is myApp1-adam_2.11-0.20.0.jar, and it is fine to submit it as follows:

spark-submit --class com.ano.adam.AnnoSp myApp1-adam_2.11-0.20.0.jar

But submitting with --jars:

spark-submit --class com.ano.adam.AnnoSp myApp1.jar --jars adam_2.11-0.20.0.jar

reports an exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/bdgenomics/adam/rdd

My question is: how can I submit using two separate jars, without packaging them together?

spark-submit --class com.ano.adam.AnnoSp myApp1.jar adam_2.11-0.20.0.jar
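For what it's worth, spark-submit stops parsing its own options at the first non-option argument (the application jar), so anything placed after myApp1.jar is handed to the application as program arguments rather than to spark-submit itself. A sketch of the same submission with --jars moved before the application jar (untested against this exact setup):

```
spark-submit \
  --class com.ano.adam.AnnoSp \
  --jars adam_2.11-0.20.0.jar \
  myApp1.jar
```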

1 Answer


Put all the jars in one folder, then do as below.

Option 1:

I think the better way of doing this is:

$SPARK_HOME/bin/spark-submit \
  --driver-class-path "$(echo /usr/local/share/build/libs/*.jar | tr ' ' ':')" \
  --jars "$(echo /usr/local/share/build/libs/*.jar | tr ' ' ',')" \
  myApp1.jar

With this approach you won't accidentally leave any jar out of the classpath, so no warning should appear. Note that --driver-class-path takes a colon-separated Java classpath, while --jars takes a comma-separated list.
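To see what those shell substitutions expand to, here is a self-contained demo with a throwaway directory (the path and jar names are illustrative, not from the question): the glob expands to space-separated paths, and tr rewrites the separators into the form each flag expects.

```shell
# Create a throwaway directory with two empty "jars" (illustrative only)
mkdir -p /tmp/demo_libs
touch /tmp/demo_libs/a.jar /tmp/demo_libs/b.jar

# --jars wants a comma-separated list; tr turns the glob's spaces into commas
JARS="$(echo /tmp/demo_libs/*.jar | tr ' ' ',')"
echo "$JARS"

# --driver-class-path wants a colon-separated classpath instead
CP="$(echo /tmp/demo_libs/*.jar | tr ' ' ':')"
echo "$CP"
```

One caveat of this trick: it breaks if any jar path contains a space, since tr cannot tell a separator from a space inside a filename.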

Option 2: see my answer to:

spark-submit-jars-arguments-wants-comma-list-how-to-declare-a-directory

Option 3: If you want to submit programmatically, you can add jars through the API; I'm not going into the details of it here.
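For that programmatic route, a minimal sketch using Spark's SparkLauncher API (from the spark-launcher artifact); the local[*] master is an assumption, and the jar names are taken from the question, so adjust paths to your setup:

```scala
import org.apache.spark.launcher.SparkLauncher

object ProgrammaticSubmit {
  def main(args: Array[String]): Unit = {
    val proc = new SparkLauncher()
      .setAppResource("myApp1.jar")         // the application jar
      .setMainClass("com.ano.adam.AnnoSp")  // main class from the question
      .addJar("adam_2.11-0.20.0.jar")       // dependency jar, shipped separately
      .setMaster("local[*]")                // assumption: local master
      .launch()                             // spawns a spark-submit process
    proc.waitFor()                          // block until the application exits
  }
}
```

launch() returns a plain java.lang.Process; for monitoring callbacks you can use startApplication() instead, which returns a SparkAppHandle.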