I am running Spark remotely from IntelliJ IDEA, but I am having difficulty adding a dependency to the SparkConf.
```scala
val conf = new SparkConf()
  .setMaster("spark://IP:7077")
  .set("packages", "com.databricks:spark-avro_2.10:2.0.1:jar")
  .setAppName("localtrial")
```
Error:

```
16/02/23 12:27:10 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 172.16.248.156): java.lang.ClassNotFoundException: com.databricks.spark.avro.AvroRelation$$anonfun$buildScan$1$$anonfun$3
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
```
I have also tried the `setJars` method of the `SparkConf` class, with no luck. Any help would be appreciated.
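For reference, this is roughly how I tried `setJars` (the jar path below is just an example; on my machine it points at the spark-avro jar downloaded by my build tool):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the setJars attempt. setJars ships the listed jars to the
// executors; the path here is a placeholder for wherever the jar lives locally.
val conf = new SparkConf()
  .setMaster("spark://IP:7077")
  .setAppName("localtrial")
  .setJars(Seq("/path/to/spark-avro_2.10-2.0.1.jar"))

val sc = new SparkContext(conf)
```

The job still fails with the same `ClassNotFoundException` on the workers.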