1 vote

Hey, I want to use Spark in my Java project.

I have already added this dependency to my pom file:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.4.0</version>
</dependency>

I tried this code:

import org.apache.spark.api.java.JavaSparkContext;

public class sparkSQL {
    public void query() {
        JavaSparkContext sc = new JavaSparkContext();
    }
}

I called this function in my main, but I got this error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Cloneable
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:111)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:56)
    at realtimequeries.sparkSQL.query(sparkSQL.java:7)
    at main.main(main.java:25)
Caused by: java.lang.ClassNotFoundException: scala.Cloneable
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 16 more

I don't understand why I get this error, because JavaSparkContext was created precisely for this use case:

A Java-friendly version of SparkContext that returns JavaRDDs and works with Java collections instead of Scala ones.

I have already taken a look at the pom of the spark-core_2.11 dependency, and it seems that it declares a Scala dependency:

http://central.maven.org/maven2/org/apache/spark/spark-hive_2.10/1.4.0/spark-hive_2.10-1.4.0.pom

Did I miss something? What am I doing wrong? Thanks in advance.

Comments:

I have the exact same problem. Could you provide some info on how you managed to overcome it? I'm new to the Spark-Java concept, so I would like to know where the pom files are, etc., in full detail. – Jack

I'm not working on this project anymore, so I can't help you, since I chose another approach, but you can look at the answer from @Atul Soman. – Spierki

3 Answers

4 votes

The class scala.Cloneable lives in scala-library*.jar. The error went away for me after adding scala-library to pom.xml:

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.1</version>
</dependency>
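With the Scala library on the classpath, the class loads, but a no-argument JavaSparkContext still needs a master URL and app name from somewhere. A minimal construction sketch (my addition, not from the question; the "local[*]" master and the class name are assumptions for a local test run):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkSqlExample {
    public static void main(String[] args) {
        // An app name and master are normally required;
        // "local[*]" runs Spark in-process on all available cores.
        SparkConf conf = new SparkConf()
                .setAppName("sparkSQL")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        try {
            // ... build JavaRDDs here ...
        } finally {
            sc.close(); // release the context's resources
        }
    }
}
```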
1 vote

Do not mix Scala versions such as 2.11 and 2.12 across dependencies; make sure all your libraries use the same Scala version.

For example, spark-core_2.11 is built against Scala 2.11, so the following would not work:

// would not work: delta-core uses the 2.12 suffix while the Spark artifacts use 2.11
compile group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.4.4'
compile group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.4.4'
compile group: 'io.delta', name: 'delta-core_2.12', version: '0.4.0'

// this would work; note the change: 2.11 -> 2.12
compile group: 'org.apache.spark', name: 'spark-core_2.12', version: '2.4.4'
compile group: 'org.apache.spark', name: 'spark-sql_2.12', version: '2.4.4'
compile group: 'io.delta', name: 'delta-core_2.12', version: '0.4.0'
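The same alignment rule applies to a Maven build like the asker's. A sketch of the working combination as pom dependencies (versions carried over from the Gradle example above):

```xml
<!-- All artifact suffixes agree on Scala 2.12 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>2.4.4</version>
</dependency>
<dependency>
    <groupId>io.delta</groupId>
    <artifactId>delta-core_2.12</artifactId>
    <version>0.4.0</version>
</dependency>
```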

-1 vote

You can use JavaSparkContext to work with Spark from Java, but you still need Scala, since Spark is written in Scala. Most operations are internally translated to Scala, or work internally with Scala classes. You can write everything in Java, but you will still need Scala on your classpath.

So, in order to fix your error, you need to install Scala and make SCALA_HOME point to the directory where you installed it.