
I am trying to run a Java/Scala Maven project from IntelliJ (Community 2019.2) on a MacBook Pro.

My Java version:

  Java 12.0.2 2019-07-16
  Java(TM) SE Runtime Environment (build 12.0.2+10)
  Java HotSpot(TM) 64-Bit Server VM (build 12.0.2+10, mixed mode, sharing)

My code:

  SparkConf conf = new SparkConf().setAppName("test").setMaster("local"); // error!
  JavaSparkContext sc = new JavaSparkContext(conf);

My pom:

  <properties>
    <scala.version>2.11.12</scala.version>
    <scala.binary.version>2.11.12</scala.binary.version>
    <java.version>12</java.version>
    <maven.javadoc.skip>true</maven.javadoc.skip>
  </properties>

  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-xml</artifactId>
    <version>2.11.0-M4</version>
  </dependency>

  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.5</version>
    <scope>compile</scope>
  </dependency>

  <dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_2.11</artifactId>
    <version>2.4.5</version>
  </dependency>

  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.0</version>
    <scope>compile</scope>
  </dependency>

My Scala and Spark dependencies are both 2.11, so why do I still get this error:

 Exception in thread "main" java.lang.NoSuchMethodError: 
 scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

Based on

 https://stackguides.com/questions/43845831/nosuchmethoderror-scala-predef-conformslscala-predeflesscolonless

I should not have this version mismatch, should I?

thanks

UPDATE: I have installed Java 11 following https://medium.com/w-logs/installing-java-11-on-macos-with-homebrew-7f73c1e9fadf

but I get the same error:

 /Library/Java/JavaVirtualMachines/openjdk-11.0.2.jdk/Contents/Home/bin/java -agentlib:jdwp=transport=dt_socket,address=...

 Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

thanks

1 Answer


Java 12 has the same Scala compatibility as Java 13: only Scala 2.13.1 and 2.12.9 support it. If possible, use at most Java 11, which is compatible with Scala 2.13.0, 2.12.4, 2.11.12, and 2.10.7, so you should be fine with your 2.11.12.
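If you have several JDKs installed on macOS, one way to point the command line at JDK 11 is via `/usr/libexec/java_home` (a sketch, assuming a JDK 11 is actually installed; inside IntelliJ you would instead change the Project SDK under File → Project Structure):

```shell
# Sketch: switch the current shell to an installed JDK 11 on macOS.
# Assumes Java 11 is present (e.g. via Homebrew); the -v flag selects by version.
export JAVA_HOME=$(/usr/libexec/java_home -v 11)
java -version
```

Note that IntelliJ does not read your shell's JAVA_HOME, so the Project SDK must be changed separately in the IDE.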

Nonetheless, it seems that Spark 3.0 is going to support JDK 9+ (SPARK-24417).


Also: please change scala.binary.version in your pom.xml; it should be

<scala.binary.version>2.11</scala.binary.version>

instead of

<scala.binary.version>2.11.12</scala.binary.version>

With that change you can reference dependencies without hard-coding the Scala version; that is, instead of

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.5</version>
    <scope>compile</scope>
</dependency>

you can say

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>2.4.5</version>
    <scope>compile</scope>
</dependency>
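The same property applies to the other Scala-suffixed artifact in your pom. For example, jackson-module-scala could be declared like this (a sketch keeping the version number from your question):

```xml
<dependency>
    <groupId>com.fasterxml.jackson.module</groupId>
    <artifactId>jackson-module-scala_${scala.binary.version}</artifactId>
    <version>2.4.5</version>
</dependency>
```

This way a future Scala upgrade only requires changing the property in one place.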