3
votes

I am new to MongoDB. I am trying to extract data from MongoDB as a Spark DataFrame.

I am using the MongoDB Connector for Spark
link: https://docs.mongodb.com/spark-connector/master/

I am following the steps from this page: https://docs.mongodb.com/spark-connector/master/scala/datasets-and-sql/
The program compiles successfully but fails at runtime with the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/ConnectionString
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
at scala.util.Try$.apply(Try.scala:192)
at com.mongodb.spark.config.MongoCompanionConfig$class.connectionString(MongoCompanionConfig.scala:278)
at com.mongodb.spark.config.ReadConfig$.connectionString(ReadConfig.scala:39)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:51)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:124)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:113)
at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:67)
at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:50)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:307)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
at ScalaDemo.HelloWorld$.main(HelloWorld.scala:25)
at ScalaDemo.HelloWorld.main(HelloWorld.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.ConnectionString
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 18 more


Following is the Maven dependency snippet:

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.11</artifactId>
        <version>2.2.1</version>
    </dependency>
</dependencies>

The code:

package ScalaDemo

import com.mongodb.spark._
import com.mongodb.spark.config._

object HelloWorld {
  def main(args: Array[String]): Unit = {
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder()
      .master("local")
      .appName("MongoSparkConnectorIntro")
      .config("spark.mongodb.input.uri", "mongodb://localhost/admin.partnerCompanies")
      .config("spark.mongodb.output.uri", "mongodb://localhost/admin.partnerCompanies")
      .getOrCreate()

    val df1 = spark.read.format("com.mongodb.spark.sql").load()
    df1.show()
  }
}

Please help.

1 Answer

1
votes

It looks like this is not related to Spark itself. Your exception is

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/ConnectionString

which means the JVM cannot find the class used to connect to MongoDB on the runtime classpath. Try adding the MongoDB Java driver (uber JAR):

<dependencies>
    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.0.4</version>
    </dependency>
</dependencies>
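Alternatively, if you launch the job with spark-submit rather than packaging every dependency into your own JAR, you can let Spark resolve the connector (and its transitive MongoDB Java driver) at submit time with `--packages`. A sketch, using the same connector coordinates as in the question; the application JAR name and path here are hypothetical placeholders for your own build output:

```shell
# Sketch: spark-submit resolves mongo-spark-connector_2.11:2.2.1 from
# Maven Central, including its transitive MongoDB Java driver, so the
# com.mongodb.* classes end up on the runtime classpath.
spark-submit \
  --master local \
  --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.1 \
  --class ScalaDemo.HelloWorld \
  target/scala-demo.jar
```

Either way, the point is the same: the driver classes must be on the classpath at runtime, not just at compile time.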