
I am trying to write to MongoDB from Spark. For trial purposes, I am launching the Spark 2 shell (Spark version 2.1.1.2.6.1.0-129) as shown below:

spark-shell --jars /bigdata/datalake/mongo-spark-connector_2.11-2.1.1.jar,/bigdata/datalake/mongo-scala-driver_2.11-2.1.0.jar,/bigdata/datalake/mongo-java-driver-3.2.2.jar

And running the following code in it:

import com.mongodb.spark._
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions.col  // needed: col is not auto-imported in spark-shell

spark.conf.set("spark.mongodb.output.uri", "mongodb://<IP>:27017/menas.tests")
spark.conf.set("spark.mongodb.output.collection", "tests")

val df = spark.sparkContext.parallelize(1 to 10).toDF().withColumn("value", col("value").cast("string"))
MongoSpark.save(df.write.option("uri", "mongodb://<IP>:27017/menas.tests").mode("append"))

But it results in the following error. Basically, I want to save the contents of the DataFrame to MongoDB.

[screenshot of the error message]

Could you post the error message? – Wan Bachtiar
@WanBachtiar I have edited the post with the error message. Could it be an issue with the version of Spark and the versions of the jars I am using? – Vinitkumar
Your versions should match those provided in the documentation here: docs.mongodb.com/spark-connector/master – Jay Gordon
@JayGordon The version of the MongoDB Spark connector matches the one specified in the link. However, I am not sure about the other two jars. – Vinitkumar

2 Answers

1 vote

spark-shell --jars /bigdata/datalake/mongo-spark-connector_2.11-2.1.1.jar,/bigdata/datalake/mongo-scala-driver_2.11-2.1.0.jar,/bigdata/datalake/mongo-java-driver-3.2.2.jar

Based on the error log and the way spark-shell is invoked, the failure occurs because you're importing and using MongoDB Java driver v3.2.2, while the Spark connector v2.1.1 depends on MongoDB Java driver v3.4.2. See also mongo-spark v2.1.1 Dependencies.scala.
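(If you do want to manage the dependency in a build instead, a minimal sbt sketch with aligned versions could look like the following; the Scala version and the provided spark-sql dependency are illustrative assumptions, not taken from the question.)

// build.sbt: a minimal sketch; mongo-spark-connector pulls in the
// matching mongo-java-driver 3.4.2 transitively, so the Java driver
// does not need to be pinned separately
scalaVersion := "2.11.8"  // assumed; any 2.11.x matches the _2.11 artifacts

libraryDependencies ++= Seq(
  "org.mongodb.spark" %% "mongo-spark-connector" % "2.1.1",
  "org.apache.spark"  %% "spark-sql"             % "2.1.1" % "provided"  // assumed Spark version
)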

Instead of specifying the jars manually, you can use --packages to pull in the MongoDB Spark Connector; its dependencies will then be fetched automatically. For example, to use MongoDB Spark Connector version 2.1.1:

./bin/spark-shell --packages org.mongodb.spark:mongo-spark-connector_2.11:2.1.1

This will automatically fetch a MongoDB Java driver compatible with the connector.

You should see output similar to the following:

:: loading settings :: url = jar:file:/home/ubuntu/spark-2.1.2-bin-hadoop2.6/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.mongodb.spark#mongo-spark-connector_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
    confs: [default]
    found org.mongodb.spark#mongo-spark-connector_2.11;2.1.1 in central
    found org.mongodb#mongo-java-driver;3.4.2 in central
downloading https://repo1.maven.org/maven2/org/mongodb/spark/mongo-spark-connector_2.11/2.1.1/mongo-spark-connector_2.11-2.1.1.jar ...
    [SUCCESSFUL ] org.mongodb.spark#mongo-spark-connector_2.11;2.1.1!mongo-spark-connector_2.11.jar (1291ms)
downloading https://repo1.maven.org/maven2/org/mongodb/mongo-java-driver/3.4.2/mongo-java-driver-3.4.2.jar ...
    [SUCCESSFUL ] org.mongodb#mongo-java-driver;3.4.2!mongo-java-driver.jar (612ms)
:: resolution report :: resolve 4336ms :: artifacts dl 1919ms
    :: modules in use:
    org.mongodb#mongo-java-driver;3.4.2 from central in [default]
    org.mongodb.spark#mongo-spark-connector_2.11;2.1.1 from central in [default]
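
With the connector fetched this way, the write from the question should succeed once the missing functions import is added. A minimal sketch for the shell (the <IP> placeholder still needs to be filled in, as in the question):

import com.mongodb.spark._
import org.apache.spark.sql.functions.col  // col is not auto-imported in spark-shell

// build a small test DataFrame; toDF comes from spark.implicits,
// which spark-shell imports automatically
val df = spark.sparkContext.parallelize(1 to 10).toDF()
  .withColumn("value", col("value").cast("string"))

// append to the menas.tests collection; the uri carries database and collection
MongoSpark.save(df.write.option("uri", "mongodb://<IP>:27017/menas.tests").mode("append"))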

For more information, see also the MongoDB Spark Connector Scala Guide.

0 votes

Add import org.bson.Document; if that does not work, post your Maven or sbt dependency definitions.
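
For example, a minimal sketch of an RDD-based save, which is where org.bson.Document typically comes into play (assuming spark.mongodb.output.uri has been set, e.g. via --conf at shell startup):

import org.bson.Document
import com.mongodb.spark._

// parse a few JSON strings into BSON Documents and save the RDD;
// MongoSpark.save reads the output uri from the Spark configuration
val docs = spark.sparkContext.parallelize(
  (1 to 10).map(i => Document.parse(s"""{"value": "$i"}"""))
)
MongoSpark.save(docs)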