I am trying to write to MongoDB from Spark. For trial purposes, I am launching a Spark 2 shell (Spark version = 2.1.1.2.6.1.0-129) as shown below:
spark-shell --jars /bigdata/datalake/mongo-spark-connector_2.11-2.1.1.jar,/bigdata/datalake/mongo-scala-driver_2.11-2.1.0.jar,/bigdata/datalake/mongo-java-driver-3.2.2.jar
And running the following code in it:
import com.mongodb.spark._
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions.col  // auto-imported in spark-shell; needed when run outside it
spark.conf.set("spark.mongodb.output.uri","mongodb://<IP>:27017/menas.tests")
spark.conf.set("spark.mongodb.output.collection", "tests")
val df = spark.sparkContext.parallelize(1 to 10).toDF().withColumn("value", col("value").cast("string"))
MongoSpark.save(df.write.option("uri", "mongodb://<IP>:27017/menas.tests").mode("append"))
But it results in the following error. Basically, I want to save the contents of the DataFrame to MongoDB.
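For comparison, here is a sketch of the same write expressed through the Spark data source API instead of MongoSpark.save; the format name com.mongodb.spark.sql.DefaultSource is the one shipped with the 2.x connector, and the URI/collection values are placeholders matching the code above:

// Sketch: equivalent write via the DataFrameWriter format API (mongo-spark-connector 2.x)
df.write
  .format("com.mongodb.spark.sql.DefaultSource")
  .mode("append")
  .option("uri", "mongodb://&lt;IP&gt;:27017/menas.tests")
  .option("collection", "tests")
  .save()

Both paths ultimately build a WriteConfig from the options, so "uri" and "collection" set here override the session-level spark.mongodb.output.* settings.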