0 votes

I have a problem connecting to my PostgreSQL 8.4 database from the Apache Spark service on Bluemix.

My code is:

%AddJar https://jdbc.postgresql.org/download/postgresql-8.4-703.jdbc4.jar -f
val sqlContext = new org.apache.spark.sql.SQLContext(sc)

sqlContext.load("jdbc", Map(
  "url" -> "jdbc:postgresql://<ip_address>:5432/postgres?user=postgres&password=<password>",
  "dbtable" -> "table_name"))

And I get the error:

Name: java.sql.SQLException

Message: No suitable driver found for jdbc:postgresql://:5432/postgres?user=postgres&password=

I've read around, and it seems I need to add the JDBC driver to the Spark classpath. I have no idea how to do this in the Bluemix Apache Spark service.


3 Answers

0 votes

There is currently an issue with adding JDBC drivers to Bluemix Apache Spark. The team is working to resolve it. You can follow the progress here: https://developer.ibm.com/answers/questions/248803/connecting-to-postgresql-db-using-jdbc-from-bluemi.html

0 votes

Possibly have a look here? I believe the load() function is deprecated as of Spark 1.4 [source].

You could try this instead:

val url = "jdbc:postgresql://:5432/postgres"
val prop = new java.util.Properties
prop.setProperty("user","postgres")
prop.setProperty("password","xxxxxx")

val table = sqlContext.read.jdbc(url,"table_name",prop)

The URL may or may not need to be the fully completed version, with the credentials inline, i.e.

jdbc:postgresql://:5432/postgres?user=postgres&password=password
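To make the structure of the read.jdbc approach concrete, here is a minimal sketch of how the URL and Properties fit together. The host db.example.com and the database name are hypothetical stand-ins (the answer above leaves the host blank), and no Spark session or live database is needed to build these two values:

```scala
// Hypothetical connection details -- substitute your own.
val host = "db.example.com"
val port = 5432
val database = "postgres"

// Credentials go into a java.util.Properties object rather than the
// URL query string, so the URL itself stays free of parameters.
val url = s"jdbc:postgresql://$host:$port/$database"
val prop = new java.util.Properties
prop.setProperty("user", "postgres")
prop.setProperty("password", "xxxxxx")

// The actual read (requires a running Spark context and database) would be:
//   val table = sqlContext.read.jdbc(url, "table_name", prop)
println(url)
```

The design point is the separation of concerns: the URL identifies the server and database, while authentication lives in the Properties object passed as the third argument to read.jdbc.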

0 votes

This worked for me on Bluemix:

%AddJar https://jdbc.postgresql.org/download/postgresql-9.4.1208.jar -f

val sqlContext = new org.apache.spark.sql.SQLContext(sc);

val df = sqlContext.read.format("jdbc")
  .options(Map(
    "url" -> "jdbc:postgresql://:/",
    "user" -> "",
    "password" -> "",
    "dbtable" -> "",
    "driver" -> "org.postgresql.Driver"))
  .load()
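The options-map form above is equivalent to the read.jdbc form in the previous answer; the key difference is the explicit "driver" entry, which tells Spark which driver class to load and is what avoids the "No suitable driver found" error from the question. This sketch (hypothetical placeholder values, no live database) just builds the map and checks the entries the JDBC data source relies on:

```scala
// Hypothetical values for illustration; only the Map is exercised here.
val jdbcOptions = Map(
  "url"      -> "jdbc:postgresql://db.example.com:5432/postgres", // hypothetical host
  "user"     -> "postgres",
  "password" -> "xxxxxx",
  "dbtable"  -> "table_name",
  // Naming the driver class explicitly avoids relying on
  // DriverManager auto-discovery, which fails in this environment.
  "driver"   -> "org.postgresql.Driver"
)

// The actual read (requires Spark and a reachable database) would be:
//   val df = sqlContext.read.format("jdbc").options(jdbcOptions).load()
require(Seq("url", "dbtable", "driver").forall(jdbcOptions.contains))
```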