I am running Spark v1.0.1 with built-in Hive (Spark built with SPARK_HIVE=true sbt/sbt assembly/assembly).
I also configured Hive to store its metastore in a PostgreSQL database, following these instructions:
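For reference, a minimal hive-site.xml for a PostgreSQL-backed metastore looks roughly like this (the host, database name, and credentials below are placeholders, not my actual values):

```xml
<configuration>
  <!-- JDBC connection to the PostgreSQL metastore database (placeholder host/db) -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:postgresql://localhost:5432/metastore</value>
  </property>
  <!-- Use the PostgreSQL JDBC driver -->
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>org.postgresql.Driver</value>
  </property>
  <!-- Placeholder credentials -->
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>password</value>
  </property>
</configuration>
```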
I could configure a standalone Hive (not the one built into Spark) to use PostgreSQL, but I don't know how to get it working with the Hive that is built into Spark.
According to the instructions, I need to put (or symlink) postgresql-jdbc.jar into hive/lib so that Hive picks up the PostgreSQL JDBC driver when it runs:
$ sudo yum install postgresql-jdbc
$ ln -s /usr/share/java/postgresql-jdbc.jar /usr/lib/hive/lib/postgresql-jdbc.jar
With the Hive that is built into Spark, where should I put postgresql-jdbc.jar to get it to work?
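For what it's worth, my best guess so far (I'm not sure this is the right approach, which is why I'm asking) is to copy hive-site.xml into Spark's conf/ directory and add the driver jar to Spark's classpath at launch via SPARK_CLASSPATH:

```shell
# Guess: make the PostgreSQL JDBC driver visible to Spark's built-in Hive
# by copying the Hive config into Spark's conf/ and extending the classpath.
cp /etc/hive/conf/hive-site.xml $SPARK_HOME/conf/
export SPARK_CLASSPATH=/usr/share/java/postgresql-jdbc.jar
$SPARK_HOME/bin/spark-shell
```

Is this the intended way, or does the jar need to go somewhere inside the Spark assembly itself?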