I have an application: a REST server built on Netty that embeds Spark SQL and a HiveContext to serve analytical queries. Everything works fine when I run the service from IntelliJ. But when I build an uber jar containing the whole thing, I can't get it to run: Hive fails to instantiate its MetaStoreClient. After some digging, it seems Hive can't resolve the DataNucleus dependencies. I run my application as
java -jar app.jar
I have tried adding the DataNucleus jars to the classpath with java -cp ..., with no luck. The Spark docs recommend passing them via the --jars flag, but that applies to spark-submit, which I'm not using here.
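Concretely, these are the kinds of invocations I tried. The paths, the main class name (com.example.Main), and the DataNucleus version numbers are illustrative; the actual jars are the ones shipped under my Spark distribution's lib directory:

# plain uber jar: fails when the HiveContext is created
java -jar app.jar

# -cp is ignored when -jar is given, so I also spelled out the
# classpath and the main class explicitly:
java -cp "app.jar:$SPARK_HOME/lib/datanucleus-core-3.2.10.jar:$SPARK_HOME/lib/datanucleus-api-jdo-3.2.6.jar:$SPARK_HOME/lib/datanucleus-rdbms-3.2.9.jar" com.example.Main

# --jars is a spark-submit flag, and I launch with plain java,
# so I could not apply that suggestion directly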
Any help is very much appreciated. Thanks.
Edit: To answer the question below: yes, I am initializing Spark in local mode for now, with master = local[*]. There is a hive-site.xml in $SPARK_HOME/conf/. When the service runs from IntelliJ it works fine: Hive creates a local metastore in the project directory and writes its log to derby.log. The issue seems to occur when the web server is started from the shaded jar, at the point where the SparkContext and HiveContext are instantiated.
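For completeness, here is roughly how the contexts are created at server startup (a minimal Scala sketch; the app name is illustrative, not my actual code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// executed once when the Netty server boots
val conf = new SparkConf()
  .setAppName("rest-analytics") // illustrative name
  .setMaster("local[*]")
val sc = new SparkContext(conf)

// this is the call that fails from the shaded jar: Hive cannot
// instantiate its MetaStoreClient because the DataNucleus
// dependencies are not resolved
val hiveContext = new HiveContext(sc)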