
I have installed Apache Spark on a single node. When I run spark-shell I get the exception below. I can still create RDDs and run Scala code snippets despite the exception.

This is the exception:

16/02/15 14:21:29 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/02/15 14:21:31 WARN : Your hostname, Rahul-PC resolves to a loopback/non-reachable address: fe80:0:0:0:c0c1:cd2e:990d:17ac%e
java.lang.RuntimeException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:162)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:160)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:167)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
        at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
        at $iwC$$iwC.<init>(<console>:9)
        at $iwC.<init>(<console>:18)
        at <init>(<console>:20)
        at .<init>(<console>:24)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

My JAVA_HOME is set to point to the right jdk installation folder.

JAVA_HOME = C:\Program Files\Java\jdk1.8.0

Is there anything else I need to do? Please advise.

Comments:
Seems similar to this question: stackoverflow.com/q/32721647/1395437 (Daniel Zolnai)
When are you getting this? At the start of spark-shell, or while executing some command? (vishnu viswanath)

1 Answer


I found the solution. On Windows, Spark needs winutils.exe in order to initialize the Hive context. The C:\Windows\tmp folder that is created when running the spark shell needs to have sufficient permissions as well.

http://blogs.msdn.com/b/arsen/archive/2016/02/09/resolving-spark-1-6-0-quot-java-lang-nullpointerexception-not-found-value-sqlcontext-quot-error-when-running-spark-shell-on-windows-10-64-bit.aspx
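Concretely, the fix described above comes down to a few setup steps. This is a minimal sketch; the install path C:\hadoop is an assumption, and the winutils.exe build must match your Hadoop version:

```shell
REM Place a winutils.exe matching your Hadoop build in a bin folder,
REM e.g. C:\hadoop\bin\winutils.exe  (path is an assumption, adjust to your setup)

REM Point HADOOP_HOME at the folder that contains bin\winutils.exe
set HADOOP_HOME=C:\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin

REM Grant full permissions on the temp folder the Hive context uses
REM (the answer above refers to C:\Windows\tmp; on many setups it is C:\tmp\hive)
winutils.exe chmod -R 777 C:\Windows\tmp

REM Restart spark-shell; the HiveContext/sqlContext should now initialize cleanly
spark-shell
```

Setting HADOOP_HOME in the System Environment Variables dialog instead of per-session `set` makes the fix permanent.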