18
votes

I use Spark 2.1.0.

When I run spark-shell, I encounter this error:

<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^

What could be the reason? How to fix it?

7
What do you get when you enter just spark? - Ramesh Maharjan
Are you on Windows? Did you install winutils.exe? Is this your first time executing spark-shell? - Jacek Laskowski
Thanks everyone, I have already solved the problem; it was an error in the installation. - Selena

7 Answers

8
votes

I was facing the same issue. After investigation, I observed that there was a compatibility issue between the Spark version and the winutils.exe of hadoop-2.x.x.

After experimenting, I suggest using the hadoop-2.7.1 winutils.exe with spark-2.2.0-bin-hadoop2.7 and the hadoop-2.6.0 winutils.exe with spark-1.6.0-bin-hadoop2.6, and setting the environment variables below:

SCALA_HOME  : C:\Program Files (x86)\scala2.11.7
JAVA_HOME   : C:\Program Files\Java\jdk1.8.0_51
HADOOP_HOME : C:\Hadoop\winutils-master\hadoop-2.7.1
SPARK_HOME  : C:\Hadoop\spark-2.2.0-bin-hadoop2.7
PATH    : %JAVA_HOME%\bin;%SCALA_HOME%\bin;%HADOOP_HOME%\bin;%SPARK_HOME%\bin;

Create the C:\tmp\hive directory and grant access permissions on it using the command below:

C:\Hadoop\winutils-master\hadoop-2.7.1\bin>winutils.exe chmod -R 777 C:\tmp\hive

Remove the local Derby-based metastore (the metastore_db directory) from your computer if it exists:

C:\Users\<User_Name>\metastore_db

Use the command below to start spark-shell:

C:>spark-shell

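Once the shell starts cleanly, you can confirm that the spark value is actually bound. This is only a quick sanity check (not part of the original steps), and the exact output depends on your installation:

scala> spark.version                 // should print your Spark version, e.g. 2.2.0
scala> import spark.implicits._      // the import from the question should now resolve
scala> spark.range(3).show()         // trivial DataFrame, proves the session works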

2
votes

The reason for the error is that the spark instance (the SparkSession) could not be created, due to an earlier failure (which may have happened because you are on Windows and have not installed the winutils.exe binary, or because another session is still holding the local Derby-based metastore).

The recommendation is to scroll up and review the entire screen of logs, where you will find the root cause.
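If you need a stopgap while you hunt for the root cause in the logs, you can usually construct the session by hand inside the shell. This is only a sketch, assuming a plain local run without Hive support; it does not fix the underlying problem:

import org.apache.spark.sql.SparkSession

// Build a local SparkSession manually; the app name is arbitrary.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("manual-session")
  .getOrCreate()

// The imports from the error message should work again now.
import spark.implicits._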

1
votes

If you are on Cloudera, the solution from this GitHub ticket worked for me (https://github.com/cloudera/clusterdock/issues/30):

The root user (which you are running as when you start spark-shell) has no user directory in HDFS. If you create one (sudo -u hdfs hdfs dfs -mkdir /user/root followed by sudo -u hdfs hdfs dfs -chown root:root /user/root), this should be fixed.

That is, create an HDFS home directory for the user running spark-shell.

1
votes

give "chmod 777" basically permission for this folder to spark access

C:\tmp\hive

Here is the full command:

C:\spark\Hadoop\bin\winutils.exe chmod 777 C:\tmp\hive

http://mytechnologythought.blogspot.com/2017/10/fixed-spark-setup-error-not-found-spark.html

0
votes

For Ubuntu users

I had the exact same error and fixed it in the following way.

If you are running spark-shell from a terminal, close and re-open the terminal, then start spark-shell again.

0
votes

If you are running Cloudera, check in Cloudera Manager and make sure the Hive services are ON. I had the same issue and found that my Hive service was down (Hive Metastore Server, HiveServer2, Hosts).

For Spark, you need to make sure HDFS, YARN, and Hive are ON.

The above error appears if Hive is OFF.
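Once the services are back up and spark-shell starts again, a quick way to check that the shell can reach the Hive metastore (assuming your Spark build has Hive support enabled) is:

scala> spark.sql("SHOW DATABASES").show()   // should list at least the default database
scala> spark.sql("SHOW TABLES").show()      // tables in the current database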

0
votes

I had the same error. In my case, the hard disk was almost full. I deleted some large files from the disk and ran spark-shell again after a reboot. It worked! But I think this is not always the cause.