
I have followed this guide: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-Configurationpropertydetails

Have executed:

set spark.home=/location/to/sparkHome;

set hive.execution.engine=spark;

set spark.master=Spark-Master-URL;
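As an aside, the same three properties can also be set persistently in hive-site.xml instead of per-session `set` commands. A sketch assuming the standard `$HIVE_HOME/conf` layout; the spark.home path is copied from the commands above and the master URL is a placeholder:

```xml
<!-- $HIVE_HOME/conf/hive-site.xml (placeholder values) -->
<configuration>
  <property>
    <name>hive.execution.engine</name>
    <value>spark</value>
  </property>
  <property>
    <name>spark.home</name>
    <value>/location/to/sparkHome</value>
  </property>
  <property>
    <name>spark.master</name>
    <!-- replace with your actual standalone master URL -->
    <value>spark://master-host:7077</value>
  </property>
</configuration>
```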

However, on running ./hive I am getting the following error:

Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path

I do not have Hadoop installed and want to run Hive on top of Spark running in standalone mode. Is it mandatory to have Hadoop set up in order to run Hive over Spark?


1 Answer


IMHO, Hive cannot run without Hadoop. There are VMs that come with everything pre-installed. Hive runs on top of Hadoop, so you first need to install Hadoop and then you can try Hive.
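The error in the question comes from Hive's launcher script checking for a Hadoop installation. A minimal sketch of satisfying that check, assuming you have unpacked a Hadoop distribution to a hypothetical /opt/hadoop:

```shell
# /opt/hadoop is a hypothetical install location -- adjust to wherever
# you unpacked the Hadoop distribution.
export HADOOP_HOME=/opt/hadoop

# Put the hadoop binary on the PATH so that bin/hive's check
# ("$HADOOP_HOME ... or hadoop must be in the path") can succeed.
export PATH="$HADOOP_HOME/bin:$PATH"
```

Putting these exports in your shell profile (or in hive-env.sh) makes them apply to every Hive session, not just the current one.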

Please refer this https://stackoverflow.com/a/21339399/5756149.

Anyone, please correct me if I am wrong.