I have followed this guide: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-Configurationpropertydetails
I have executed the following in the Hive CLI:
set spark.home=/location/to/sparkHome;
set hive.execution.engine=spark;
set spark.master=<Spark-Master-URL>;
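If it matters, my understanding from the guide is that the same properties can also go in hive-site.xml instead of being set per session (the master URL below is a placeholder for my actual value; the Spark home path is the same one I used above):

<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <name>spark.master</name>
  <value>spark://master-host:7077</value>
</property>
<property>
  <name>spark.home</name>
  <value>/location/to/sparkHome</value>
</property>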
However, on running ./hive, I am getting the following error:
Cannot find hadoop installation: $HADOOP_HOME or $HADOOP_PREFIX must be set or hadoop must be in the path
I do not have Hadoop installed and want to run Hive on top of Spark running in standalone mode. Is it mandatory to have Hadoop set up in order to run Hive on Spark?
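For reference, judging from the error message itself, the hive launcher script seems to only want an environment like the following, which I cannot provide without a Hadoop installation (the path below is a placeholder):

export HADOOP_HOME=/path/to/hadoop      # or HADOOP_PREFIX, per the error message
export PATH=$HADOOP_HOME/bin:$PATH      # or: have the 'hadoop' binary on the PATH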