3
votes

I have installed a Spark 1.5.2 build with Hive support on a Linux machine. The default path for the Hive metastore warehouse directory is /user/hive/warehouse.

  1. Is this a local path or an HDFS path? I ask because I couldn't find this path on the Linux filesystem.
  2. If it is an HDFS path (most likely), can it be accessed without installing Hadoop alongside the Spark build?

3 Answers

0
votes

You can also put the warehouse on the local filesystem. In your hive-site.xml, set hive.metastore.warehouse.dir to a local URI such as file:///tmp.
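For illustration, a minimal sketch of the corresponding hive-site.xml property (file:///tmp is just the example location from above; any local URI works):

    <property>
      <name>hive.metastore.warehouse.dir</name>
      <value>file:///tmp</value>
    </property>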

-1
votes

Yes, /user/hive/warehouse is an HDFS path. You'll need to install and run the Hadoop services (at least the NameNode, Secondary NameNode, and DataNode) to make HDFS available.
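As a rough sketch, assuming the Hadoop bin/sbin scripts are on your PATH, you can bring HDFS up and look at the path like this:

    # start the HDFS daemons (NameNode, DataNode, Secondary NameNode)
    start-dfs.sh

    # list the warehouse directory on HDFS rather than the local filesystem
    hdfs dfs -ls /user/hive/warehouse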

-1
votes

Type jps and see whether all the services are running. If they are, check whether metastore_db is present in the path /user/hive/warehouse.
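For illustration, that check from a shell might look like the following (assuming the Hadoop binaries are on your PATH):

    # list the running Java daemons; NameNode and DataNode should appear if HDFS is up
    jps

    # then look for metastore_db under the warehouse path
    hdfs dfs -ls /user/hive/warehouse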