
As rightly pointed out here: Spark SQL query execution on Hive

When Spark SQL runs a query through HiveContext, the query is executed by the Spark engine.

How does Spark SQL setting hive.execution.engine=spark tell Hive to do so?

Note that this works automatically; we do not have to specify it in hive-site.xml in Spark's conf directory.
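For context, here is a minimal sketch of the setup the question describes, assuming Spark 1.x and a hypothetical Hive table named some_table; note that no hive.execution.engine setting appears anywhere in the Spark configuration:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveContextQuery {
  def main(args: Array[String]): Unit = {
    // Standard Spark 1.x setup; hive.execution.engine is never set here.
    val sc = new SparkContext(new SparkConf().setAppName("HiveContextQuery"))

    // HiveContext reads table metadata from the Hive metastore, but the
    // query itself is planned and executed by Spark.
    val hiveContext = new HiveContext(sc)

    // "some_table" is a hypothetical Hive table name used for illustration.
    val result = hiveContext.sql("SELECT COUNT(*) FROM some_table")
    result.show()

    sc.stop()
  }
}
```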


1 Answer


There are two independent projects here:

  1. Hive on Spark - a Hive project that integrates Spark as an additional execution engine.
  2. Spark SQL - a Spark module that makes use of the Hive code.

HiveContext belongs to the second (Spark SQL), while hive.execution.engine is a property of the first (Hive on Spark). Spark SQL does not set this property; queries submitted through HiveContext never reach Hive's execution layer, so they run on Spark's own engine regardless of which engine Hive itself is configured to use.
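One way to see the separation, again assuming Spark 1.x and a hypothetical table named some_table: even if the Hive property is set in the session, explain() still prints a Spark physical plan rather than a Hive job.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object EnginePropertyCheck {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("EnginePropertyCheck"))
    val hiveContext = new HiveContext(sc)

    // This SET only changes a session config value; Spark SQL's planner
    // does not consult hive.execution.engine when building the plan.
    hiveContext.sql("SET hive.execution.engine=mr")

    // explain() prints a Spark physical plan (e.g. a HiveTableScan feeding
    // Spark aggregation operators), not a Hive MapReduce or Hive-on-Spark job.
    hiveContext.sql("SELECT COUNT(*) FROM some_table").explain()

    sc.stop()
  }
}
```

The property only matters when the query is submitted to Hive itself, for example through the Hive CLI or HiveServer2, which is the Hive on Spark scenario.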