I am referring to the following link: Hive Support for Spark
It says:
"Spark SQL supports a different use case than Hive."
I am not sure why that would be the case. Does this mean that, as a Hive user, I cannot use the Spark execution engine through Spark SQL?
Some questions:
- Spark SQL uses the Hive query parser, so ideally it should support all of Hive's functionality. Is that correct?
- Will it use the Hive Metastore? (The sketch after this list assumes it does.)
- Will Hive use the Spark optimizer, or will it build its own?
- Will Hive translate its MapReduce jobs into Spark jobs, or use some other execution paradigm?
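
To make the questions concrete, here is a minimal sketch of what I am hoping is possible, assuming Spark 1.x's `HiveContext` and using `my_table` as a placeholder for an existing table registered in the Hive Metastore:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object SparkSqlOnHiveSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkSqlOnHiveSketch")
    val sc = new SparkContext(conf)

    // HiveContext picks up hive-site.xml from the classpath, so (as I
    // understand it) it should talk to the existing Hive Metastore
    // rather than create a fresh local one.
    val hiveContext = new HiveContext(sc)

    // Plain HiveQL against a table that already exists in the Metastore;
    // my_table is a placeholder name, not a real table.
    val result = hiveContext.sql(
      "SELECT key, COUNT(*) AS cnt FROM my_table GROUP BY key")

    result.collect().foreach(println)

    sc.stop()
  }
}
```

If something like this works as written, it would suggest Spark SQL does reuse the Metastore and the HiveQL parser, which is the heart of what I am asking.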