Both Spark and Hive work fine individually, but when I try to write the output of a Spark DataFrame to a Hive table, I get the following error:

```
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: Hive Schema version 1.2.0 does not match metastore's schema version 2.1.0 Metastore is not upgraded or corrupt
```
The output of `schematool -dbType postgres -info` is attached below as a screenshot.
Additional note: from this Databricks documentation page on Spark, I found out that Apache Spark only supports Hive versions from 0.12 up to 1.2.1.
So is the only way for me to connect to downgrade my Hive version? Or is there some other provision, such as adding extra jars, that would let me write Spark 2.1.0 DataFrames into Hive 2.1.1 tables?
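To make the "additional jars" idea concrete, here is roughly what I had in mind, based on the `spark.sql.hive.metastore.version` and `spark.sql.hive.metastore.jars` properties from the Spark SQL documentation. The paths are placeholders for my actual Hive 2.1.1 installation, and I have not verified that Spark 2.1.0 accepts a 2.x metastore version here; that is essentially what I'm asking:

```shell
# Untested sketch: point Spark's Hive metastore client at the Hive 2.1.1 jars
# instead of the built-in 1.2.x client. Paths are placeholders.
spark-submit \
  --conf spark.sql.hive.metastore.version=2.1.1 \
  --conf spark.sql.hive.metastore.jars=/path/to/hive-2.1.1/lib/*:/path/to/hadoop/client/* \
  my_job.py
```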
Appreciate your opinions on this. Thanks in advance.