I can run spark-sql perfectly with Spark in standalone mode, but in yarn mode Spark tells me it can't find the Hive classes (even basic ones like org/apache/hadoop/hive/ql/plan/TableDesc).
So I added the Hive libs to compute-classpath.sh. That failed. Then I thought: if yarn mode doesn't work but standalone works fine, maybe I should change the yarn classpath to include the Hive libs.
That failed as well.
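For reference, the kind of invocation I have been experimenting with is roughly like the following; this is only a sketch, and the Hive jar paths and application jar name are placeholders, not my real layout:

```
# Run on YARN and ship the Hive jars to both the driver and the executors.
# All jar paths are placeholders; substitute the actual jars from $HIVE_HOME/lib.
spark-submit \
  --master yarn-client \
  --driver-class-path /opt/hive/lib/hive-exec.jar:/opt/hive/lib/hive-metastore.jar \
  --jars /opt/hive/lib/hive-exec.jar,/opt/hive/lib/hive-metastore.jar \
  --class com.example.MyHiveQueryApp \
  my-hive-query-app.jar
```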
What I don't understand is that the Hive libs do appear in my yarn startup log and in the Spark output, so why does my Hive SQL still complain that basic Hive classes cannot be found?
Thanks to everyone for helping me.
(--jars with spark-submit?) - on your SparkContext? (how do you build your SparkConfig?) - Francois G