I'm using the latest HDP Sandbox (2.4.0.0-169). I have written the following code in spark-shell (Spark version 1.6.0):
val orcData = sqlContext.sql("select code from sample_07")      // query the Hive table
val paymentDataCache = orcData.cache()                          // mark the DataFrame for caching
paymentDataCache.registerTempTable("paymentDataCache")          // register it as a temporary table
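As a sanity check, something like the following (just a sketch using the shell's built-in sqlContext) should list the table inside the same spark-shell session, since the temporary table is registered in that SQLContext's catalog:

// List the tables known to this SQLContext; the temporary table should appear here.
sqlContext.tableNames().foreach(println)
// The cached data itself can also be queried from the same session.
sqlContext.sql("select * from paymentDataCache limit 5").show()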
I followed the commands below to start the Thrift server and Beeline:
1) export SPARK_HOME=/usr/hdp/2.4.0.0-169/spark/
2) sudo ./sbin/start-thriftserver.sh --master yarn-client --executor-memory 512m --hiveconf hive.server2.thrift.port=10015
3) ./bin/beeline
4) !connect jdbc:hive2://localhost:10015
Now, if I execute show tables, I expect to see the paymentDataCache temporary table, but it does not show up. Please find the attached screenshot.
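For reference, the equivalent check from code rather than Beeline would look roughly like this (just a sketch; it assumes the Hive JDBC driver is on the classpath, the port 10015 configured above, and placeholder credentials):

import java.sql.DriverManager

// Connect to the Thrift server the same way Beeline does in steps 3-4 above.
Class.forName("org.apache.hive.jdbc.HiveDriver")
val conn = DriverManager.getConnection("jdbc:hive2://localhost:10015", "hive", "")
val stmt = conn.createStatement()

// List whatever tables this connection can see; this is where I expected paymentDataCache.
val rs = stmt.executeQuery("show tables")
while (rs.next()) {
  println(rs.getString(1))
}

rs.close()
stmt.close()
conn.close()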
I also tried starting the Thrift server with
sudo ./sbin/start-thriftserver.sh --master yarn-client --executor-memory 512m --hiveconf hive.server2.thrift.port=10015 --conf spark.sql.hive.thriftServer.singleSession=true
but no luck.
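My understanding (which may well be wrong) is that a temporary table lives only in the SQLContext that registered it, so it would presumably only show up over JDBC if the Thrift server shared that same context, e.g. by starting it from inside the spark-shell session roughly as below (just a sketch, assuming the shell's sqlContext is a HiveContext and the assembly includes the hive-thriftserver module; I have not verified this on the Sandbox):

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// Serve JDBC clients from the same context that holds the temporary table.
val hiveContext = sqlContext.asInstanceOf[HiveContext]
hiveContext.setConf("hive.server2.thrift.port", "10015")
HiveThriftServer2.startWithContext(hiveContext)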
We tried the same process on a 9-node HDP 2.3.2.0-2950 cluster (Spark 1.4.1), but we do not see the temporary tables in Spark Beeline there either.