I have a local Hadoop single-node setup with Hive installed, and some Hive tables stored in HDFS. I then configured Hive to use a MySQL metastore. Now I have installed Spark and I'm running queries over the Hive tables like this (in Scala):
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
val result = hiveContext.sql("SELECT * FROM USERS")
result.show()
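I know I could time the action manually in Scala, something like the rough sketch below (reusing the result DataFrame from above, and timing show() since sql() only builds the plan), but that's not what I'm after:
// Rough sketch: measure the action manually with System.nanoTime
// sql() is lazy; show() is the action that actually executes the query
val start = System.nanoTime()
result.show()
val elapsedMs = (System.nanoTime() - start) / 1e6
println(s"show() took $elapsedMs ms")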
Do you know how to configure Spark so that it shows the execution time of the query itself? By default it isn't shown.