When I submit my Spark job on YARN, the ApplicationMaster log shows:
ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: "DB_X"."table_Y"
Spark session:
SparkSession spark = SparkSession
.builder()
.appName(appName)
.config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
.enableHiveSupport()
.getOrCreate();
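To confirm that .enableHiveSupport() actually took effect in this session, one check I know of is to read the catalog implementation back from the SparkConf. A minimal sketch (the "in-memory" string below is only the default I pass for the case where the key is not set):
// Sketch: if this prints "in-memory" instead of "hive", the session was built
// without Hive support (e.g. hive-site.xml or the Hive classes were not visible
// when getOrCreate() ran) and the job will only ever see the "default" database.
String catalogImpl = spark.sparkContext().getConf()
        .get("spark.sql.catalogImplementation", "in-memory");
System.out.println("catalogImplementation = " + catalogImpl);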
Hive warehouse directory in hive-site.xml: /apps/hive/warehouse/
hadoop fs -ls /apps/hive/warehouse/
drwxrwxrwx - root hadoop 0 2018-09-03 11:22 /apps/hive/warehouse/DB_X.db
hadoop fs -ls /apps/hive/warehouse/DB_X.db
(no output: the database directory is empty)
The error is thrown here:
spark
.read()
.table("DB_X.table_Y");
In Java:
spark.sql("show databases").show()
default
In the interactive spark-shell:
spark.sql("show databases").show()
default
DB_X
Output of show create table table_Y:
CREATE EXTERNAL TABLE `table_Y`(
...
PARTITIONED BY (
`partition` string COMMENT '')
...
location '/data/kafka-connect/topics/table_Y'
Files on HDFS at the table location:
hadoop fs -ls /data/kafka-connect/topics/table_Y
drwxr-xr-x - kafka hdfs 0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0
drwxr-xr-x - kafka hdfs 0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=1
hadoop fs -ls /data/kafka-connect/topics/table_Y/partition=0
-rw-r--r-- 3 kafka hdfs 102388 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001823382+0001824381.avro
-rw-r--r-- 3 kafka hdfs 102147 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001824382+0001825381.avro
...
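As a side check rather than a fix, the Avro files can also be read by path, bypassing the metastore entirely. This is a sketch that assumes the external spark-avro package (com.databricks:spark-avro) is on the classpath, since Spark 2.2 has no built-in Avro source:
// Sketch: load the Kafka Connect output directly from its HDFS location;
// the partition=0 / partition=1 directories should be discovered as a "partition" column.
spark.read()
    .format("com.databricks.spark.avro")
    .load("/data/kafka-connect/topics/table_Y")
    .show(10);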
Everything works fine in the spark-shell and the hive shell.
The hive-site.xml from the Hive conf directory is copied into spark2/conf.
I am using HDP 2.6.4.0-91 with Spark 2.2.
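Since the error comes from the ApplicationMaster, the job presumably runs in YARN cluster mode, where the driver starts on the cluster and may not pick up the local spark2/conf. A submit command along these lines would ship hive-site.xml to the driver explicitly; the conf path, class name, and jar below are placeholders, not my actual values:
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /usr/hdp/current/spark2-client/conf/hive-site.xml \
  --class com.example.MyApp \
  my-app.jar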
Any help?
spark.sql("select * from DB_X.table_Y limit 10")
– serge_kspark.sql("show tables in DB_X")
– serge_k