1
votes

ApplicationMaster: User class threw exception: org.apache.spark.sql.AnalysisException: Table or view not found: "DB_X"."table_Y"

Spark session:

  SparkSession
    .builder()          
    .appName(appName)
    .config("spark.sql.warehouse.dir", "/apps/hive/warehouse")
    .enableHiveSupport()
    .getOrCreate();
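Since the error only appears from the submitted Java app (the ApplicationMaster log above) and not from spark-shell, one thing worth checking is that hive-site.xml actually reaches the driver in cluster mode. A hedged sketch of a submit command; the class name and jar are placeholders, not from the question:

```
# Sketch, assuming yarn cluster mode: ship hive-site.xml explicitly so the
# driver (which runs on an arbitrary YARN node) picks up the metastore config.
# com.example.MyApp and my-app.jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /etc/spark2/conf/hive-site.xml \
  --class com.example.MyApp \
  my-app.jar
```

In client mode the driver runs locally and reads /etc/spark2/conf directly, which would explain why spark-shell works while the submitted job does not.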

Hive warehouse directory in hive-site.xml : /apps/hive/warehouse/

hadoop fs -ls /apps/hive/warehouse/
drwxrwxrwx   - root hadoop          0 2018-09-03 11:22 /apps/hive/warehouse/DB_X.db


hadoop fs -ls /apps/hive/warehouse/DB_X.db
(empty, no output)

The error is thrown here:

spark
   .read()
   .table("DB_X.table_Y");
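Before the failing read, a small diagnostic sketch (not a fix) can confirm which catalog the Java app is actually using; this assumes a live `spark` session, same as the snippet above. If the catalog implementation prints as `in-memory` rather than `hive`, the session never connected to the Hive metastore, which matches seeing only the `default` database:

```java
// Diagnostic sketch: check whether Hive support actually took effect.
// "hive" means the metastore is wired up; "in-memory" means only the
// default database will be visible.
System.out.println(spark.conf().get("spark.sql.catalogImplementation"));
System.out.println(spark.catalog().databaseExists("DB_X"));
```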

In the Java app:

spark.sql("show databases").show()
default

In the interactive spark-shell:

spark.sql("show databases").show()
default
DB_X

show create table table_Y:

CREATE EXTERNAL TABLE `table_Y`(
...
PARTITIONED BY (
  `partition` string COMMENT '')
...
    location '/data/kafka-connect/topics/table_Y'

Hadoop files:

hadoop fs -ls /data/kafka-connect/topics/table_Y
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0
drwxr-xr-x   - kafka hdfs          0 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=1

hadoop fs -ls /data/kafka-connect/topics/table_Y/partition=0
-rw-r--r--   3 kafka hdfs     102388 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001823382+0001824381.avro
-rw-r--r--   3 kafka hdfs     102147 2018-09-11 17:24 /data/kafka-connect/topics/table_Y/partition=0/table_Y+0+0001824382+0001825381.avro
...

Everything works fine in spark-shell and the hive shell.

hive-site.xml from the Hive conf directory is copied into spark2/conf.

Using HDP 2.6.4.0-91 with Spark 2.2.

Any help?

1
Make sure you have copied hive-site.xml into the /etc/spark2/conf directory (`cp /etc/hive/conf/hive-site.xml /etc/spark2/conf`), then restart the spark2 service and try to read the table again. – Shu
Can you select from your table in spark-sql? Or, for instance, in spark-shell: spark.sql("select * from DB_X.table_Y limit 10") – serge_k
Also try spark.sql("show tables in DB_X") – serge_k
@Shu hive-site.xml was already copied into spark2/conf, and it works from spark-shell and the hive shell. – maxime G
@serge_k Yes, everything works fine from spark-shell. – maxime G

1 Answer

0
votes

Relocating the table to use the HDFS HA nameservice name in its LOCATION solved the problem.
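A hedged sketch of what "relocating" could look like, assuming the table's original LOCATION pointed at a single NameNode hostname instead of the HA nameservice; `nameservice1` is a placeholder for the cluster's actual dfs.nameservices value, and repairing partitions afterwards is an assumption, not stated in the answer:

```sql
-- Point the table at the HA nameservice instead of one NameNode host.
-- "nameservice1" is a placeholder for the cluster's dfs.nameservices value.
ALTER TABLE DB_X.table_Y
  SET LOCATION 'hdfs://nameservice1/data/kafka-connect/topics/table_Y';

-- Re-register the partition directories under the new location.
MSCK REPAIR TABLE DB_X.table_Y;
```

This would explain the symptom: tools that resolve the old single-host URI (hive shell, spark-shell with local config) still work, while the submitted Spark job fails to resolve the table.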