
I am using Spark 2.1.0 with a Java SparkSession to run my Spark SQL. I am trying to save a Dataset&lt;Row&gt; named 'ds' into a Hive table named schema_name.tbl_name using overwrite mode. But when I run the statement below

ds.write().mode(SaveMode.Overwrite)
    .option("header", "true")
    .option("truncate", "true")
    .saveAsTable(ConfigurationUtils.getProperty(ConfigurationUtils.HIVE_TABLE_NAME));

the table gets dropped after the first run. When I rerun it, the table is created and the data is loaded.

Even using the truncate option didn't resolve my issue. Does saveAsTable support truncating the data instead of dropping and recreating the table? If so, what is the correct way to do it in Java?


1 Answer


This is the Apache JIRA reference for my question. It seems to be unresolved as of now.

https://issues.apache.org/jira/browse/SPARK-21036
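
Until that issue is resolved, a commonly used workaround (an assumption on my part, not something confirmed in the JIRA) is to keep the existing table definition and overwrite only its data: either truncate the table with Spark SQL and append, or use insertInto, which writes into the existing table instead of dropping and recreating it. A minimal sketch, assuming the target table schema_name.tbl_name already exists and its column order matches the Dataset:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class OverwriteWithoutDrop {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("overwrite-without-drop")
                .enableHiveSupport()
                .getOrCreate();

        // 'ds' stands in for the Dataset<Row> from the question;
        // schema_name.src_tbl is a hypothetical source used for illustration.
        Dataset<Row> ds = spark.table("schema_name.src_tbl");

        // Option 1: truncate the existing table, then append the new data.
        // The table definition (schema, location, permissions) is preserved.
        spark.sql("TRUNCATE TABLE schema_name.tbl_name");
        ds.write().mode(SaveMode.Append).insertInto("schema_name.tbl_name");

        // Option 2: insertInto with Overwrite replaces the data in place
        // instead of dropping the table. Note that insertInto resolves
        // columns by position, not by name.
        // ds.write().mode(SaveMode.Overwrite).insertInto("schema_name.tbl_name");

        spark.stop();
    }
}

Both variants avoid the drop/recreate cycle of saveAsTable, but they assume the table's schema does not change between runs.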