
What does DataFrame saveAsTable (append mode) do when the target table has a different data type than the current DataFrame schema? For example:

> val df = Seq((1L,1),(2L,1),(3L,1),(456789234L,1)).toDF("i","p")
> df.printSchema
root
 |-- i: long (nullable = false)
 |-- p: integer (nullable = false)
> df.write.mode("append").format("hive").partitionBy("p").saveAsTable("default.ljh_test2")

and the existing Hive table was created by:

CREATE TABLE default.ljh_test2 (i int) PARTITIONED BY (p int) STORED AS ORC;

It seems the DataFrame can write the data to the Hive table, but column i is bigint in the DataFrame and int in Hive. I want to know how the Spark DataFrame does the data type cast, and where I can find the relevant docs.
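For reference, one way to avoid relying on whatever implicit conversion happens inside the write path is to cast the column explicitly before appending. This is only a minimal sketch, assuming a Hive-enabled SparkSession; the session setup below is illustrative and not part of the original snippet:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Illustrative session setup; in spark-shell a Hive-enabled `spark` already exists.
val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()
import spark.implicits._

val df = Seq((1L, 1), (2L, 1), (3L, 1), (456789234L, 1)).toDF("i", "p")

// Explicitly downcast i from bigint to int to match the Hive table schema,
// so any overflow/truncation happens visibly here rather than during the write.
val casted = df.withColumn("i", col("i").cast("int"))

casted.write
  .mode("append")
  .format("hive")
  .partitionBy("p")
  .saveAsTable("default.ljh_test2")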
Thanks.

Spark version 2.3.5 - cceasy

1 Answer