I have a table in Hive whose DDL looks like this:
CREATE TABLE ABC(
  name string)
PARTITIONED BY (
  col1 string,
  col2 bigint,
  col3 string,
  col4 string)
I have a requirement where I need to store the non-partition column names of the Hive table in variable1 and the partition column names in variable2, using Spark Scala.
The desired output would be:
variable1='name'
variable2='col1,col2,col3,col4'
I tried the approach below, but the result mixes both groups together instead of separating them:
val df = sql("desc default.ABC")
val df2 = df.map(r => r.getString(0)).collect.toList
// df2: List[String] = List(name, col1, col2, col3, col4, # Partition Information, # col_name, col1, col2, col3, col4)
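One way to avoid parsing the DESCRIBE output by hand is the catalog API: spark.catalog.listColumns returns each column with an isPartition flag, which can be filtered directly. A minimal sketch, assuming a SparkSession named spark with Hive support enabled and the table default.ABC already existing:

// Assumes a running SparkSession `spark` with Hive support
// and an existing table default.ABC.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .enableHiveSupport()
  .getOrCreate()

// Each org.apache.spark.sql.catalog.Column carries an isPartition flag,
// so no string parsing of "desc" output is needed.
val cols = spark.catalog.listColumns("default.ABC").collect()

val variable1 = cols.filterNot(_.isPartition).map(_.name).mkString(",")
val variable2 = cols.filter(_.isPartition).map(_.name).mkString(",")
// variable1: "name"
// variable2: "col1,col2,col3,col4"

This sketch is based on the standard Spark Catalog API; I have not run it against this exact table, so treat the output comments as the expected shape rather than a verified result.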
Can you please help me with the approach?