I have a Spark DataFrame from which I need to read the column names, data types, and precision values (if any).
I can read the name and type as follows:
for f in df.schema.fields:
    name = f.name                    # column name
    dtype = f.dataType.typeName()    # e.g. "string", "decimal" (renamed from `type` to avoid shadowing the builtin)
The DataFrame schema looks like this:
[StructField(orgid,StringType,true), StructField(customerid,DecimalType(15,5),true), StructField(oppid,IntegerType,true)]
From the schema above I also need to read the decimal precision and scale, i.e. (15, 5). Is there any way to do this?
Thank you for any help.