I have a decimal column "TOT_AMT" defined as type "bytes" with logical type "decimal" in my Avro schema.
After creating the DataFrame in Spark using the Databricks spark-avro library, when I try to sum the TOT_AMT column using the sum function, it throws a "function sum requires numeric types, not BinaryType" error.
The column is defined like this in the Avro schema:
name="TOT_AMT","type":["null",{ "type":"bytes","logicaltype":"decimal","precision":20,"scale":10}]
I create the DataFrame and sum the column like this:
val df = sqlContext.read.format("com.databricks.spark.avro").load("input dir")
df.agg(sum("TOT_AMT")).show()
It seems that the decimal values are read as BinaryType when the DataFrame is created. In that case, how can I perform numeric operations on such decimal columns? Would it be possible to convert the byte array to a BigDecimal and then do the calculations?
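For example, I was thinking of decoding the bytes myself with a UDF, along these lines (untested; this assumes the bytes hold the unscaled value as a big-endian two's-complement integer, which is what the Avro spec says for the decimal logical type, and it hard-codes the scale of 10 from my schema):

import java.math.{BigDecimal => JBigDecimal, BigInteger}
import org.apache.spark.sql.functions.{col, sum, udf}

// Decode an Avro decimal: bytes -> unscaled BigInteger -> BigDecimal with the schema's scale
val toDecimal = udf { bytes: Array[Byte] =>
  if (bytes == null) null
  else new JBigDecimal(new BigInteger(bytes), 10) // scale taken from the schema above
}

df.withColumn("TOT_AMT_DEC", toDecimal(col("TOT_AMT")))
  .agg(sum("TOT_AMT_DEC"))
  .show()

Is a UDF like this the right approach, or is there a way to make spark-avro return a DecimalType column directly?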