I am trying to include the spark-avro package while starting spark-shell, as per the instructions mentioned here: https://github.com/databricks/spark-avro#with-spark-shell-or-spark-submit.
spark-shell --packages com.databricks:spark-avro_2.10:2.0.1
My intent is to convert the Avro schema to a Spark schema type, using the SchemaConverters object present in the package.
import com.databricks.spark.avro._ ... // colListDel is a list of fields from the .avsc which are to be deleted for some functional reason.
for( field <- colListDel){
println(SchemaConverters.toSqlType(field.schema()).dataType);
}
...
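For context, colListDel is built roughly as in the sketch below; the .avsc path and the field names are placeholders, not my real ones, and the loop above then runs over it:
import java.io.File
import scala.collection.JavaConverters._
import org.apache.avro.Schema
import com.databricks.spark.avro._

// Placeholder .avsc path and field names, for illustration only
val avroSchema = new Schema.Parser().parse(new File("/path/to/input.avsc"))
val fieldsToDrop = Set("colA", "colB")

// colListDel: the Avro fields which are to be deleted, picked out of the parsed schema
val colListDel = avroSchema.getFields.asScala.filter(f => fieldsToDrop.contains(f.name))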
On execution of the above for loop, I get the below error:
<console>:47: error: object SchemaConverters in package avro cannot be accessed in package com.databricks.spark.avro
println(SchemaConverters.toSqlType(field.schema()).dataType);
Please suggest if there is anything I am missing, or let me know how to include SchemaConverters in my Scala code.
Below are my environment details: Spark version 1.6.0, Cloudera VM 5.7.
Thanks!
I also tried:
val sField = new StructField(f.name, SchemaConverters.toSqlType(f.schema()).dataType, false)
and I found the below error: Spark symbol SchemaConverters is not accessible from this place
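The end goal would be something like the sketch below, assembling the kept fields into a Spark StructType (placeholder names as in the earlier sketch; this does not compile as-is because of the same accessibility error):
import java.io.File
import scala.collection.JavaConverters._
import org.apache.avro.Schema
import org.apache.spark.sql.types.{StructField, StructType}
import com.databricks.spark.avro.SchemaConverters

// Placeholder inputs, as in the sketch above
val avroSchema = new Schema.Parser().parse(new File("/path/to/input.avsc"))
val fieldsToDrop = Set("colA", "colB")

// Keep only the fields that are not being deleted and map each one to a StructField
val keptFields = avroSchema.getFields.asScala.filterNot(f => fieldsToDrop.contains(f.name))
val sparkSchema = StructType(keptFields.map { f =>
  StructField(f.name, SchemaConverters.toSqlType(f.schema()).dataType, nullable = false)
})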
Did you find a solution for older versions? I am limited to version 1.4.1 in my workplace. – alsolh