I am trying to move from Spark 1.6 to 2.0, and I get this error during compilation on 2.0 only:
def getSubGroupCount(df: DataFrame, colNames: String): Array[Seq[Any]] = {
  val columns: Array[String] = colNames.split(',')
  // For each column, collect its distinct values into a Seq
  val subGroupCount: Array[Seq[Any]] = columns.map(c => df.select(c).distinct.map(x => x.get(0)).collect.toSeq)
  subGroupCount
}
Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
val subGroupCount: Array[Seq[Any]] = columns.map(c => df.select(c).distinct.map(x => x.get(0)).collect.toSeq)
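For context: in Spark 2.0 a DataFrame is a Dataset[Row], so .map now returns a Dataset and requires an implicit Encoder for the element type. Row.get(0) returns Any, and there is no encoder for Any, hence the compile error that did not exist on 1.6 (where .map produced an RDD). One workaround is to collect the Rows first and map the resulting plain Scala Array on the driver, which needs no encoder at all. A minimal sketch, assuming a DataFrame is already in scope:

```scala
import org.apache.spark.sql.DataFrame

// Sketch of a workaround: collect() yields an Array[Row], and mapping that
// ordinary Scala array does not go through Dataset.map, so no Encoder is
// needed for the Any values extracted by Row.get(0).
def getSubGroupCount(df: DataFrame, colNames: String): Array[Seq[Any]] = {
  val columns: Array[String] = colNames.split(',')
  columns.map(c => df.select(c).distinct.collect().map(_.get(0)).toSeq)
}
```

Alternatively, dropping to the RDD API (`df.select(c).distinct.rdd.map(_.get(0)).collect.toSeq`) keeps the map distributed, since RDD.map does not require an encoder.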
Regards