I have already imported spark.implicits._, but I still get this error:
Error:(27, 33) Unable to find encoder for type stored in a Dataset. Primitive types (Int, String, etc) and Product types (case classes) are supported by importing spark.implicits._ Support for serializing other types will be added in future releases.
I have a case class like:
case class User(name: String, dept: String)
and I am converting the DataFrame to a Dataset using:
val ds = df.map { row => User(row.getString(0), row.getString(1)) }
or
val ds = df.as[User]
Also, when I try the same code in the Spark shell, I get no error; it fails only when I run it through IntelliJ or submit it as a job.
Any reason why?
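For context, here is a minimal self-contained sketch of how I am running this. The SparkSession setup and the inline data are simplified stand-ins for my real job (which reads its input from elsewhere), but the structure is the same:

```scala
import org.apache.spark.sql.SparkSession

// Case class defined at the top level of the file,
// not nested inside a class or method body
case class User(name: String, dept: String)

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DatasetExample")
      .master("local[*]") // illustrative; in a real submit the master comes from the CLI
      .getOrCreate()
    import spark.implicits._

    // Inline data standing in for my real input DataFrame
    val df = Seq(("alice", "eng"), ("bob", "sales")).toDF("name", "dept")

    // Both conversions below hit the encoder error when run from IntelliJ
    val ds1 = df.map { row => User(row.getString(0), row.getString(1)) }
    val ds2 = df.as[User]
    ds2.show()

    spark.stop()
  }
}
```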