
The VectorAssembler in Spark ML outputs a column of type Vector (holding DenseVector or SparseVector values), but I need to convert it to Array[Double]. I know there is a built-in vector_to_array function, but I can't figure out how to convert the column to an array; some of the elements are sparse vectors as well.

val assembler = new VectorAssembler()
  .setInputCols(Array("Pclass", "Age", "Fare", "Gender", "Boarded"))
  .setOutputCol("features")
  .setHandleInvalid("skip")   // drop rows with invalid/missing values
val transformedDF = assembler.transform(updatedDF)

This is the code; I need to convert the features column from Vector to Array type.
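For intuition on the sparse case: calling `toArray` on a sparse vector densifies it, filling every unset position with 0.0. A minimal plain-Scala sketch of that expansion, using a hypothetical `SparseVec` stand-in rather than Spark's actual SparseVector class:

```scala
// Hypothetical stand-in for a sparse vector: logical size, plus the
// indices and values of its non-zero entries.
case class SparseVec(size: Int, indices: Array[Int], values: Array[Double]) {
  // Expand to a dense array; positions not listed in `indices` stay 0.0
  def toArray: Array[Double] = {
    val dense = Array.fill(size)(0.0)
    indices.zip(values).foreach { case (i, v) => dense(i) = v }
    dense
  }
}

val sv = SparseVec(5, Array(1, 3), Array(2.0, 4.5))
println(sv.toArray.mkString(","))   // 0.0,2.0,0.0,4.5,0.0
```

Spark's real vectors behave the same way, so a Vector-to-Array conversion works uniformly for dense and sparse inputs.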


1 Answer


Use an alias for Vector (depending on whether you use the ml or the mllib library) so it isn't confused with Scala's own Vector:

import org.apache.spark.ml.linalg.{Vector => MLVector}
import org.apache.spark.mllib.linalg.{Vector => MLlibVector}

Then create a UDF:

import org.apache.spark.sql.functions.{col, udf}

// 1. Function mapping either vector type to Array[Double]
//    (toArray densifies sparse vectors, filling unset positions with 0.0)
val vec2Array: Any => Array[Double] = {
  case vec: MLVector    => vec.toArray
  case vec: MLlibVector => vec.toArray
}
// 2. Wrap it in a User Defined Function (UDF)
val vec2ArrayUDF = udf(vec2Array)
// 3. Apply the UDF to the "features" column
val df_vec2Array = transformedDF.withColumn("features_array", vec2ArrayUDF(col("features")))
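To see why matching on `Any` still yields a single `Array[Double]` for both vector flavors, here is a self-contained plain-Scala sketch with hypothetical stand-in types (`MLVec`, `MLlibVec`) in place of Spark's real vector classes:

```scala
// Hypothetical stand-ins for the two Spark vector types (illustration only)
case class MLVec(values: Array[Double])    { def toArray: Array[Double] = values }
case class MLlibVec(values: Array[Double]) { def toArray: Array[Double] = values }

// Same shape as vec2Array above: match on the runtime type, delegate to toArray
val toDoubles: Any => Array[Double] = {
  case v: MLVec    => v.toArray
  case v: MLlibVec => v.toArray
  case other       => sys.error(s"Unexpected type: ${other.getClass}")
}

println(toDoubles(MLVec(Array(1.0, 2.0))).mkString(","))   // 1.0,2.0
println(toDoubles(MLlibVec(Array(3.0))).mkString(","))     // 3.0
```

The catch-all `case other` is optional but makes type mismatches fail with a clear message instead of a bare MatchError; in the Spark version the same would happen if a non-vector column were passed to the UDF.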