
Hi,

I'm working with the Spark framework in Scala. My DataFrame has a column with the following structure and content:

+---------------------------------------------------------------------------------------------+
|Email_Code                                                                                   |
+---------------------------------------------------------------------------------------------+
|[WrappedArray([3,spain]), WrappedArray([,]), WrappedArray([3,spain])]                        |
|[WrappedArray([3,spain]), WrappedArray([3,spain])]                                           |
+---------------------------------------------------------------------------------------------+

root
 |-- Email_Code: array (nullable = true)
 |    |-- element: array (containsNull = false)
 |    |    |-- element: struct (containsNull = false)
 |    |    |    |-- Code: string (nullable = true)
 |    |    |    |-- Value: string (nullable = true)

I am trying to write a UDF that collects all the values of the "Code" field present in the nested arrays, but I haven't been able to get it working.

I would like an output like the following:

+---------------------------------------------------------------------------------------------+
|Email_Code                                                                                   |
+---------------------------------------------------------------------------------------------+
|[3,,3]                                                                                       |
|[3,3]                                                                                        |
+---------------------------------------------------------------------------------------------+

Any help please?


1 Answer


I managed to fix it:

import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

// Flatten the array of arrays, then extract the "Code" field from each struct
val transformation = udf((data: Seq[Seq[Row]]) =>
  data.flatMap(x => x).map { case Row(code: String, value: String) => code })

df.withColumn("result", transformation($"Email_Code"))
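On Spark 2.4+ the same result can be obtained without a UDF, using the built-in `flatten` function together with the `transform` higher-order function. This is just a sketch, assuming the column is named `Email_Code` as in the question:

import org.apache.spark.sql.functions.expr

// flatten collapses array<array<struct>> into array<struct>;
// transform then maps each struct to its Code field
df.withColumn("result", expr("transform(flatten(Email_Code), x -> x.Code)"))

Avoiding the UDF lets Catalyst optimize the expression and sidesteps Row pattern matching, which fails if Code is ever null rather than an empty string.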