
Is it possible to explode multiple columns into one new column in spark? I have a dataframe which looks like this:

userId         varA     varB
   1         [0,2,5]   [1,2,9]

desired output:

userId     bothVars
   1         0     
   1         2     
   1         5  
   1         1  
   1         2  
   1         9  

What I have tried so far:

val explodedDf = df.withColumn("bothVars", explode($"varA")).drop("varA")
                   .withColumn("bothVars", explode($"varB")).drop("varB")

which doesn't work. Any suggestions are much appreciated.
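For reference, chaining two explodes cannot produce the desired output: each explode multiplies the existing rows, so the second call yields a cross product of the two arrays rather than appending one after the other (and the second withColumn also overwrites the bothVars column). A plain-Scala sketch of the row counts, using the question's data (no Spark needed):

```scala
// The two arrays from the question's example row.
val varA = Seq(0, 2, 5)
val varB = Seq(1, 2, 9)

// Chained explodes behave like a nested flatMap: every element of
// varA is paired with every element of varB, giving 3 x 3 = 9 rows.
val crossProduct = for (a <- varA; b <- varB) yield (a, b)
assert(crossProduct.size == 9)

// The desired output is the concatenation of the two arrays: 6 rows.
val bothVars = varA ++ varB
assert(bothVars == Seq(0, 2, 5, 1, 2, 9))
```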


2 Answers


You could wrap the two arrays into one and flatten the nested array before exploding it, as shown below:

val df = Seq(
  (1, Seq(0, 2, 5), Seq(1, 2, 9)),
  (2, Seq(1, 3, 4), Seq(2, 3, 8))
).toDF("userId", "varA", "varB")

df.
  select($"userId", explode(flatten(array($"varA", $"varB"))).as("bothVars")).
  show
// +------+--------+
// |userId|bothVars|
// +------+--------+
// |     1|       0|
// |     1|       2|
// |     1|       5|
// |     1|       1|
// |     1|       2|
// |     1|       9|
// |     2|       1|
// |     2|       3|
// |     2|       4|
// |     2|       2|
// |     2|       3|
// |     2|       8|
// +------+--------+

Note that flatten is available in Spark 2.4+.
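On a Spark version before 2.4, a small UDF can perform the same concatenation (a sketch assuming both columns are integer arrays; `concatArrays` is a name chosen here, not a built-in):

```scala
import org.apache.spark.sql.functions.{explode, udf}

// Sketch for Spark < 2.4, where flatten is not available.
// Assumes both columns are arrays of integers; adjust the element
// type to match the actual schema.
val concatArrays = udf((a: Seq[Int], b: Seq[Int]) => a ++ b)

df.select($"userId", explode(concatArrays($"varA", $"varB")).as("bothVars"))
```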


Use array_union and then the explode function. Note that array_union returns the set union of the two arrays, so values appearing in both are emitted only once; in the output below, the duplicate 2 from the question's example is dropped.

scala> df.show(false)
+------+---------+---------+
|userId|varA     |varB     |
+------+---------+---------+
|1     |[0, 2, 5]|[1, 2, 9]|
|2     |[1, 3, 4]|[2, 3, 8]|
+------+---------+---------+
scala> df.select($"userId", explode(array_union($"varA", $"varB")).as("bothVars")).show(false)

+------+--------+
|userId|bothVars|
+------+--------+
|1     |0       |
|1     |2       |
|1     |5       |
|1     |1       |
|1     |9       |
|2     |1       |
|2     |3       |
|2     |4       |
|2     |2       |
|2     |8       |
+------+--------+

array_union is available in Spark 2.4+.
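The difference between this answer and the flatten approach can be illustrated with plain Scala collections (an analogy, not Spark itself):

```scala
// The two arrays from the question's example row.
val varA = Seq(0, 2, 5)
val varB = Seq(1, 2, 9)

// array_union resembles concatenation followed by distinct:
// duplicates across the arrays are dropped.
val unioned = (varA ++ varB).distinct
assert(unioned == Seq(0, 2, 5, 1, 9))

// flatten over array(varA, varB) resembles plain concatenation:
// duplicates are kept, matching the question's desired output.
val flattened = varA ++ varB
assert(flattened == Seq(0, 2, 5, 1, 2, 9))
```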