20
votes

I have this PySpark dataframe

+-----------+--------------------+
|uuid       |test_123            |
+-----------+--------------------+
|          1|[test, test2, test3]|
|          2|[test4, test, test6]|
|          3|[test6, test9, t55o]|
+-----------+--------------------+

and I want to convert the test_123 column to look like this:

+-----------+--------------------+
|uuid       |test_123            |
+-----------+--------------------+
|          1|"test,test2,test3"  |
|          2|"test4,test,test6"  |
|          3|"test6,test9,t55o"  |
+-----------+--------------------+

i.e. converting each value from a list of strings to a single comma-separated string.

How can I do this with PySpark?


3 Answers

13
votes

You can create a udf that joins the array/list and then apply it to the test_123 column:

from pyspark.sql.functions import udf, col

# join the array elements into one comma-separated string
join_udf = udf(lambda x: ",".join(x))
df.withColumn("test_123", join_udf(col("test_123"))).show()

+----+----------------+
|uuid|        test_123|
+----+----------------+
|   1|test,test2,test3|
|   2|test4,test,test6|
|   3|test6,test9,t55o|
+----+----------------+
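One caveat: ",".join(x) raises a TypeError when the array itself is null, which fails the job on such rows. A minimal null-safe sketch, with an explicit StringType return type added:

from pyspark.sql.functions import udf, col
from pyspark.sql.types import StringType

# return None for null arrays instead of raising a TypeError
null_safe_join = udf(lambda x: ",".join(x) if x is not None else None, StringType())
df.withColumn("test_123", null_safe_join(col("test_123"))).show()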

The initial DataFrame is created from:

from pyspark.sql.types import StructType, StructField, IntegerType, ArrayType, StringType

schema = StructType([
    StructField("uuid", IntegerType(), True),
    StructField("test_123", ArrayType(StringType(), True), True)
])
rdd = sc.parallelize([
    [1, ["test", "test2", "test3"]],
    [2, ["test4", "test", "test6"]],
    [3, ["test6", "test9", "t55o"]]
])
df = spark.createDataFrame(rdd, schema)

df.show()
+----+--------------------+
|uuid|            test_123|
+----+--------------------+
|   1|[test, test2, test3]|
|   2|[test4, test, test6]|
|   3|[test6, test9, t55o]|
+----+--------------------+
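Side note: the detour through sc.parallelize isn't required; spark.createDataFrame also accepts a plain Python list of tuples together with the schema, e.g.:

# equivalent construction without building an RDD first
data = [
    (1, ["test", "test2", "test3"]),
    (2, ["test4", "test", "test6"]),
    (3, ["test6", "test9", "t55o"]),
]
df = spark.createDataFrame(data, schema)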
34
votes

While you can use a UserDefinedFunction, it is very inefficient. Instead, it is better to use the built-in concat_ws function:

from pyspark.sql.functions import concat_ws

df.withColumn("test_123", concat_ws(",", "test_123")).show()
+----+----------------+
|uuid|        test_123|
+----+----------------+
|   1|test,test2,test3|
|   2|test4,test,test6|
|   3|test6,test9,t55o|
+----+----------------+
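A handy property of concat_ws is that it skips null elements instead of nulling out the whole result. And if you ever need the reverse transformation, split turns the string back into an array, as in this sketch (assuming the delimiter never occurs inside the values):

from pyspark.sql.functions import concat_ws, split, col

# array -> string -> array round trip
df_str = df.withColumn("test_123", concat_ws(",", "test_123"))
df_arr = df_str.withColumn("test_123", split(col("test_123"), ","))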
0
votes

As of Spark 2.4.0, you can use array_join (see the Spark docs):


from pyspark.sql.functions import array_join

df.withColumn("test_123", array_join("test_123", ",")).show()
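
This produces the same comma-separated strings as the concat_ws answer above. array_join also takes an optional null_replacement argument if you'd rather render null elements explicitly instead of dropping them (a small sketch, assuming Spark >= 2.4.0):

# render null elements as the literal string "NULL" instead of skipping them
df.withColumn("test_123", array_join("test_123", ",", null_replacement="NULL")).show()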