I have a PySpark DataFrame with a column of URI query strings (StringType) like this:
+--------------+
| cs_uri_query |
+--------------+
| a=1&b=2&c=3 |
+--------------+
| d&e=&f=4 |
+--------------+
I need to convert this column into an ArrayType of structs with the following structure:
ArrayType(StructType([
    StructField('key', StringType(), nullable=False),
    StructField('value', StringType(), nullable=True)
]))
The expected output column looks like this:
+------------------------------------------------------------+
| cs_uri_query |
+------------------------------------------------------------+
| [{key=a, value=1},{key=b, value=2},{key=c, value=3}] |
+------------------------------------------------------------+
| [{key=d, value=null},{key=e, value=null},{key=f, value=4}] |
+------------------------------------------------------------+
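For reproducibility, the sample DataFrame can be built like this (the SparkSession boilerplate is just for testing):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [("a=1&b=2&c=3",), ("d&e=&f=4",)],
    ["cs_uri_query"],
)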
A UDF is the only way I've found to achieve this. If at all possible, I would like to use pure Spark functions and avoid UDFs: UDFs perform poorly in PySpark because every row has to be serialized out to a Python worker and back, unlike in Scala Spark, where UDFs stay on the JVM.
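After some digging, something along these lines with the higher-order functions transform and filter seems like it might be the direction (so it assumes Spark 2.4+), but I couldn't confirm that its null/empty-value semantics match my UDF in every edge case (e.g. an empty input string):

from pyspark.sql import functions as F

# Split on '&', drop empty fragments, and map each fragment to a struct.
# The CASE expression yields NULL when there is no '=' or the value is empty.
df = df.withColumn(
    "cs_uri_query",
    F.expr("""
        transform(
            filter(split(cs_uri_query, '&'), x -> x != ''),
            x -> named_struct(
                'key',   split(x, '=')[0],
                'value', CASE WHEN instr(x, '=') > 0 AND split(x, '=')[1] != ''
                              THEN split(x, '=')[1] END
            )
        )
    """)
)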
This is my current code using a UDF:
from pyspark.sql.functions import udf
from pyspark.sql.types import ArrayType, StringType, StructField, StructType

def parse_query(query):
    args = None
    if query:
        args = []
        for arg in query.split("&"):
            if arg:
                if "=" in arg:
                    # Split on the first '=' only, so values containing '=' stay intact.
                    k, v = arg.split("=", 1)
                    if k:
                        args.append({"key": k, "value": v if v else None})
                else:
                    args.append({"key": arg, "value": None})
    return args

uri_query = ArrayType(StructType([
    StructField('key', StringType(), nullable=True),
    StructField('value', StringType(), nullable=True)
]))

udf_parse_query = udf(parse_query, uri_query)
df = df.withColumn("cs_uri_query", udf_parse_query(df["cs_uri_query"]))
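This works, and printSchema() prints something like the shape I want:

df.printSchema()
# root
#  |-- cs_uri_query: array (nullable = true)
#  |    |-- element: struct (containsNull = true)
#  |    |    |-- key: string (nullable = true)
#  |    |    |-- value: string (nullable = true)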
Can someone open my eyes with an amazing solution?