I have a Dataset with one column, lastModified, of type string in the format "yyyy-MM-ddThh:mm:ss.SSS+0000" (sample data: 2018-08-17T19:58:46.000+0000).

I have to add a new column, lastModif_mapped, of type Timestamp by converting the value of lastModified to the format "yyyy-MM-dd hh:mm:ss.SSS".

I tried the code below, but the new column ends up null:

Dataset<Row> filtered = ds1.select(ds1.col("id"), ds1.col("lastmodified"))
        .withColumn("lastModif_mapped",
                functions.unix_timestamp(ds1.col("lastmodified"), "yyyy-MM-dd HH:mm:ss.SSS")
                        .cast("timestamp"))
        .alias("lastModif_mapped");

Where am I going wrong?


1 Answer

  1. As I answered on your original question, your input String does not match the formats that unix_timestamp(Column s, String p) accepts:

If a string, the data must be in a format that can be cast to a timestamp, such as yyyy-MM-dd or yyyy-MM-dd HH:mm:ss.SSSS

  2. For your case, you need to use to_timestamp(Column s, String fmt):

import static org.apache.spark.sql.functions.to_timestamp;
...
to_timestamp(ds1.col("lastmodified"), "yyyy-MM-dd'T'HH:mm:ss.SSSZ")

Note the literal 'T' and the pattern letter Z, which matches an offset written without a colon (+0000); XXX would expect +00:00 and would also produce null here.

And you don't need an explicit cast to Timestamp, since to_timestamp already returns a timestamp.

  3. When you use withColumn("lastModif_mapped", ...), you don't need the extra alias("lastModif_mapped"): withColumn already creates a new column with the provided name. (As written, the alias is applied to the whole Dataset returned by withColumn, not to the column, so it has no effect on the column name anyway.)
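
Putting the pieces together, here is a minimal sketch of the corrected pipeline. It assumes your ds1 with the columns id and lastmodified from the question, and Spark 2.x (SimpleDateFormat) pattern semantics:

import static org.apache.spark.sql.functions.to_timestamp;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// ds1 is assumed to hold strings like "2018-08-17T19:58:46.000+0000"
// in its "lastmodified" column.
Dataset<Row> filtered = ds1
        .select(ds1.col("id"), ds1.col("lastmodified"))
        // 'T' is a literal; Z matches an offset without a colon, e.g. +0000
        .withColumn("lastModif_mapped",
                to_timestamp(ds1.col("lastmodified"), "yyyy-MM-dd'T'HH:mm:ss.SSSZ"));

filtered.printSchema(); // lastModif_mapped should show as type timestamp

On Spark 3.x the same pattern should still parse; if it does not, setting spark.sql.legacy.timeParserPolicy=LEGACY restores the pre-3.0 SimpleDateFormat behavior.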