I have a timestamp in YYYY-MM-DD XX:XX:XX format in a csv file stored in S3, but when I use Glue to load it into a Redshift table with a timestamp column, that column comes out null. The format appears to be valid, but I've also tried YYYYMMDD XXXXXX and YYMMDD XX:XX:XX formats just in case.
My mapping in Glue goes from timestamp to timestamp, and the column's data type in the target table is also timestamp. Example row from the csv:
1,2016 Summer,2016-06-22 00:00:00
Actual Output:
Line | Term | Date
-----+-------------+------------
1 | 2016 Summer |
Expected Output:
Line | Term | Date
-----+-------------+---------------------
1 | 2016 Summer | 2016-06-22 00:00:00
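For what it's worth, the literal value from the row above parses cleanly with java.time using the standard yyyy-MM-dd HH:mm:ss pattern, so as far as I can tell the format itself is fine (this is just a local sanity check, outside Glue):

```scala
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

object ParseCheck extends App {
  // Exact value from the csv row above
  val raw = "2016-06-22 00:00:00"
  // Standard pattern for a plain "date time" timestamp
  val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
  // Throws DateTimeParseException if the value didn't match the pattern
  val parsed = LocalDateTime.parse(raw, fmt)
  println(parsed)
}
```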
This seems like it should be a straightforward task, but I can't get it right, so if anyone can spot my mistake(s), it would be greatly appreciated.
Code:
// Read the source table from the Glue Data Catalog
val datasource37 = glueContext
  .getCatalogSource(database = "data", tableName = "term", redshiftTmpDir = "", transformationContext = "datasource37")
  .getDynamicFrame()

// Map source columns/types to target columns/types
val applymapping37 = datasource37.applyMapping(
  mappings = Seq(
    ("id", "bigint", "id", "bigint"),
    ("name", "string", "name", "varchar(256)"),
    ("date", "timestamp", "date_start", "timestamp")),
  caseSensitive = false,
  transformationContext = "applymapping37")

// Split any ambiguous (choice) columns into separate typed columns
val resolvechoice37 = applymapping37.resolveChoice(choiceOption = Some(ChoiceOption("make_cols")), transformationContext = "resolvechoice37")

// Drop null fields before writing
val dropnullfields37 = resolvechoice37.dropNulls(transformationContext = "dropnullfields37")

// Write the result to Redshift via JDBC
val datasink37 = glueContext
  .getJDBCSink(catalogConnection = "dataConnection", options = JsonOptions("""{"dbtable": "term", "database": "data"}"""), redshiftTmpDir = args("TempDir"), transformationContext = "datasink37")
  .writeDynamicFrame(dropnullfields37)
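One thing I haven't ruled out (this is a guess, not something I've confirmed): if the crawler classified the csv column as string rather than timestamp, then declaring the source type as "timestamp" in applyMapping wouldn't match anything and the column could come through null. The variant I'd try would declare the source type as "string" so Glue performs the cast itself:

```scala
// Hypothetical variant of the mapping: source type declared as "string"
// so Glue casts the raw text to a Redshift timestamp, instead of
// assuming the catalog already typed the column as timestamp.
val mappings = Seq(
  ("id", "bigint", "id", "bigint"),
  ("name", "string", "name", "varchar(256)"),
  ("date", "string", "date_start", "timestamp") // source "string", target "timestamp"
)
```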