I want to create a simple Azure Data Factory pipeline that reads a CSV file from Blob Storage and writes it to an Azure SQL Database using data flows.
The source dataset has a column named "myTime" of type "string".
I added a "Derived Column" transformation that creates a new column named "customTime" with the expression "currentTimestamp()".
Finally, in the SQL sink, I mapped "customTime" to a DateTime column in the database.
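For reference, the data flow script for this setup looks roughly like the following. The stream names (source1, derivedColumn1, sqlsink) and the database column name (MyDateTimeField) are placeholders, and I'm writing the mapColumn part from memory, so the exact syntax may differ:

```
source(output(
        myTime as string
    ),
    allowSchemaDrift: true,
    validateSchema: false) ~> source1
source1 derive(customTime = currentTimestamp()) ~> derivedColumn1
derivedColumn1 sink(allowSchemaDrift: true,
    validateSchema: false,
    mapColumn(
        MyDateTimeField = customTime
    )) ~> sqlsink
```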
When I preview the data in the data flow, everything looks fine: I can see both fields (myTime, customTime). But when I debug the pipeline, I get the following exception:
Activity myActivity failed: DF-SYS-01 at Sink 'sqlsink': java.sql.BatchUpdateException: Invalid column name 'myTime'
Any idea why the SQL sink is referencing "myTime" instead of "customTime"? I don't see any reference to "myTime" anywhere except as part of the input schema.
Thank you very much and best regards,
Michael