
I would like to copy data from a local CSV file to SQL Server in Azure Data Factory. The table in SQL Server has already been created. The local CSV file was exported from MySQL.

When I use Copy Data in Azure Data Factory, I get the error: "Exception occurred when converting value 'NULL' for column name 'deleted' from type 'String' to type 'DateTime'. The string was not recognized as a valid DateTime."

What I have done:

  1. I checked that the original value in the 'deleted' column is NULL, without quotes (i.e. not the string 'NULL').


  2. I cannot change the data type in the file format settings; the data type for all columns is preset to String by default.


  3. I tried to create a Data Flow instead of using Copy Data. I can change the data type in the source projection, but the sink dataset cannot select SQL Server.

What can I do to copy data from a CSV file to SQL Server via Azure Data Factory?

Can you show us a data value of the deleted column? – Leon Yue

Hi @Wong Benny, if my answer is helpful for you, I hope you can accept it as the answer (click on the check mark beside the answer to toggle it from greyed out to filled in). This can be beneficial to other community members. Thank you. – Leon Yue

1 Answer


Data Flow doesn't support on-premises SQL Server, so we can't create the source or sink there.

You can use the Copy activity or the Copy Data tool to do that. I made some example data in which the deleted column is NULL:
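If the exported CSV literally contains the text NULL in the deleted column (MySQL exports often write nulls that way), you can tell the source dataset to treat that text as a real null instead of a string. Here is a minimal sketch of a DelimitedText source dataset, assuming the local file is read through the File System connector; the linked service, folder, and file names are placeholders:

```json
{
    "name": "SourceCsvDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "FileSystemLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "FileServerLocation",
                "folderPath": "export",
                "fileName": "data.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true,
            "nullValue": "NULL"
        }
    }
}
```

The "nullValue": "NULL" setting makes the service read the bare text NULL as an empty (null) value, so the copy activity no longer tries to parse that string as a DateTime.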

As you said, the deleted column data is NULL (or contains the text 'NULL'), and all columns from the CSV are treated as String. The key is whether your sink SQL Server table schema allows NULL for that column.
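For completeness, here is a sketch of what the Copy activity itself could look like, with the built-in type conversion turned on so the String columns from the CSV are converted to the sink column types; the dataset names are placeholders, and the sink dataset would be a SqlServerTable dataset pointing at your existing table:

```json
{
    "name": "CopyCsvToSqlServer",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceCsvDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SqlServerSinkDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "SqlSink" },
        "translator": {
            "type": "TabularTranslator",
            "typeConversion": true,
            "typeConversionSettings": {
                "allowDataTruncation": true,
                "treatBooleanAsNumber": false
            }
        }
    }
}
```

As long as the deleted column in the SQL Server table is defined as nullable (for example datetime NULL), the null values coming from the CSV are inserted without a conversion error.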

I tested it many times and it all works well.