I have ADF pipelines that export data (via a copy activity) from an Azure SQL DB to Data Lake Storage (ADLS Gen2) and then from there into another Azure SQL DB. This was working fine until records containing certain characters appeared.
This is how the culprit record looks in the first Azure SQL DB: "Gasunie\
This is how the dataset is set up in ADF to export it into ADLS:
- Column delimiter: pipe (|)
- Row delimiter: auto detect
- Encoding: Default (UTF-8)
- Escape character: Backslash (\)
- Quote character: Double quote (")
This is how the exported file looks in Notepad++ (it's a pipe-delimited file): "\"Gasunie\"
These are the settings for the ADLS dataset in ADF when loading it from ADLS into the Azure SQL DB:
- Column delimiter: comma (,)
- Row delimiter: auto detect
- Encoding: Default (UTF-8)
- Escape character: Backslash (\)
- Quote character: Double quote (")
Note it's comma-delimited now, but that is not causing any problems.
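
In case it is easier to read as dataset properties: the effective DelimitedText settings of the two datasets boil down to roughly the following. This is only a sketch written as a Python dict, since the real datasets are heavily parameterized; the property names (columnDelimiter, quoteChar, escapeChar, encodingName) come from the DelimitedText dataset format as far as I know, and "auto detect" should correspond to leaving rowDelimiter unset.

```python
# A sketch of the effective DelimitedText dataset settings (not the real,
# parameterized dataset JSON); property names follow the DelimitedText
# dataset format as far as I know.
export_dataset_type_properties = {
    "columnDelimiter": "|",   # pipe on the export to ADLS
    "quoteChar": "\"",        # double quote
    "escapeChar": "\\",       # backslash
    "encodingName": "UTF-8",  # default encoding
    # rowDelimiter left unset = "auto detect" in the ADF UI (as I understand it)
}

load_dataset_type_properties = {
    **export_dataset_type_properties,
    "columnDelimiter": ",",   # comma on the load into the second Azure SQL DB
}
```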
But this is how it looks once loaded: "Gasunie"|1|||||||||||... The backslash that was in the original value has somehow stopped the delimiting for the next few columns.
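
To show what I think is happening (this is only Python's csv module applying the same quote and escape rules as an illustration, not ADF's own parser, and I have used the pipe-delimited row from the Notepad++ view): with backslash as the escape character, the trailing \" reads as an escaped quote rather than as a backslash followed by the closing quote, so the parser never sees the end of the field.

```python
import csv

# The culprit row as it appears in the ADLS file, followed by a few made-up
# numeric columns purely for illustration.
line = '"\\"Gasunie\\"|1|2|3'   # raw text:  "\"Gasunie\"|1|2|3

# Parse it with the same quote/escape settings as the dataset:
# quote char = double quote, escape char = backslash.
rows = list(csv.reader([line], delimiter='|', quotechar='"',
                       escapechar='\\', doublequote=False))
print(rows)
# Prints (roughly): [['"Gasunie"|1|2|3']]
# The trailing \" is consumed as an escaped quote, so the quoted field never
# closes and the following delimiters end up inside the first column, which
# matches the symptom in the loaded table.
```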
I have tried many different settings for the quote and escape characters, but they create more problems for other data in the dataset.
Does anyone know how I can correct this without having to ask for the source data to be corrected?
Note: there is a reason the data is staged in ADLS, so a direct copy from one Azure SQL DB to the other isn't an option.
The pipeline is built from a template by consultants and is highly parameterized, so inserting data flows to process the files in ADLS would be a very lengthy process.
Any help appreciated. Thank you.


