
I'm using COPY INTO to bulk insert some data into MonetDB. The CSV file I'm copying from has some special characters like \ in some string fields, and it broke the insertion. The error that I got is:

Failed to import table line 1 field 11 'clob' expected in 'data here'

When I replace \ with /, it works fine. I think the problem is that \ is used as an escape character, but I'm not sure!

This is the command I used for the bulk load:

copy into dbtest.dbo.table3 from '/home/etltest/out.csv' using delimiters ';','\n','"';

And this is a sample row:

"452475";"0047748";"002";"2014-01-01 00:16:00.000";"2";"2014-01-01 00:16:16.090";"1";"1";"0";"testfile.xml";"~/uploads/Su/F0047748\2014\1";"3d67502e-94ed-4e3d";"2014-01-01 00:15:25.283"

I found a workaround that uses the REPLACE function in SQL to replace \ with /, but this is too heavy a processing step when handling millions of rows.

Why is this happening, and is there any way other than replacing the \?

Could you please provide a minimal example with some of your data and the COPY INTO line you used? It should be possible to work around this. – Hannes Mühleisen
I'm using this COPY INTO line: copy into dhpo.dbo.submissiontransactions3 from '/home/etltest/out.csv' using delimiters ';','\n','"'; – Saed Hammad

And a sample line: "452475";"0047748";"002";"2014-01-01 00:16:00.000";"2";"2014-01-01 00:16:16.090";"1";"1";"0";"testfile.xml";"~/uploads/Su/F0047748\2014\1";"3d67502e-94ed-4e3d";"2014-01-01 00:15:25.283" – Saed Hammad

When I replace the part "~/uploads/Su/F0047748\2014\1" with "~/uploads/Su/F0047748/2014/1", the bulk insert works fine. – Saed Hammad
Hi @SaedHammad, please edit your question rather than writing in comments. Please give a sample line that provokes the error; the text "data here" is nowhere to be found within your sample line. Thanks. – Anthony Damico

1 Answer


You're right: '\' is an escape character. To fix the import error, you need to double it, i.e. replace every '\' occurrence in your CSV file with '\\'.

Doing this with the SQL REPLACE function will be costly. If possible, double the '\' in your CSV file directly, or change the tool that generates the CSV file to take care of this.
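As a minimal sketch of the preprocessing approach, here is one way to double the backslashes before running COPY INTO. This is not MonetDB-specific code; the file names are placeholders, and it assumes the file fits a simple line-by-line rewrite:

```python
def escape_backslashes(src_path, dst_path):
    """Copy a CSV file, doubling every '\' so MonetDB's COPY INTO
    reads it as a literal backslash rather than an escape character."""
    with open(src_path, "r", encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        for line in src:
            dst.write(line.replace("\\", "\\\\"))

# Hypothetical paths -- adjust to your setup:
# escape_backslashes("/home/etltest/out.csv", "/home/etltest/out_escaped.csv")
```

A shell one-liner would do the same job without Python, for example: sed 's/\\/\\\\/g' out.csv > out_escaped.csv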