We have fifteen embedded newline characters in a field of a source S3 file. The target column in Redshift is VARCHAR(5096), and the field in the source file is 5089 bytes long. As required by the ESCAPE option of the COPY command, we escape each of the fifteen newline characters with a backslash (\). Our expectation is that, with the ESCAPE option, the backslash we inserted before each newline character will be stripped before the data is loaded into the target in Redshift. However, when we run COPY with the ESCAPE option, we get:

err_code: 1204 - String length exceeds DDL length

Is there a way to keep the added backslash characters from being counted against the target column length in Redshift?
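For reference, the escaping step we describe can be sketched as below (the field contents here are stand-ins; only the byte counts match our case, and the 5104-byte result is simply 5089 plus the 15 added backslashes):

```python
# Sketch of the preprocessing step (not our actual pipeline): prefix each
# embedded newline with a backslash, as COPY ... ESCAPE expects, then
# check the resulting byte length against the column width.
TARGET_WIDTH = 5096  # VARCHAR(5096) in the Redshift DDL

def escape_newlines(field: str) -> str:
    """Escape embedded newlines for COPY ... ESCAPE."""
    return field.replace("\n", "\\\n")

# A stand-in field: 5089 bytes total, 15 of which are newline characters.
raw = ("x" * 338 + "\n") * 15 + "x" * (5089 - 15 * 339)
assert len(raw.encode("utf-8")) == 5089

escaped = escape_newlines(raw)
print(len(escaped.encode("utf-8")))                 # 5104
print(len(escaped.encode("utf-8")) > TARGET_WIDTH)  # True: over 5096
```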
Note: when we truncated the above source field to 4000 bytes and inserted the backslash before each newline character, COPY with the ESCAPE option loaded the field into Redshift successfully. Also, the backslash characters were not loaded into Redshift, as expected.