1 vote

I am attempting to copy data into Redshift from an S3 bucket, but I am getting error code 1204, 'Char length exceeds DDL length'.

copy table_name from '[data source]'
access_key_id '[access key]'
secret_access_key '[secret access key]'
region 'us-east-1'
null as 'NA'
delimiter ','
removequotes;

The error occurs on the very first row, where it tries to put the state abbreviation 'GA' into the data_state column, which is defined as char(2). When I query the stl_load_errors table I get the following result:

line_number  colname      col_length  type   raw_field_value  err_code  err_reason
1            data_state   2           char   GA               1204      Char length exceeds DDL length

As far as I can tell that shouldn't exceed the length: 'GA' is two characters and the column is char(2). Does anyone know what could be causing this?
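For reference, this is roughly the query I used to pull that row out of stl_load_errors (trimming because those columns are fixed-width char, and ordering by starttime to get the latest load):

select line_number, colname, col_length, type,
       raw_field_value, err_code, trim(err_reason) as err_reason
from stl_load_errors
order by starttime desc
limit 1;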


3 Answers

0 votes

Got it to work by changing the data type to char(3) instead, though I'm still not sure why char(2) wouldn't work.

0 votes

Mine did this as well, also for a state column. Redshift defaults char to char(1), so I had to specify char(2) explicitly. Are you sure yours didn't default back to char(1)? Mine did.
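In other words, make sure the length is written out in the DDL. A minimal sketch (the table name and the other column are just placeholders):

create table table_name (
    data_city  varchar(50),   -- placeholder column
    data_state char(2)        -- a bare "char" here would silently mean char(1)
);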

0 votes

Open the file up with a hex editor, or use an online one, and look at the GA value in the data_state column.

If it has three dots before it like so:

...GA

Then the file was saved (or was originally created) as UTF-8-BOM rather than plain UTF-8, and that three-byte byte order mark at the start of the first field is what pushes the value past char(2).
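You can also confirm it from Redshift itself by comparing character length against byte length for the raw field in stl_load_errors; something along these lines should show the extra hidden bytes (the trim just strips the char padding):

select colname,
       raw_field_value,
       len(trim(raw_field_value))          as char_len,
       octet_length(trim(raw_field_value)) as byte_len
from stl_load_errors
order by starttime desc
limit 1;

If byte_len comes back larger than the two characters you can see, there are invisible bytes (the BOM) sitting in front of the value.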

To fix it, you can open the file in something like Notepad++, go to Encoding in the top bar, and select Convert to UTF-8 (the option without the BOM), then re-upload the file and run the COPY again.