3
votes

It is not really clear how to copy a CSV file containing Unix timestamp values into Postgres. For example, let's say that I have a CSV file that looks like this:

timestamp;column_1;column_2;column_3
1439935260;"A";"B";"C"
1439935261;"A";"B";"C"
1439935262;"A";"B";"C"

I also have a database which has the following columns:

timestamp    | timestamp without time zone | 
column_1     | character varying(40)       | 
column_2     | character varying(40)       | 
column_3     | character varying(40)       |

I would expect the command

\copy my_table FROM 'my_file.csv' DELIMITER ';' CSV HEADER;

to work, but it doesn't. Instead, it fails with the error

ERROR: date/time field value out of range: "1439935260"

What is the proper way to copy CSV files with unix timestamp values into a table?

1
You can't copy that directly. You need to put it into a different table first, and then convert the "unix" value into a proper timestamp - a_horse_with_no_name
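That staging-table route can be sketched as follows. The staging table name `my_table_staging` is a hypothetical name, and the column types are assumed from the schema in the question:

```sql
-- Staging table: hold the raw epoch value as a plain bigint,
-- so COPY never has to parse it as a date/time.
CREATE TABLE my_table_staging (
    "timestamp" bigint,
    column_1    character varying(40),
    column_2    character varying(40),
    column_3    character varying(40)
);

-- \copy my_table_staging FROM 'my_file.csv' DELIMITER ';' CSV HEADER;

-- Convert the epoch seconds while moving the rows across.
INSERT INTO my_table ("timestamp", column_1, column_2, column_3)
SELECT to_timestamp("timestamp"), column_1, column_2, column_3
FROM my_table_staging;
```

`to_timestamp()` returns a `timestamp with time zone`; assigning it to the `timestamp without time zone` column converts it using the session's time zone setting.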
If it's not much data, you can create a BEFORE INSERT trigger on that table, whose function would do NEW."timestamp" = to_timestamp(NEW."timestamp"), if the purpose of that table is to always be used for such COPY imports - Vao Tsun
@VaoTsun: the trigger hack is not possible, since in this case the COPY is aborted while parsing the input, before the BEFORE INSERT stage is ever reached. - Daniel Vérité

1 Answer

0
votes

Check out my answer on Database Administrators.

I suggest

  1. making the column type abstime if it's a new table, then
  2. running an ALTER TABLE statement when you're done to change the column to a regular timestamp type.
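A sketch of those two steps, using the table and column names from the question. This assumes the abstime input routine accepts raw epoch values on your server version (worth verifying); note that the abstime type was removed entirely in PostgreSQL 12, so this only applies to older releases:

```sql
-- Step 1: create the new table with the epoch column typed as abstime.
CREATE TABLE my_table (
    "timestamp" abstime,
    column_1    character varying(40),
    column_2    character varying(40),
    column_3    character varying(40)
);

-- \copy my_table FROM 'my_file.csv' DELIMITER ';' CSV HEADER;

-- Step 2: once loaded, convert the column to a regular timestamp.
ALTER TABLE my_table
    ALTER COLUMN "timestamp" TYPE timestamp without time zone;
```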