28 votes

I need to export a big table to a CSV file and compress it.

I can export it using the COPY command from Postgres, like this:

COPY foo_table TO '/tmp/foo_table.csv' DELIMITER ',' CSV HEADER;

And then I can compress it using gzip, like this:

gzip -c foo_table.csv > foo.gz

The problem with this approach is that I have to create the intermediate CSV file, which is itself huge, before I get the final compressed file.

Is there a way to export the table as CSV and compress it in one step?

Regards, Sujit

3
If it doesn't necessarily have to be CSV, you could use pg_dump, as in: pg_dump -Z 5 (Joey Adams)

3 Answers

53 votes

The trick is to make COPY send its output to stdout, then pipe the output through gzip:

psql -c "COPY foo_table TO stdout DELIMITER ',' CSV HEADER" \
    | gzip > foo_table.csv.gz
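
For the round trip, the same pipe works in reverse. A minimal sketch, assuming the target table already exists with a matching column layout:

gunzip -c foo_table.csv.gz \
    | psql -c "COPY foo_table FROM stdin DELIMITER ',' CSV HEADER"

psql feeds its standard input to COPY ... FROM stdin, so the data is decompressed and loaded without touching an intermediate file.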
9 votes

Expanding a bit on @Joey's answer: the version below adds support for a couple more features from the manual.

psql -c "COPY \"Foo_table\" (column1, column2) TO stdout DELIMITER ',' CSV HEADER" \
    | gzip > foo_table.csv.gz

If you have capital letters in your table name (woe betide you), you need the \" before and after the table name.

The second thing I've added is a column list.

Also note from the docs:

This operation is not as efficient as the SQL COPY command because all data must pass through the client/server connection. For large amounts of data the SQL command might be preferable.
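
That quote is from the documentation for psql's \copy meta-command, the client-side counterpart of COPY. For reference, a sketch of the equivalent \copy one-liner, reusing the same hypothetical table and column names as above:

psql -c "\copy \"Foo_table\" (column1, column2) TO stdout DELIMITER ',' CSV HEADER" \
    | gzip > foo_table.csv.gz

Like COPY ... TO stdout, this streams the rows over the normal connection, so it works without superuser rights.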

7 votes

You can do it directly, as per the docs (https://www.postgresql.org/docs/9.4/sql-copy.html):

COPY foo_table TO PROGRAM 'gzip > /tmp/foo_table.csv.gz' DELIMITER ',' CSV HEADER;
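
Note that PROGRAM runs gzip on the database server, not on your client, so the file lands in the server's /tmp, and it requires superuser rights (or, on newer versions, membership in the pg_execute_server_program role). A sketch of invoking it from the shell, where mydb is a placeholder database name:

# mydb is a placeholder; COPY ... TO PROGRAM runs server-side and needs elevated privileges
psql -d mydb -c "COPY foo_table TO PROGRAM 'gzip > /tmp/foo_table.csv.gz' DELIMITER ',' CSV HEADER"

COPY ... TO PROGRAM is available from PostgreSQL 9.3 onward.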