Background: I run end-to-end tests that depend on a PostgreSQL database. I need these tests to be fast, parallelizable, deterministic, and isolated.
Therefore, I want to repeatedly recreate the database -- including schema, data, and indexes -- as quickly as possible. I create a base image:
- start a PostgreSQL instance
- run the SQL statements that create the schema, data, and indexes
- run VACUUM FULL
- stop the instance
- tar the files in /var/lib/postgresql/data
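The build steps above, as a shell sketch. It assumes pg_ctl and psql are on PATH and that default connection settings work; /var/lib/postgresql/data, /tmp/pg-base-image.tar, and schema.sql are placeholders for my actual paths:

```shell
#!/bin/sh
# Base-image build sketch. Paths and schema.sql are placeholders;
# adjust them to your own setup.
set -eu
PGDATA=/var/lib/postgresql/data
IMAGE=/tmp/pg-base-image.tar

# Bail out gracefully where no PostgreSQL installation is available.
command -v pg_ctl >/dev/null 2>&1 || { echo "pg_ctl not found"; exit 0; }
[ -d "$PGDATA" ] || { echo "no cluster at $PGDATA"; exit 0; }

pg_ctl -D "$PGDATA" -w start              # start the instance
psql -v ON_ERROR_STOP=1 -f schema.sql     # create schema, data, indexes
psql -c 'VACUUM FULL'                     # rewrite tables at minimal size
pg_ctl -D "$PGDATA" -w stop               # clean shutdown before copying
tar -C "$PGDATA" -cf "$IMAGE" .           # archive the data directory
```

The clean shutdown before tarring matters: it guarantees the copied files are consistent without needing WAL replay on restore.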
For each test I then quickly untar that image and run a PostgreSQL instance.
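The per-test restore looks roughly like this; each test unpacks the image into its own temporary directory and starts a throwaway instance on its own port, which is what keeps the tests isolated and parallelizable. The image path and port 54331 are placeholders:

```shell
#!/bin/sh
# Per-test restore sketch: private data directory, private port.
set -eu
IMAGE=/tmp/pg-base-image.tar
command -v pg_ctl >/dev/null 2>&1 || { echo "pg_ctl not found"; exit 0; }
[ -f "$IMAGE" ] || { echo "base image missing"; exit 0; }

TESTDATA=$(mktemp -d)
tar -C "$TESTDATA" -xf "$IMAGE"
chmod 700 "$TESTDATA"                     # postgres rejects looser permissions
pg_ctl -D "$TESTDATA" -o "-p 54331" -w start
# ... run the test against port 54331 here ...
pg_ctl -D "$TESTDATA" -w stop
rm -rf "$TESTDATA"                        # every test gets a fresh copy
```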
Problem: This all works, but the files on disk seem larger than necessary: even a rather small database still produces a 64 MB archive.
How can I achieve a smaller set of files for file-level restores? Shrink the size of the existing files? Exclude some files from the backup?
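To illustrate the "exclude some files" idea: my understanding is that after a clean shutdown, postmaster.pid is already gone and the server log is not needed for a restore, so both can be excluded defensively (paths are placeholders; I am not sure which other files are safe to drop):

```shell
#!/bin/sh
# Sketch: archive the data directory while skipping files that are
# recreated on startup. Paths are placeholders.
set -eu
PGDATA=/var/lib/postgresql/data
IMAGE=/tmp/pg-base-image.tar
[ -d "$PGDATA" ] || { echo "no cluster at $PGDATA"; exit 0; }

tar -C "$PGDATA" \
    --exclude=postmaster.pid \
    --exclude='log/*' \
    -cf "$IMAGE" .
```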
…template0 and postgres. It won't get much smaller than that. – Laurenz Albe