pg_dump lets you export a single schema, including its objects, data, and privileges, into a portable file.
Exporting a schema lets you back up structure and data, migrate between environments, or share a subset of a database without exposing everything.
Use pg_dump. It ships with PostgreSQL and supports schema-only or full exports in plain SQL or compressed formats.
Run pg_dump from the command line with -n to target one schema and -f to write to a file. Add -F c for a compressed dump that pg_restore can load in parallel.
-h – host
-p – port
-U – user
-d – database
-n – schema name
-F – format (c = custom, p = plain)
-f – output file
--no-owner – omit ownership statements
--no-acl – omit GRANT/REVOKE

Assume a database shopdb with schema ecommerce holding Customers, Orders, Products, and OrderItems.
pg_dump -h localhost -U postgres -d shopdb \
-n ecommerce -F c -f ecommerce.dump
The file ecommerce.dump contains tables, data, indexes, and constraints. Restore with pg_restore -d targetdb ecommerce.dump.
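For larger schemas, a custom-format dump can be restored with several parallel jobs. A minimal sketch, assuming a local server; staging_db and the job count are illustrative:

```shell
# Create an empty target database, then restore the ecommerce dump
# using 4 parallel jobs (-j only works with custom/directory formats).
createdb staging_db
pg_restore -d staging_db -j 4 ecommerce.dump
```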
pg_dump -h localhost -U postgres -d shopdb \
-n ecommerce --schema-only -f ecommerce_schema.sql
The resulting SQL recreates tables and other objects but omits rows.
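A plain schema-only dump like this is replayed with psql; targetdb here is a placeholder name:

```shell
# ON_ERROR_STOP makes psql abort on the first error
# instead of continuing past failed statements.
psql -d targetdb -v ON_ERROR_STOP=1 -f ecommerce_schema.sql
```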
Remove --no-owner and --no-acl so the dump records GRANT/REVOKE and ownership statements. Ensure the user performing the restore has permission to apply them.
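One quick way to confirm that privileges made it into a plain-format dump is to scan the file for the relevant statements (the file name matches the earlier example):

```shell
# List privilege and ownership statements recorded in the dump.
grep -E '^(GRANT|REVOKE|ALTER .* OWNER TO)' ecommerce_schema.sql
```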
Use -F c for faster, parallel restores.

Mistake: pg_dump shopdb > db.sql dumps the entire database. Fix: add -n ecommerce.
Mistake: using --no-acl --no-owner when you need roles. Fix: remove those switches or run pg_dumpall --roles-only separately.
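The two-step workflow when roles are needed might look like this sketch, assuming superuser access on both servers; targetdb is a placeholder:

```shell
# 1. Dump global role definitions (roles are cluster-wide, not per-database).
pg_dumpall -U postgres --roles-only > roles.sql

# 2. Dump the schema itself as usual.
pg_dump -U postgres -d shopdb -n ecommerce -F c -f ecommerce.dump

# 3. Restore roles first, then the schema, so GRANTs find their grantees.
psql -d postgres -f roles.sql
pg_restore -d targetdb ecommerce.dump
```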
pg_restore – load a custom-format dump
psql -f file.sql – run a plain SQL dump

Test the dump by restoring to a staging database before overwriting production. Schedule regular exports with cron or CI pipelines.
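A scheduled export could be a single crontab entry; the path and time are placeholders, and a dated file name keeps earlier dumps around:

```shell
# crontab entry: dump the ecommerce schema every night at 02:00.
# Note that % must be escaped as \% inside crontab lines.
0 2 * * * pg_dump -h localhost -U postgres -d shopdb -n ecommerce -F c -f /backups/ecommerce_$(date +\%F).dump
```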
Does pg_dump block other queries while it runs? No. pg_dump takes a consistent snapshot using MVCC, so reads continue uninterrupted. Long-running writes may bloat WAL but do not block.
Can I dump more than one schema at once? Yes. Supply multiple -n options: -n ecommerce -n marketing. Each schema goes into the same dump file.
How do I compress a plain-format dump? Pipe the output: pg_dump -n ecommerce shopdb | gzip > ecommerce.sql.gz.
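To load such a gzipped plain dump, stream it back through psql rather than decompressing to disk first; targetdb is a placeholder:

```shell
# -c writes decompressed SQL to stdout, which psql executes directly.
gunzip -c ecommerce.sql.gz | psql -d targetdb
```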