How to BULK INSERT in PostgreSQL

Galaxy Glossary

How do I bulk load data into PostgreSQL like SQL Server BULK INSERT?

Use the COPY command to load large external files directly into a PostgreSQL table with minimal logging and maximal speed.



Why use COPY instead of INSERT?

INSERT parses, plans, and writes WAL for each row individually; COPY streams the whole file through a single command, eliminating per-row overhead, and can skip WAL entirely when the table is created or truncated in the same transaction under wal_level = minimal. Expect 10-100× speed-ups for million-row loads.
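As a minimal sketch of the difference, using the same Products table and file path as the examples later in this article:

-- Row at a time: each statement is parsed, planned, and logged separately
INSERT INTO Products (id, name, price, stock) VALUES (1, 'Widget', 9.99, 120);
INSERT INTO Products (id, name, price, stock) VALUES (2, 'Gadget', 14.50, 80);
-- ...repeated for every row...

-- Bulk load: one command streams the entire file
COPY Products (id, name, price, stock)
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true);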

What permissions are required?

Reading server-side files with COPY ... FROM 'filename' requires superuser or membership in the pg_read_server_files role. For \copy run from a client workstation, ordinary INSERT privileges on the target table are enough.
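If a dedicated loader role needs to run server-side COPY without full superuser rights, granting it the built-in role is enough; the etl_loader name below is hypothetical:

-- Let etl_loader read files that live on the database server
GRANT pg_read_server_files TO etl_loader;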

Which file formats can I load?

COPY has three formats: text (delimited), CSV, and binary, and it can read from a server-side file, the output of a program (PROGRAM), or the client connection (STDIN). CSV is safest for cross-platform workflows because it offers explicit NULL and quote handling.
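For instance, a tab-delimited text export that uses empty strings for NULLs could be loaded like this; the .tsv path and NULL marker are assumptions:

-- Text format with an explicit tab delimiter and empty-string NULLs
COPY Products (id, name, price, stock)
FROM '/var/lib/postgresql/import/products.tsv'
WITH (FORMAT text, DELIMITER E'\t', NULL '');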

How do I bulk load a CSV into Products?

1. Place products.csv on the server.
2. Run the COPY command shown below.
3. Verify row count with SELECT COUNT(*) FROM Products;

How do I handle dates and numbers?

Format your CSV columns exactly as PostgreSQL expects (ISO-8601 dates, unformatted numbers). Alternatively, load into a staging table of text columns, then cast into the final schema.
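A sketch of the staging-table pattern, with a hypothetical products_staging table feeding the Products table used elsewhere in this article:

-- 1. Land everything as text so COPY never fails on formatting
CREATE TABLE products_staging (
    id    text,
    name  text,
    price text,
    stock text
);

COPY products_staging
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true);

-- 2. Cast into the final schema
INSERT INTO Products (id, name, price, stock)
SELECT id::int, name, price::numeric, stock::int
FROM products_staging;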

What options improve performance?

Drop (or postpone creating) indexes and foreign keys until after the load, raise maintenance_work_mem so index rebuilds are fast, and use ALTER TABLE ... SET UNLOGGED for transient staging tables, as sketched below.
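A possible sequence around the COPY from the next section; the index name is hypothetical:

-- Make the post-load index rebuild faster (session-level setting)
SET maintenance_work_mem = '1GB';

-- Drop secondary indexes before loading
DROP INDEX IF EXISTS products_name_idx;

-- Skip WAL entirely for a transient staging table
ALTER TABLE products_staging SET UNLOGGED;

-- ...run the COPY shown in the next section...

-- Recreate the index once the data is in place
CREATE INDEX products_name_idx ON Products (name);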

Step-by-step COPY FROM example

COPY Products(id, name, price, stock)
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true, DELIMITER ',', QUOTE '"');

How do I load from my laptop?

Use \copy in psql. The client reads the file and streams it to the server, bypassing the need for server-side file access.
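Inside psql the command mirrors server-side COPY, except the path points at your local machine; the path below is hypothetical:

-- Runs in psql; the client reads the local file and streams it to the server
\copy Products(id, name, price, stock) FROM '/home/me/downloads/products.csv' WITH (FORMAT csv, HEADER true)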

What errors should I watch for?

Typical failures include bad delimiters, wrong column counts, and data-type mismatches. Validate the format against a small sample of the file first, or load through a text-only staging table so malformed rows can be found and fixed with SQL.
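On PostgreSQL 17 or newer, COPY itself can skip rows that fail type conversion; a sketch, not available on earlier versions:

-- PostgreSQL 17+: keep loading past rows whose values fail to convert
COPY Products (id, name, price, stock)
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true, ON_ERROR ignore);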

Best practices checklist

• Validate the file schema.
• Load into staging.
• Analyze the table after load.
• Re-enable constraints and indexes.
• Vacuum if you used UNLOGGED.
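For the Products example, the tail of the checklist above might look like this; the constraint name is hypothetical:

-- Refresh planner statistics after the load
ANALYZE Products;

-- Re-validate a foreign key that was added as NOT VALID before the load
ALTER TABLE Products VALIDATE CONSTRAINT products_category_fk;

-- Vacuum the staging table if it was loaded as UNLOGGED
VACUUM ANALYZE products_staging;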

Why How to BULK INSERT in PostgreSQL is important

Row-by-row INSERTs make large loads painfully slow. Knowing how to use COPY (and its client-side \copy counterpart) turns million-row ETL jobs, migrations, and analytics backfills from hours into minutes, which is why it is the closest PostgreSQL equivalent to SQL Server's BULK INSERT.

How to BULK INSERT in PostgreSQL Example Usage


-- Load 100,000 new products in one shot
COPY Products(id, name, price, stock)
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true);

-- Verify
SELECT COUNT(*) FROM Products;

How to BULK INSERT in PostgreSQL Syntax


COPY table_name [( column_list )]
FROM { 'filename' | PROGRAM 'command' | STDIN }
[ WITH ( 
    FORMAT { csv | text | binary },
    DELIMITER 'character',
    NULL 'string',
    HEADER { true | false },
    QUOTE 'character',
    ESCAPE 'character',
    FREEZE { true | false }
) ];

Example loading a CSV into Orders:
COPY Orders(id, customer_id, order_date, total_amount)
FROM '/import/orders_2023.csv'
WITH (FORMAT csv, HEADER true, DELIMITER ',', QUOTE '"');

Common Mistakes

• Forgetting HEADER true, so the header row is loaded as data.
• Running server-side COPY without superuser or pg_read_server_files membership and hitting a permission error.
• Leaving indexes and foreign keys in place during very large loads, which slows them down dramatically.
• Date or number formats in the file that do not match what PostgreSQL expects.

Frequently Asked Questions (FAQs)

Can I commit in batches?

No. Each COPY runs as a single transaction. If you need chunked commits, split the input into several files and load each with its own COPY, or use a client-side loader script that does the same.
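A sketch of the chunked approach in psql, with hypothetical part-file names; each \copy commits on its own:

\copy Products(id, name, price, stock) FROM 'products_part_01.csv' WITH (FORMAT csv, HEADER true)
\copy Products(id, name, price, stock) FROM 'products_part_02.csv' WITH (FORMAT csv)
\copy Products(id, name, price, stock) FROM 'products_part_03.csv' WITH (FORMAT csv)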

How do I skip the header row?

Add HEADER true in the WITH clause when loading CSV files.

Is COPY safe in WAL-archived clusters?

Yes. COPY is fully WAL-logged, so it works under WAL archiving and streaming replication. If you do not want ordinary triggers to fire during the load, for example because the same changes are already being replicated logically from elsewhere, consider SET session_replication_role = replica for the loading session.
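A sketch of that session setting wrapped around the load; note it silences ordinary triggers until it is reset:

-- Ordinary triggers are skipped for this session until RESET
SET session_replication_role = replica;

COPY Products (id, name, price, stock)
FROM '/var/lib/postgresql/import/products.csv'
WITH (FORMAT csv, HEADER true);

RESET session_replication_role;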
