<p>Use <code>COPY</code> or <code>\copy</code> to bulk-load CSV data directly into a PostgreSQL table:
<code>COPY table FROM 'path' WITH (FORMAT csv);</code> on the server side, or the client-side <code>\copy table FROM 'path' WITH (FORMAT csv);</code> from psql. Both parse the file and insert rows efficiently.</p>
<h2>When should I choose COPY vs. \copy?</h2>
<p>Use <code>COPY</code> when the server can reach the file, e.g., when it sits on the same host or on mounted storage. Use <code>\copy</code> from <code>psql</code> when the file resides on your workstation or you lack superuser rights (or membership in the <code>pg_read_server_files</code> role).</p>
<h2>What does a minimal command look like?</h2>
<p><code>\copy sales FROM 'sales.csv' WITH (FORMAT csv, HEADER true);</code> reads <em>sales.csv</em>, skips the header row, and writes to the <em>sales</em> table using the default column order.</p>
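The full round trip looks like this; the table layout below is an illustrative assumption (the original only names the <em>sales</em> table, not its columns):

```sql
-- Hypothetical table whose columns match the CSV's column order.
CREATE TABLE sales (
    id      integer,
    sold_at date,
    amount  numeric(10,2)
);

-- Run from psql on the machine where sales.csv lives:
\copy sales FROM 'sales.csv' WITH (FORMAT csv, HEADER true);
```

Because <code>\copy</code> is a psql meta-command, the file path is resolved on the client, not the server.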
<h2>How do I map columns or change the delimiter?</h2>
<p>List target columns after the table name and set <code>DELIMITER</code>. Example: <code>COPY sales(id, amount) FROM 's.csv' WITH (FORMAT csv, DELIMITER ';');</code></p>
<h2>How can I import only a subset of columns?</h2>
<p>Create a staging table matching the CSV layout, import into it, then <code>INSERT ... SELECT</code> the columns you need into the final table. This isolates bad rows and mismatched data types.</p>
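A minimal sketch of the staging approach; the column names (<code>region</code> as the unwanted CSV column) are assumptions for illustration:

```sql
-- Staging table mirrors the CSV exactly, all columns as text.
CREATE TABLE sales_staging (
    id      text,
    region  text,   -- present in the CSV but not needed downstream
    amount  text
);

\copy sales_staging FROM 'sales.csv' WITH (FORMAT csv, HEADER true);

-- Keep only the columns the final table needs, casting as we go.
INSERT INTO sales (id, amount)
SELECT id::integer, amount::numeric
FROM sales_staging;

DROP TABLE sales_staging;
```

If a cast fails here, the error points at the staging data rather than corrupting the target table.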
<h2>What are best practices for large CSV loads?</h2>
<p>Drop indexes and disable constraints and triggers for the duration of the load, then re-create and re-enable them afterward. Increase <code>maintenance_work_mem</code>, wrap the operations in a transaction, and monitor progress with <code>SELECT * FROM pg_stat_progress_copy;</code> (re-run it automatically with psql's <code>\watch</code>).</p>
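A sketch of such a load session in psql, assuming a hypothetical index <code>sales_amount_idx</code> on the <em>sales</em> table:

```sql
BEGIN;
SET LOCAL maintenance_work_mem = '1GB';

DROP INDEX IF EXISTS sales_amount_idx;
ALTER TABLE sales DISABLE TRIGGER ALL;   -- requires appropriate privileges

\copy sales FROM 'sales.csv' WITH (FORMAT csv, HEADER true);

ALTER TABLE sales ENABLE TRIGGER ALL;
CREATE INDEX sales_amount_idx ON sales (amount);
COMMIT;

-- In a second psql session, watch the load:
SELECT * FROM pg_stat_progress_copy;
\watch 1
```

Rebuilding the index once after the load is cheaper than maintaining it row by row, and the single transaction means a failed load leaves no partial data behind.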
<h3>How do I load a compressed CSV without unpacking it first?</h3>
<p>Pipe the decompressed stream into psql and read from standard input:
<code>gunzip -c file.gz | psql -c "\copy ... FROM STDIN WITH (FORMAT csv)"</code>.</p>
<h3>How do I skip bad rows?</h3>
<p>Import into a staging table with every column typed as <code>text</code>, then use <code>INSERT ... SELECT</code> with casts. PostgreSQL has no <code>TRY ... CATCH</code>; the PL/pgSQL equivalent is a <code>BEGIN ... EXCEPTION ... END</code> block. Alternatively, validate the rows in the client before loading.</p>
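A sketch of the exception-handling approach in PL/pgSQL; the table and column names (<code>sales_staging</code>, <code>id</code>, <code>amount</code>) are illustrative assumptions:

```sql
-- Copy rows from the text-typed staging table, skipping rows whose
-- casts fail instead of aborting the whole load.
DO $$
DECLARE
    r record;
BEGIN
    FOR r IN SELECT * FROM sales_staging LOOP
        BEGIN
            INSERT INTO sales (id, amount)
            VALUES (r.id::integer, r.amount::numeric);
        EXCEPTION
            WHEN invalid_text_representation
              OR numeric_value_out_of_range THEN
                RAISE NOTICE 'skipping bad row: %', r;
        END;
    END LOOP;
END $$;
```

Row-by-row exception handling is much slower than a plain <code>INSERT ... SELECT</code>, so reserve it for files where a few bad rows are expected.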