Batch-uploading CSV files into Supabase means importing large, comma-separated datasets into your Supabase Postgres database in one operation using tools like the dashboard, CLI, psql, or SQL COPY.
Batch Uploading CSV Files into Supabase
Learn how to move data from flat files into your Supabase Postgres instance quickly and safely using the dashboard, CLI, psql, or SQL COPY—plus tips for avoiding common pitfalls.
Batch uploading is the process of importing an entire comma-separated values (CSV) file—sometimes millions of rows—into a database table in a single operation. Supabase, which exposes a fully-managed Postgres instance, supports several pathways for bulk import, each leveraging the underlying PostgreSQL COPY command under the hood.
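At its simplest, that primitive looks like the statement below. This is only a reference sketch of the bare command rather than something you would normally run against a hosted instance as-is; the table name and server-side file path are placeholders.

COPY public.my_table FROM '/path/visible/to/the/database/server/data.csv'
WITH (FORMAT csv, HEADER true);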
Modern data engineering pipelines frequently start with flat files exported from transactional systems, SaaS tools, or data providers. Getting that data into a relational database like Postgres enables efficient querying, joins, aggregation, and downstream analytics. A slow or error-prone import wastes engineering time and can stall projects reliant on fresh data.
Supabase gives you four main ways to ingest CSVs:
- The dashboard Table Editor – point-and-click CSV import.
- The Supabase CLI – scriptable and CI-friendly.
- The psql client – direct Postgres access for power users.
- SQL COPY – can be executed in any SQL editor (e.g., Galaxy).

Either design the schema manually or let Supabase infer it with the Table Editor. For large files, pre-creating the table avoids type-guessing surprises.
CREATE TABLE public.orders (
order_id uuid PRIMARY KEY,
customer_id uuid NOT NULL,
order_date timestamptz,
total_amount numeric(12,2)
);
The simplest path is the dashboard Table Editor, which can import a CSV straight from the browser and infer the schema for you. Limitations: file size capped at 50 MB, no transactional guarantees, fewer tuning knobs.
For scripted or repeated imports, reach for the Supabase CLI:

# Install once: npm i -g supabase
supabase login
supabase link --project-ref YOUR_PROJECT_REF

# The CLI has no dedicated CSV import command, so run the actual load
# through psql's \copy using the project's connection string
psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" \
  -c "\copy public.orders FROM 'orders.csv' CSV HEADER"

Every step is a plain shell command, so the workflow is easy to wrap in a script or CI/CD pipeline; the data still streams to Postgres through the COPY protocol.
For maximum control, use psql and \copy directly. Connect to the database URL (available in Project Settings → Database), then:
psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" -c \
"\copy public.orders FROM '/absolute/path/orders.csv' CSV HEADER"
\copy streams the file from your local machine to the server. Use standard COPY options like DELIMITER, NULL, QUOTE, or ENCODING.
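For instance, a file that uses semicolons as separators, blank strings for NULL, and Latin-1 text could be loaded as shown below; the specific option values are illustrative assumptions about the file, not requirements.

\copy public.orders FROM '/absolute/path/orders.csv' WITH (FORMAT csv, HEADER true, DELIMITER ';', NULL '', QUOTE '"', ENCODING 'LATIN1')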
COPY also works from a graphical SQL editor such as Galaxy: upload the file to a storage bucket or make it publicly accessible, then run:
-- File is stored in Supabase Storage at the public URL below
COPY public.orders (order_id, customer_id, order_date, total_amount)
FROM 'https://YOUR_PROJECT.supabase.co/storage/v1/object/public/imports/orders.csv'
WITH (FORMAT csv, HEADER true);
This lets you keep the entire workflow in SQL, which helps with reproducibility and code review. One caveat: stock Postgres COPY reads files from the database server's own filesystem and does not fetch HTTP URLs by itself, so confirm the file is actually reachable from your project's server before relying on this pattern; otherwise fall back to the \copy approach above.
A few practices keep large imports fast and predictable:
- Disable triggers for the duration of the load with ALTER TABLE ... DISABLE TRIGGER ALL; and re-enable them afterwards.
- Stage raw data in TEMP tables followed by an INSERT ... SELECT for type coercion or deduplication.
- Wrap COPY inside a transaction: BEGIN; COPY ...; COMMIT;
- Raise or disable statement_timeout in the session when copying millions of rows.

Common pitfalls to watch for:
- Dates in ambiguous formats (e.g., MM/DD/YY) fail silently into NULL. Clean them upfront.
- Fields with embedded quotes need QUOTE '"' and ESCAPE '"', or pre-clean the file.
- Monitor long-running loads with pg_stat_progress_copy (requires Postgres 14+).
- Version-control COPY scripts alongside schema migrations.
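To make several of these tips concrete, here is one possible shape for a staged load run through psql. It is a sketch only: the staging table name, the casts, and the zero timeout are assumptions for illustration, not a prescribed Supabase workflow.

BEGIN;
SET LOCAL statement_timeout = 0;  -- lift the timeout for this transaction only

-- Stage raw rows as text so the load never fails on a single bad value
CREATE TEMP TABLE orders_staging (
  order_id text,
  customer_id text,
  order_date text,
  total_amount text
);

\copy orders_staging FROM '/absolute/path/orders.csv' CSV HEADER

-- Coerce types and drop duplicate ids on the way into the real table
INSERT INTO public.orders (order_id, customer_id, order_date, total_amount)
SELECT DISTINCT ON (order_id)
       order_id::uuid,
       customer_id::uuid,
       order_date::timestamptz,
       total_amount::numeric(12,2)
FROM orders_staging
ORDER BY order_id;

COMMIT;

-- From a separate session (Postgres 14+), watch the load progress
SELECT relid::regclass AS table_name, tuples_processed
FROM pg_stat_progress_copy;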
Because Galaxy is a Postgres-compatible SQL editor, you can point it at your Supabase connection string, paste the COPY ... FROM statement, and execute. Galaxy’s AI Copilot will even suggest the correct column list or generate validation queries post-import. Store the script in a Collection, endorse it, and share it with your team instead of pasting SQL into Slack.
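Post-import checks along these lines are typical; the specific columns checked here are illustrative assumptions, not queries Galaxy is guaranteed to generate.

-- Did every row make it in? Compare with the source file's line count minus the header.
SELECT count(*) AS loaded_rows FROM public.orders;

-- Did any dates get lost during parsing?
SELECT count(*) AS null_order_dates FROM public.orders WHERE order_date IS NULL;

-- Does the date range look plausible for this export?
SELECT min(order_date) AS earliest, max(order_date) AS latest FROM public.orders;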
Whether you prefer a point-and-click interface or pure SQL, Supabase offers a flexible toolchain for batch-loading CSV data. Start small, validate often, and automate once the process is repeatable. With clean data and proper tuning, you can move millions of rows in minutes.
Bulk importing flat-file data is a foundational task for data engineers: onboarding customer data, seeding analytics tables, or backfilling historical records. A smooth Supabase import workflow shortens time-to-insight, reduces operational risk, and keeps analytics teams unblocked.
How large a CSV can the dashboard importer handle?
About 50 MB. For anything larger, use the CLI, psql, or COPY in a SQL editor.
Can the import be automated in CI/CD?
Yes. The Supabase CLI can be executed in GitHub Actions or any CI runner to pull the CSV from storage and run the import script.
What if the CSV contains duplicate rows?
Load into a temporary table first, then INSERT ... ON CONFLICT into the target table, or use the ON CONFLICT clause directly if your table has unique constraints.
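A minimal sketch of that upsert, assuming the rows were first staged in a table called orders_staging with text columns (the staging name and the update rule are assumptions):

INSERT INTO public.orders (order_id, customer_id, order_date, total_amount)
SELECT order_id::uuid, customer_id::uuid, order_date::timestamptz, total_amount::numeric(12,2)
FROM orders_staging
ON CONFLICT (order_id) DO UPDATE
  SET total_amount = EXCLUDED.total_amount;  -- or DO NOTHING to keep existing rows untouched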
Can I run the whole import from Galaxy?
Absolutely. Connect Galaxy to your Supabase database URL, paste the COPY statement, and execute. Galaxy’s AI Copilot can even suggest column mappings and generate post-import validation queries.