Batch Uploading CSV Files into Supabase: A Complete Guide

Galaxy Glossary

How do I batch-upload CSV files into Supabase?

Batch-uploading CSV files into Supabase means importing large, comma-separated datasets into your Supabase Postgres database in one operation using tools like the dashboard, CLI, psql, or SQL COPY.

Description

Batch Uploading CSV Files into Supabase

Learn how to move data from flat files into your Supabase Postgres instance quickly and safely using the dashboard, CLI, psql, or SQL COPY—plus tips for avoiding common pitfalls.

What Is Batch Uploading of CSV Files?

Batch uploading is the process of importing an entire comma-separated values (CSV) file—sometimes millions of rows—into a database table in a single operation. Supabase, which exposes a fully managed Postgres instance, supports several pathways for bulk import, most of which use the PostgreSQL COPY command under the hood.

Why Does It Matter?

Modern data engineering pipelines frequently start with flat files exported from transactional systems, SaaS tools, or data providers. Getting that data into a relational database like Postgres enables efficient querying, joins, aggregation, and downstream analytics. A slow or error-prone import wastes engineering time and can stall projects reliant on fresh data.

Supported Upload Methods

Supabase gives you four main ways to ingest CSVs:

  1. Table Editor UI – drag-and-drop for smaller files (<50 MB).
  2. Supabase CLI + psql – scriptable and CI-friendly.
  3. psql client – direct Postgres access for power users.
  4. Pure SQL – batched INSERT statements you can run in any SQL editor (e.g., Galaxy).

Step-by-Step Instructions

1. Prepare Your CSV

  • Save in UTF-8 encoding.
  • Include a header row that exactly matches column names or prepare a mapping list.
  • Normalize time zones and date formats (ISO-8601 recommended).
  • Remove formula columns or stray delimiters.

2. Create or Verify the Target Table

Either design the schema manually or let Supabase infer it with the Table Editor. For large files, pre-creating the table avoids type-guessing surprises.

CREATE TABLE public.orders (
  order_id uuid PRIMARY KEY,
  customer_id uuid NOT NULL,
  order_date timestamptz,
  total_amount numeric(12,2)
);

3. Dashboard Drag-and-Drop (Quick <50 MB)

  1. Open your Supabase project → Table Editor.
  2. Select your table or click “New table.”
  3. Click “Import Data,” choose the CSV, and map columns.
  4. Wait for the import to finish; the import dialog reports progress.

Limitations: file size capped at 50 MB, no transactional guarantees, fewer tuning knobs.

4. Supabase CLI + psql Import (Automatable)

As of this writing the Supabase CLI has no dedicated CSV-import subcommand (supabase db dump exports data rather than importing it), so the usual scriptable pattern pairs the CLI for login and project linking with psql for the actual load:

# Install once: npm i -g supabase
supabase login
supabase link --project-ref YOUR_PROJECT_REF

# Run the import by streaming the CSV through the Postgres COPY protocol
psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" \
  -c "\copy public.orders FROM 'orders.csv' CSV HEADER"

Because everything is a shell command, the same script drops cleanly into CI/CD pipelines; see the next section for the options \copy accepts.

5. psql COPY (Maximum Control)

Connect directly to the database URL (available in Project Settings → Database). Then:

psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" -c \
"\copy public.orders FROM '/absolute/path/orders.csv' CSV HEADER"

\copy streams from your local machine to the server. Use standard COPY options like DELIMITER, NULL, QUOTE, or ENCODING.
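
For example, a hypothetical semicolon-delimited export saved in Latin-1 that writes NA for missing values could be loaded like this (the path, delimiter, and null marker are illustrative):

psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" -c \
"\copy public.orders FROM '/absolute/path/orders.csv' WITH (FORMAT csv, HEADER true, DELIMITER ';', NULL 'NA', ENCODING 'LATIN1')"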

6. Pure SQL (Works in Galaxy)

If you prefer a graphical SQL editor such as Galaxy, note that a server-side COPY ... FROM can only read files that live on the database host, which managed Supabase does not expose, and Postgres cannot COPY directly from an HTTP URL. The practical pure-SQL route is to generate batched multi-row INSERT statements from the CSV (with a small script or spreadsheet formula) and run them in the editor:

-- Values generated from orders.csv; the UUIDs below are placeholders
INSERT INTO public.orders (order_id, customer_id, order_date, total_amount) VALUES
('00000000-0000-0000-0000-000000000001', '00000000-0000-0000-0000-00000000000a', '2024-01-15T10:23:00Z', 49.99),
('00000000-0000-0000-0000-000000000002', '00000000-0000-0000-0000-00000000000a', '2024-01-15T11:02:00Z', 120.00);

This keeps the entire workflow in SQL, which helps with reproducibility and code review. For anything beyond a few thousand rows, fall back to psql \copy or the CLI pattern above.
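
Whichever method you use, a couple of quick post-import checks catch most problems; a sketch against the example orders table:

-- Row count should equal the CSV's data rows (file line count minus header)
SELECT count(*) FROM public.orders;

-- Spot obvious mapping or casting problems
SELECT count(*) FILTER (WHERE order_date IS NULL) AS null_dates,
       count(*) FILTER (WHERE total_amount < 0)   AS negative_totals
FROM public.orders;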

Performance & Reliability Tips

  • Disable triggers during large loads (ALTER TABLE ... DISABLE TRIGGER ALL;) and drop nonessential indexes beforehand, recreating both once the load finishes (Postgres has no way to merely disable an index).
  • Use TEMP tables followed by an INSERT ... SELECT for type coercion or deduplication (see the sketch after this list).
  • Wrap the COPY inside a transaction: BEGIN; COPY ...; COMMIT;
  • Split huge files into 100–500 MB chunks and run imports in parallel.
  • Increase statement_timeout in the session when copying millions of rows (SET statement_timeout = 0; lifts the cap).
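
A minimal sketch of that staging pattern, run as a psql script so \copy can read the local file; the staging table name, file path, and dedup key are assumptions:

-- Save as load_orders.sql and run: psql "$DB_URL" -f load_orders.sql
BEGIN;

-- Stage everything as text; type errors surface at the INSERT step,
-- where they are easier to inspect, instead of aborting the COPY
CREATE TEMP TABLE orders_staging (
  order_id text,
  customer_id text,
  order_date text,
  total_amount text
);

\copy orders_staging FROM 'orders.csv' CSV HEADER

-- Coerce types and drop in-file duplicates on the way into the real table
INSERT INTO public.orders (order_id, customer_id, order_date, total_amount)
SELECT DISTINCT ON (order_id)
       order_id::uuid,
       customer_id::uuid,
       order_date::timestamptz,
       total_amount::numeric(12,2)
FROM orders_staging
ORDER BY order_id;

COMMIT;

The BEGIN/COMMIT wrapper also demonstrates the transaction tip above: if any cast fails, the whole load rolls back cleanly.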

Common Pitfalls

  • Mismatched headers. COPY maps CSV columns to table columns by position, not by name (HEADER true merely skips the first row), so match the file's column order to the table or list the columns explicitly, as shown below.
  • Bad date formats. Ambiguous formats (e.g., MM/DD/YY) are interpreted according to the session's DateStyle, so they can load as the wrong date or abort the COPY with an error. Normalize to ISO-8601 upfront.
  • Escaped quotes. Postgres's CSV mode already handles Excel-style doubled quotes (the defaults are QUOTE '"' and ESCAPE '"'), but stray unescaped quote characters still break the parse, so pre-clean those.
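
For example, if a hypothetical export puts customer_id first, pinning the column list keeps COPY from loading values into the wrong columns:

\copy public.orders (customer_id, order_id, order_date, total_amount) FROM 'orders.csv' WITH (FORMAT csv, HEADER true)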

Best Practices Checklist

  1. Create the table first with correct data types.
  2. Run a small sample import to validate.
  3. Automate via Supabase CLI or CI pipeline.
  4. Monitor long-running loads with pg_stat_progress_copy (requires Postgres 14+; see the query below).
  5. Version-control your COPY scripts alongside schema migrations.
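
Item 4's progress view can be queried mid-load from a second session; a minimal sketch (all column names come from the built-in pg_stat_progress_copy view):

-- One row per COPY currently running in this database
SELECT c.relname AS table_name,
       p.command,
       p.bytes_processed,
       p.bytes_total,
       p.tuples_processed
FROM pg_stat_progress_copy AS p
JOIN pg_class AS c ON c.oid = p.relid;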

Galaxy Workflow (Optional)

Because Galaxy is a Postgres-compatible SQL editor, you can point it at your Supabase connection string and keep the SQL side of the workflow there: table creation, the batched INSERT loads from the Pure SQL section, and post-import validation. Galaxy’s AI Copilot will even suggest the correct column list or generate validation queries post-import. Store the script in a Collection, endorse it, and share it with your team—no more Slack paste.

Conclusion

Whether you prefer a point-and-click interface or pure SQL, Supabase offers a flexible toolchain for batch-loading CSV data. Start small, validate often, and automate once the process is repeatable. With clean data and proper tuning, you can move millions of rows in minutes.

Why Batch Uploading CSV Files into Supabase Is Important

Bulk importing flat-file data is a foundational task for data engineers: onboarding customer data, seeding analytics tables, or backfilling historical records. A smooth Supabase import workflow shortens time-to-insight, reduces operational risk, and keeps analytics teams unblocked.

Example Usage


psql "postgres://user:pass@db.YOUR_PROJECT.supabase.co:5432/postgres" -c "\copy public.orders FROM 'orders.csv' CSV HEADER"

Frequently Asked Questions (FAQs)

What is the maximum file size I can upload through the Supabase UI?

About 50 MB. For anything larger, use psql (directly or scripted with the CLI) or batched INSERTs in a SQL editor.

Can I automate imports in a CI pipeline?

Yes. Script the Supabase CLI and psql in GitHub Actions or any CI runner to pull the CSV from storage and run the import.

How do I handle duplicate rows?

Load into a temporary table first, then INSERT ... ON CONFLICT into the target table, or use the ON CONFLICT clause directly if your table has unique constraints.
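
A minimal sketch of the staging-plus-upsert route, assuming order_id carries a unique constraint and reusing the orders_staging table from the performance tips:

-- Dedupe within the batch first: ON CONFLICT DO UPDATE cannot touch
-- the same target row twice in one statement
INSERT INTO public.orders (order_id, customer_id, order_date, total_amount)
SELECT DISTINCT ON (order_id)
       order_id::uuid,
       customer_id::uuid,
       order_date::timestamptz,
       total_amount::numeric(12,2)
FROM orders_staging
ORDER BY order_id
ON CONFLICT (order_id) DO UPDATE
SET customer_id  = EXCLUDED.customer_id,
    order_date   = EXCLUDED.order_date,
    total_amount = EXCLUDED.total_amount;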

Can I use Galaxy to run COPY commands against Supabase?

You can. Connect Galaxy to your Supabase database URL and run any SQL, including the batched INSERT pattern from the Pure SQL section; server-side COPY cannot read files on your local machine, so file-based loads still go through psql's \copy. Galaxy’s AI Copilot can even suggest column mappings and generate post-import validation queries.
