How to back up BigQuery tables to Cloud Storage

Galaxy Glossary

How do I back up BigQuery tables to Cloud Storage?

“Backing up” BigQuery means exporting tables or views to durable Cloud Storage so they can be re-imported if data is lost or corrupted.


Description

Why back up BigQuery data?

Backups protect against accidental deletes, failed updates, and schema changes. Keeping recent exports in Cloud Storage lets you reload data quickly, even after BigQuery's time-travel window (at most seven days) has passed.

Which BigQuery command creates a backup?

Use EXPORT DATA. It writes the results of any SQL query—often SELECT * FROM dataset.table—to files in Cloud Storage.

How do I export a single ecommerce table?

Run EXPORT DATA with a URI that ends in a wildcard. BigQuery shards the output across multiple files, and any export larger than 1 GB requires a wildcard URI.
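A minimal single-table export might look like the following (the `prod_dataset.Orders` table and `gs://my_ecommerce_backups` bucket are the example names used throughout this article):

```sql
-- Export one table to Parquet; the wildcard (*) lets BigQuery shard
-- the output into multiple files when the export exceeds 1 GB.
EXPORT DATA OPTIONS (
  uri = 'gs://my_ecommerce_backups/Orders/Orders_*.parquet',
  format = 'PARQUET',
  overwrite = true
) AS
SELECT * FROM `prod_dataset.Orders`;
```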

Syntax for EXPORT DATA

See the syntax block below for every option, including format, compression, and header control.

How can I back up several tables in one job?

Write a BigQuery script that loops over table names and issues one EXPORT DATA statement per table, or orchestrate the statements with a tool such as Dataform.
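One way to sketch that loop in BigQuery scripting is to read table names from INFORMATION_SCHEMA and build each statement with EXECUTE IMMEDIATE (needed because EXPORT DATA option values must be constants; the dataset and bucket names are assumptions):

```sql
-- Loop over every base table in the dataset and export each one.
FOR tbl IN (
  SELECT table_name
  FROM `prod_dataset.INFORMATION_SCHEMA.TABLES`
  WHERE table_type = 'BASE TABLE'
)
DO
  EXECUTE IMMEDIATE FORMAT("""
    EXPORT DATA OPTIONS (
      uri = 'gs://my_ecommerce_backups/%s/%s_*.parquet',
      format = 'PARQUET',
      overwrite = true
    ) AS
    SELECT * FROM `prod_dataset.%s`
  """, tbl.table_name, tbl.table_name, tbl.table_name);
END FOR;
```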

Can I automate backups?

Yes. Run each EXPORT DATA statement on a schedule, for example as a BigQuery scheduled query, or from a Cloud Workflows or Cloud Scheduler + Cloud Build job that runs daily. Use date macros like FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP()) in the URI path so each day's snapshot lands in its own folder.

How do I restore a backup?

Load the files back with bq load or the SQL LOAD DATA command. Match the destination schema to avoid type coercion.
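For example, a Parquet backup can be restored with the SQL LOAD DATA statement; Parquet carries its own schema, so types are preserved (the destination table name here is illustrative):

```sql
-- Restore the exported Parquet files into a new table.
LOAD DATA INTO `prod_dataset.Orders_restored`
FROM FILES (
  format = 'PARQUET',
  uris = ['gs://my_ecommerce_backups/Orders/Orders_*.parquet']
);
```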

Best practices for reliable backups

• Use Avro or Parquet to preserve column types
• Keep backups in a versioned bucket
• Enable lifecycle rules to delete objects older than 30 days
• Validate row counts after every export
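One way to validate row counts is to point an external table at the exported files and compare it with the source (the external table name is hypothetical and dropped afterwards):

```sql
-- Read the backup files back through an external table.
CREATE OR REPLACE EXTERNAL TABLE `prod_dataset.orders_backup_check`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://my_ecommerce_backups/Orders/Orders_*.parquet']
);

-- The two counts should match if the export completed cleanly.
SELECT
  (SELECT COUNT(*) FROM `prod_dataset.Orders`)              AS source_rows,
  (SELECT COUNT(*) FROM `prod_dataset.orders_backup_check`) AS backup_rows;

DROP EXTERNAL TABLE `prod_dataset.orders_backup_check`;
```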

What mistakes should I avoid?

See the mistakes section after the syntax block.

Why backing up BigQuery tables is important

Example Usage


-- Daily incremental backup of new orders.
-- EXPORT DATA option values must be constant expressions, so build the
-- dated URI first and splice it in with EXECUTE IMMEDIATE.
DECLARE backup_uri STRING DEFAULT CONCAT(
  'gs://my_ecommerce_backups/Orders/date=',
  FORMAT_DATE('%Y-%m-%d', CURRENT_DATE()),
  '/Orders_*.parquet');

EXECUTE IMMEDIATE FORMAT("""
  EXPORT DATA OPTIONS (
    uri = '%s',
    format = 'PARQUET',
    overwrite = false
  ) AS
  SELECT *
  FROM `prod_dataset.Orders`
  WHERE order_date = CURRENT_DATE()
""", backup_uri);

EXPORT DATA Syntax


EXPORT DATA
OPTIONS (
    uri = 'gs://my_ecommerce_backups/Customers/Customers_*.parquet', -- destination path
    format = 'PARQUET',        -- PARQUET, CSV, NEWLINE_DELIMITED_JSON, AVRO
    overwrite = true,          -- replace if file exists
    header = true,             -- add header row (CSV only)
    field_delimiter = ',',     -- CSV delimiter
    compression = 'GZIP',      -- GZIP, NONE, SNAPPY (Parquet)
    use_avro_logical_types = true -- AVRO only: keep TIMESTAMP & DATE types
) AS
SELECT * FROM `prod_dataset.Customers`;

-- Back up multiple tables
EXPORT DATA OPTIONS (
    uri='gs://my_ecommerce_backups/Orders/Orders_*.csv',
    format='CSV',
    overwrite=true,
    header=true,
    compression='GZIP'
) AS
SELECT * FROM `prod_dataset.Orders`;

Common Mistakes

Frequently Asked Questions (FAQs)

Is EXPORT DATA free?

BigQuery charges only for the bytes processed by the SELECT part. The export operation itself is free, though standard Cloud Storage charges apply to the exported files.

Can I export views?

Yes. EXPORT DATA runs an ordinary query, so you can export the results of any view the job has permission to read.

What is the maximum export size?

A single file is capped at 1 GB unless the URI contains a wildcard, and export jobs are limited to roughly 50 TB per project per day. Use wildcard URIs to stay within the per-file limit on large tables.
