“Backing up” BigQuery means exporting tables or views to durable Cloud Storage so they can be re-imported if data is lost or corrupted.
Backups protect against accidental deletions, failed updates, and unwanted schema changes. Keeping recent exports in Cloud Storage lets you reload data quickly without relying on costly snapshots.
Use EXPORT DATA. It writes the results of any SQL query (often SELECT * FROM dataset.table) to files in Cloud Storage.
Run EXPORT DATA with a URI that ends in a wildcard (*). BigQuery shards the output across multiple files when the result exceeds 1 GB.
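As a minimal sketch (mydataset.sales and gs://my-backup-bucket are hypothetical names):

    EXPORT DATA OPTIONS (
      uri = 'gs://my-backup-bucket/sales/part-*.parquet',  -- trailing wildcard lets BigQuery split large output
      format = 'PARQUET',
      overwrite = true
    ) AS
    SELECT * FROM mydataset.sales;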
See the syntax block below for every option, including format, compression, and header control.
Create a script that loops over table names and issues a separate EXPORT DATA statement for each, or wrap the statements in a multi-statement script (for example, in Dataform).
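One way to do this entirely in SQL is a BigQuery script that reads table names from INFORMATION_SCHEMA and builds one EXPORT DATA statement per table; mydataset and gs://my-backup-bucket are hypothetical names:

    -- Export every base table in a dataset to its own Cloud Storage prefix.
    FOR t IN (
      SELECT table_name
      FROM mydataset.INFORMATION_SCHEMA.TABLES
      WHERE table_type = 'BASE TABLE'
    )
    DO
      EXECUTE IMMEDIATE FORMAT("""
        EXPORT DATA OPTIONS (
          uri = 'gs://my-backup-bucket/%s/part-*.parquet',
          format = 'PARQUET',
          overwrite = true
        ) AS
        SELECT * FROM mydataset.%s
      """, t.table_name, t.table_name);
    END FOR;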
Yes. Store each EXPORT DATA statement in a Cloud Workflows or Cloud Scheduler + Cloud Build job that runs daily, and use date expressions such as FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP()) in the URI path.
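A sketch of the daily statement itself, built with EXECUTE IMMEDIATE so the date lands in the path (all names are hypothetical):

    -- Date-stamped daily export; the scheduled job only needs to run this script.
    DECLARE run_date STRING DEFAULT FORMAT_TIMESTAMP('%Y%m%d', CURRENT_TIMESTAMP());

    EXECUTE IMMEDIATE FORMAT("""
      EXPORT DATA OPTIONS (
        uri = 'gs://my-backup-bucket/daily/%s/sales-*.parquet',
        format = 'PARQUET',
        overwrite = true
      ) AS
      SELECT * FROM mydataset.sales
    """, run_date);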
Load the files back with bq load or the SQL LOAD DATA statement. Match the destination schema to the exported files to avoid type coercion.
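A minimal LOAD DATA sketch for restoring from the hypothetical backup path above:

    -- Restore exported Parquet files into a new or existing table.
    LOAD DATA INTO mydataset.sales_restored
    FROM FILES (
      format = 'PARQUET',
      uris = ['gs://my-backup-bucket/sales/part-*.parquet']
    );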
• Use Avro or Parquet to preserve types
• Keep backups in a versioned bucket
• Enable lifecycle rules to delete objects older than 30 days
• Validate row counts after every export (a validation sketch follows this list)
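One way to validate row counts is to point a throwaway external table at the exported files and compare counts; all names here are hypothetical:

    -- Compare the source table with the files the export produced.
    CREATE OR REPLACE EXTERNAL TABLE mydataset.sales_backup_check
    OPTIONS (
      format = 'PARQUET',
      uris = ['gs://my-backup-bucket/sales/part-*.parquet']
    );

    SELECT
      (SELECT COUNT(*) FROM mydataset.sales)              AS source_rows,
      (SELECT COUNT(*) FROM mydataset.sales_backup_check) AS exported_rows;

    DROP EXTERNAL TABLE mydataset.sales_backup_check;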
See the mistakes section after the syntax block.
BigQuery charges only for the bytes processed by the SELECT part of the statement; the export itself adds no BigQuery cost, though standard Cloud Storage charges apply to the files it writes.
Yes, but you must set restrict_to_read_public_data=false if the view references external or public datasets.
Each export job can write up to 50 TB per destination URI pattern. Use wildcard URIs so large results are split across multiple files and stay within per-file limits.