How to Schedule Queries in BigQuery

Galaxy Glossary

How do I schedule recurring SQL queries in BigQuery?

BigQuery scheduled queries run a saved SQL statement automatically at a defined interval and store the results in a target table.


Description

What is a scheduled query in BigQuery?

A scheduled query is a saved SQL statement executed automatically on a recurring timetable by BigQuery Data Transfer Service. Results are written to a destination table without manual intervention.

Why should you schedule queries?

Automate ETL, maintain aggregate tables, and refresh dashboards. Scheduling reduces manual work, guarantees fresh data, and limits human error.

How do I create a scheduled query with the bq CLI?

Use bq mk --transfer_config with the scheduled_query data source. Provide the SQL, destination table template, schedule string, and dataset.
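
A minimal sketch of the command (the display name, query, and dataset are placeholders; a fuller example appears in the Example Usage section below):

bq mk --transfer_config \
  --display_name="my_scheduled_query" \
  --data_source=scheduled_query \
  --target_dataset=my_dataset \
  --schedule="every 24 hours" \
  --params='{"query": "SELECT 1 AS ok", "destination_table_name_template": "my_table", "write_disposition": "WRITE_APPEND"}'

On success, bq should print the resource name of the new transfer configuration (projects/.../transferConfigs/...), which you can keep for later edits or deletion.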

Which parameters are required?

--data_source=scheduled_query tells BigQuery to treat the job as a scheduled query. --params supplies JSON containing query, destination_table_name_template, and optional write_disposition. --schedule defines the frequency. --target_dataset selects where results land.
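
The JSON payload passed to --params looks like this (the query and table name are illustrative):

--params='{
  "query": "SELECT 1 AS ok",
  "destination_table_name_template": "results_{run_date}",
  "write_disposition": "WRITE_TRUNCATE"
}'

write_disposition accepts WRITE_TRUNCATE (replace the destination each run) or WRITE_APPEND (add rows each run), and a {run_date} placeholder in the template expands to the run's date so every execution writes to its own dated table.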

Can I schedule through the Cloud Console?

Yes. Write your SQL in the query editor, click Schedule, pick frequency, destination, and notification options, then save. BigQuery generates an equivalent transfer configuration behind the scenes.
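
Schedules created in the console are stored as the same transfer configurations, so you can verify one from the CLI (the location value is an assumption; use your dataset's region):

bq ls --transfer_config --transfer_location=us

The listing should show each configuration's display name, data source, and full resource name.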

How do I edit or delete a schedule?

Navigate to BigQuery ➜ Transfers. Select the transfer, then choose Edit to change SQL, frequency, or destination. Click Delete to stop the schedule permanently.
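
The same maintenance works from the CLI once you know the configuration's resource name (the project, location, and ID below are placeholders):

# Change how often the scheduled query runs
bq update --transfer_config \
  --schedule="every day 04:00" \
  projects/my-project/locations/us/transferConfigs/1234abcd

# Stop the schedule permanently
bq rm --transfer_config \
  projects/my-project/locations/us/transferConfigs/1234abcd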

Best practices for scheduled queries

Use partitioned or clustered destination tables to control cost. Keep the scheduled SQL deterministic: non-deterministic functions such as CURRENT_TIMESTAMP() make reruns and backfills hard to reproduce, so filter on the run date instead. Grant minimum IAM roles (for example, BigQuery Data Editor on the target dataset) to the service account that runs the transfer.
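
For example, pre-creating the destination as a date-partitioned, clustered table keeps each run's scan and storage costs predictable (the dataset, table, and column names match the example below; the column types are assumptions):

CREATE TABLE IF NOT EXISTS shop_analytics.customer_spend_daily (
  customer_id INT64,
  order_day DATE,
  spend NUMERIC
)
PARTITION BY order_day
CLUSTER BY customer_id;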

Common mistakes to avoid

Missing destination template: Omitting destination_table_name_template causes the schedule to fail. Always specify the table name or pattern.

Wrong schedule syntax: The schedule string must follow the Data Transfer Service format, such as "every 24 hours" or "every day 02:00". Typos cause the transfer configuration to be rejected at creation time. Valid strings are illustrated just below.
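
A few schedule strings in the accepted format, plus a dated destination template (shown only as illustrations of the syntax):

--schedule="every 24 hours"
--schedule="every day 02:00"
--schedule="every monday 09:00"

"destination_table_name_template": "daily_revenue_{run_date}"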

Key takeaways

Scheduled queries keep data fresh, lower toil, and are easy to manage through either the UI or CLI. Follow best practices and validate parameters to avoid failures.


How to Schedule Queries in BigQuery Example Usage


# Daily customer spend per customer
bq mk --transfer_config \
  --display_name="daily_customer_spend" \
  --data_source=scheduled_query \
  --params='{
    "query": "SELECT customer_id, DATE(order_date) AS order_day, SUM(total_amount) AS spend FROM `shop.Orders` GROUP BY customer_id, order_day", 
    "destination_table_name_template": "customer_spend_daily",
    "write_disposition": "WRITE_TRUNCATE"
  }' \
  --schedule="every day 03:00" \
  --target_dataset=shop_analytics

How to Schedule Queries in BigQuery Syntax


bq mk --transfer_config \
  --display_name="<friendly_name>" \
  --data_source=scheduled_query \
  --params='{
    "query": "<SQL_statement>",
    "destination_table_name_template": "<table>_{run_date}",
    "write_disposition": "WRITE_TRUNCATE"
  }' \
  --schedule="every <interval>" \
  --target_dataset=<dataset>

Example using ecommerce tables:

bq mk --transfer_config \
  --display_name="daily_revenue" \
  --data_source=scheduled_query \
  --params='{
    "query": "SELECT DATE(order_date) AS order_day, SUM(total_amount) AS day_revenue FROM `shop.Orders` GROUP BY order_day", 
    "destination_table_name_template": "daily_revenue",
    "write_disposition": "WRITE_TRUNCATE"
  }' \
  --schedule="every 24 hours" \
  --target_dataset=shop_analytics


Frequently Asked Questions (FAQs)

Can I use parameters in a scheduled query?

No. Scheduled queries do not support custom query parameters. They do provide the built-in @run_date and @run_time parameters inside the SQL, and {run_date}/{run_time} placeholders in the destination table template, which cover most per-run logic.
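
For example, a scheduled query can filter on the built-in run date instead of a custom parameter (table and column names are the ones used in the examples above):

SELECT customer_id, SUM(total_amount) AS spend
FROM `shop.Orders`
WHERE DATE(order_date) = @run_date
GROUP BY customer_id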

How are scheduled query jobs billed?

Each execution is billed like a normal query based on bytes processed. Storage for the destination table is billed separately.
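
To gauge the per-run cost before scheduling, a dry run reports bytes processed without executing anything (the query is the daily_revenue example from above):

bq query --use_legacy_sql=false --dry_run \
  'SELECT DATE(order_date) AS order_day, SUM(total_amount) AS day_revenue FROM `shop.Orders` GROUP BY order_day'

The output states how many bytes the query would process, which you can multiply by the run frequency to estimate monthly cost.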

What IAM role is required to create schedules?

The user needs the roles/bigquery.admin role, or roles/bigquery.user together with the bigquery.transfers.update permission on the project, plus write access to the target dataset.
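
If the broader role is acceptable, it can be granted with gcloud (the project ID and member are placeholders):

gcloud projects add-iam-policy-binding my-project \
  --member="user:analyst@example.com" \
  --role="roles/bigquery.admin"

In most teams the narrower combination above is preferable to project-wide admin.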
