BigQuery scheduled queries run a saved SQL statement automatically on a recurring schedule and store the results in a destination table. The BigQuery Data Transfer Service executes the statement at the defined interval and writes the output without manual intervention.
Scheduled queries automate ETL, maintain aggregate tables, and refresh dashboards. Scheduling reduces manual work, keeps data fresh, and limits human error.
Use `bq mk --transfer_config` with the `scheduled_query` data source. Provide the SQL, the destination table template, the schedule string, and the target dataset. `--data_source=scheduled_query` tells BigQuery to treat the job as a scheduled query; `--params` supplies JSON containing `query`, `destination_table_name_template`, and an optional `write_disposition`; `--schedule` defines the frequency; and `--target_dataset` selects where results land.
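Putting the flags together, a minimal sketch of a complete command; the project, dataset, display name, and table names below are hypothetical placeholders:

```bash
# Create a scheduled query that refreshes a daily aggregate table.
# "my-project", "reporting", and "daily_sales_{run_date}" are
# placeholders; substitute your own project, dataset, and template.
bq mk \
  --transfer_config \
  --project_id=my-project \
  --target_dataset=reporting \
  --display_name="Daily sales rollup" \
  --data_source=scheduled_query \
  --schedule="every 24 hours" \
  --params='{
    "query": "SELECT order_date, SUM(amount) AS total FROM `my-project.sales.orders` GROUP BY order_date",
    "destination_table_name_template": "daily_sales_{run_date}",
    "write_disposition": "WRITE_TRUNCATE"
  }'
```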
Yes. Write your SQL in the query editor, click Schedule, pick frequency, destination, and notification options, then save. BigQuery generates an equivalent transfer configuration behind the scenes.
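You can confirm what the console generated from the CLI; a quick check, assuming your transfer configurations live in the `us` location:

```bash
# Scheduled queries created in the console appear as transfer configs.
bq ls --transfer_config \
  --transfer_location=us \
  --project_id=my-project
```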
Navigate to BigQuery ➜ Transfers. Select the transfer, then choose Edit to change SQL, frequency, or destination. Click Delete to stop the schedule permanently.
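The same edits can be made from the CLI; a sketch, where the resource name is a made-up placeholder (copy the real one from `bq ls --transfer_config`):

```bash
# Change the schedule of an existing scheduled query.
bq update --transfer_config \
  --schedule="every day 02:00" \
  projects/123456/locations/us/transferConfigs/abcd-1234

# Stop the schedule permanently by deleting the transfer config.
bq rm --transfer_config \
  projects/123456/locations/us/transferConfigs/abcd-1234
```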
Use partitioned or clustered destination tables to control cost. Keep scheduled SQL deterministic: non-deterministic functions such as `CURRENT_TIMESTAMP()` (`NOW()` in legacy SQL) produce different results on every run unless the query filters to a fixed window. Grant the transfer service account only the minimum IAM roles it needs, such as BigQuery Data Editor on the target dataset.
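For example, a date-partitioned, clustered destination table keeps each run's scan and rewrite narrow; a sketch using a hypothetical `reporting.daily_sales` table:

```bash
# Create the destination table partitioned by date and clustered by
# region, so scheduled runs scan only the partitions they touch.
bq query --use_legacy_sql=false '
CREATE TABLE IF NOT EXISTS reporting.daily_sales (
  order_date DATE,
  region     STRING,
  total      NUMERIC
)
PARTITION BY order_date
CLUSTER BY region'
```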
Missing destination template: omitting `destination_table_name_template` causes the schedule to fail. Always specify the table name or pattern.
Wrong schedule syntax: the format must be `every 24 hours` or `every day 02:00`. Typos break creation.
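For reference, a few schedule strings in the Data Transfer Service format; the weekday and quarterly variants are assumptions based on the service's App Engine-style cron syntax:

```bash
# Accepted --schedule formats (times are interpreted as UTC):
#   "every 24 hours"                  fixed interval
#   "every day 02:00"                 daily at a set time
#   "every mon,wed 09:00"             specific weekdays
#   "first sunday of quarter 00:00"   calendar-based
```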
Scheduled queries keep data fresh, lower toil, and are easy to manage through either the UI or CLI. Follow best practices and validate parameters to avoid failures.
No. Scheduled queries do not yet support custom query parameters. Use the built-in runtime placeholders instead, such as `@run_date` in the query and `{run_date}` in the destination table template.
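A minimal sketch of a scheduled query that uses the runtime placeholders, assuming a hypothetical `sales.orders` source table:

```bash
# @run_date resolves to the execution date inside the SQL;
# {run_date} in the template yields a date-suffixed table per run.
bq mk \
  --transfer_config \
  --target_dataset=reporting \
  --display_name="Per-day orders snapshot" \
  --data_source=scheduled_query \
  --schedule="every day 02:00" \
  --params='{
    "query": "SELECT * FROM sales.orders WHERE order_date = @run_date",
    "destination_table_name_template": "orders_{run_date}"
  }'
```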
Each execution is billed like a normal query based on bytes processed. Storage for the destination table is billed separately.
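To estimate a run's cost before scheduling it, a dry run reports the bytes the query would process without executing it:

```bash
# --dry_run validates the SQL and reports bytes processed,
# without running the query or incurring query charges.
bq query --use_legacy_sql=false --dry_run \
  'SELECT order_date, SUM(amount) AS total FROM sales.orders GROUP BY order_date'
```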
The user needs the `bigquery.admin` role, or a combination of `bigquery.user` plus the `bigquery.transfers.update` permission, on the project and dataset.
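A sketch of granting the broader role at project level; the project ID and user are placeholders:

```bash
# BigQuery Admin includes the bigquery.transfers.update permission.
gcloud projects add-iam-policy-binding my-project \
  --member="user:analyst@example.com" \
  --role="roles/bigquery.admin"
```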