This section shows how to authenticate with Google Cloud and run BigQuery SQL from the command line or a graphical SQL editor.
Run `gcloud auth login` and follow the browser prompt. For CI/CD, prefer `gcloud auth activate-service-account --key-file=/path/key.json`. Both commands store credentials that the `bq` CLI automatically reuses.
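Both flows in shell form (the key-file path is a placeholder):

```bash
# Interactive login: opens a browser window for consent
gcloud auth login

# Non-interactive login for CI/CD, using a service-account key file
gcloud auth activate-service-account --key-file=/path/key.json

# Verify which account is now active
gcloud auth list
```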
Run `gcloud config set project my-gcp-project`. A default project prevents "No project specified" errors in subsequent `bq` commands.
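For example (the project ID is a placeholder):

```bash
# Set the default project used by both gcloud and bq
gcloud config set project my-gcp-project

# Confirm the active project
gcloud config get-value project
```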
Use `bq query` followed by a quoted statement, for example ``bq query --use_legacy_sql=false 'SELECT * FROM `shop.Orders` LIMIT 10'``. Add `--format=prettyjson` to return JSON and `--location=US` if your dataset is region-specific.
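Putting those flags together, assuming the same shop dataset lives in the US multi-region:

```bash
# Run a standard-SQL query, return JSON, and pin the dataset location
bq query \
  --use_legacy_sql=false \
  --format=prettyjson \
  --location=US \
  'SELECT * FROM `shop.Orders` LIMIT 10'
```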
In Galaxy, click "Add Connection → BigQuery". Upload your service-account JSON, then choose the default project and billing location. Galaxy auto-discovers datasets and provides IntelliSense for tables such as `Customers` and `Orders`.
Create a PostgreSQL foreign data wrapper using Cloud SQL federation or the Simba ODBC driver. Map external BigQuery tables, then query them just like local tables, e.g. `SELECT * FROM bigquery.orders LIMIT 10;` (see the sketch below).
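As a rough sketch, the PostgreSQL side follows the standard foreign-data-wrapper DDL; the extension name, option names, and column types below are illustrative and depend on the driver you install:

```sql
-- Hypothetical extension and option names; adjust for the wrapper you install
CREATE EXTENSION bigquery_fdw;

CREATE SERVER bigquery_srv
  FOREIGN DATA WRAPPER bigquery_fdw
  OPTIONS (project 'my-gcp-project', dataset 'shop');

-- Credentials option name is illustrative
CREATE USER MAPPING FOR CURRENT_USER
  SERVER bigquery_srv
  OPTIONS (key_file '/path/key.json');

-- Map one BigQuery table into a local schema; column types are illustrative
CREATE SCHEMA bigquery;
CREATE FOREIGN TABLE bigquery.orders (
  order_id    BIGINT,
  customer_id BIGINT,
  total       NUMERIC
) SERVER bigquery_srv;

-- Query it like a local table
SELECT * FROM bigquery.orders LIMIT 10;
```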
Grant the service account only `roles/bigquery.user`, and rotate its key at least every 90 days. Store the key in a secrets manager, never in source control.
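For example, binding that role with gcloud (the project ID and service-account email are placeholders):

```bash
# Grant only the BigQuery user role to the service account
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:sql-runner@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.user"
```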
Run ``bq query --format=csv --use_legacy_sql=false 'SELECT * FROM `shop.Products`' > products.csv``. The shell redirection writes products.csv to your current working directory; raise `--max_rows` if you need more than the default number of rows returned.
No. The `bq` CLI ships with the Google Cloud SDK, so installing gcloud installs bq as well; run `gcloud components update` to keep both current.
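To check or update the bundled version:

```bash
# bq is installed alongside gcloud; print its version
bq version

# Update all Cloud SDK components, including bq
gcloud components update
```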
Yes. Set `GOOGLE_APPLICATION_CREDENTIALS=/path/key.json` to make both the bq CLI and most GUIs pick up the service-account file without a manual upload.
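A typical shell setup, assuming the key lives at /path/key.json:

```bash
# Point CLIs and client libraries at the service-account key
export GOOGLE_APPLICATION_CREDENTIALS=/path/key.json

# Subsequent queries use the service account's credentials
bq query --use_legacy_sql=false 'SELECT 1'
```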
Add `--maximum_bytes_billed=1000000000` (≈1 GB) to prevent accidental large scans. The query fails if it would exceed the limit.
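For example, capping a query at roughly 1 GB of billed bytes:

```bash
# The query errors out instead of scanning more than ~1 GB
bq query \
  --use_legacy_sql=false \
  --maximum_bytes_billed=1000000000 \
  'SELECT * FROM `shop.Orders`'
```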