Auditing access in BigQuery lets you query INFORMATION_SCHEMA views or Cloud Audit Logs to see who read or modified your data.
Auditing answers who accessed the Customers, Orders, Products, or OrderItems tables, when they did it, and what statement they ran. Security teams use these answers for compliance, incident response, and cost control.
BigQuery writes job-level metadata to region-*.INFORMATION_SCHEMA.JOBS_BY_PROJECT (180-day retention) and detailed DATA_READ / DATA_WRITE logs to Cloud Logging (bigquery.googleapis.com%2Fdata_access). Both are queryable with SQL.
Filter by creation_time, user_email, and referenced_tables to list recent reads on the Orders table.
SELECT
user_email,
referenced_tables,
statement_type,
creation_time
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND EXISTS (
    -- referenced_tables is an ARRAY<STRUCT<project_id, dataset_id, table_id>>,
    -- so match on the struct field rather than string-matching the array
    SELECT 1 FROM UNNEST(referenced_tables) AS t
    WHERE t.table_id = 'Orders'
  )
ORDER BY creation_time DESC;
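The same view also answers aggregate questions, such as who reads Orders most often. A minimal sketch, assuming the same region-us location and table name as above:

SELECT
  user_email,
  COUNT(*) AS read_jobs,
  MAX(creation_time) AS last_read
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND statement_type = 'SELECT'
  AND EXISTS (
    SELECT 1 FROM UNNEST(referenced_tables) AS t
    WHERE t.table_id = 'Orders'
  )
GROUP BY user_email
ORDER BY read_jobs DESC;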
Use the exported log table to inspect deeper history (up to the retention you configure). A log sink to BigQuery writes data-access entries to a table named cloudaudit_googleapis_com_data_access in the destination dataset, with the serviceData payload exported as servicedata_v1_bigquery.
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query_text,
  timestamp
FROM `my-project.logs.cloudaudit_googleapis_com_data_access`
WHERE logName = 'projects/my-project/logs/bigquery.googleapis.com%2Fdata_access'
  AND EXISTS (
    -- referencedTables is a repeated record, so filter via UNNEST
    SELECT 1
    FROM UNNEST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.referencedTables) AS t
    WHERE t.tableId = 'Products'
  )
  AND timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
ORDER BY timestamp DESC;
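For incident response, the same exported table can surface unexpected principals. A sketch that counts Products reads by accounts outside your domain; example.com is a placeholder, and the table name assumes the standard sink naming used above:

SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  COUNT(*) AS read_events,
  MAX(timestamp) AS last_seen
FROM `my-project.logs.cloudaudit_googleapis_com_data_access`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  -- placeholder domain: replace with your organization's domain
  AND protopayload_auditlog.authenticationInfo.principalEmail NOT LIKE '%@example.com'
  AND EXISTS (
    SELECT 1
    FROM UNNEST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.referencedTables) AS t
    WHERE t.tableId = 'Products'
  )
GROUP BY user_email
ORDER BY read_events DESC;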
Focus on user_email, statement_type, query_text, creation_time, and referenced_tables. Together they tell you the who, what, and when.
Enable DATA_READ/WRITE logs, centralize them in a dedicated project, retain them for ≥400 days, and schedule queries that flag unexpected reads of the Customers.email column. Send alerts through Pub/Sub or Slack.
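One way to implement that scheduled check: because the logs only carry column detail when column-level security is on (see below), this sketch falls back to matching 'email' in the query text of statements that touched Customers. The match heuristic and the one-day window are assumptions to tune:

SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query_text,
  timestamp
FROM `my-project.logs.cloudaudit_googleapis_com_data_access`
WHERE timestamp >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
  AND EXISTS (
    SELECT 1
    FROM UNNEST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.referencedTables) AS t
    WHERE t.tableId = 'Customers'
  )
  -- heuristic: flag any statement whose text mentions the email column
  AND LOWER(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query) LIKE '%email%'
ORDER BY timestamp DESC;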
Without exporting data_access logs you only get the default 30-day retention, making historical audits impossible. Enable a sink to BigQuery or GCS early.
Omitting a timestamp/creation_time predicate forces BigQuery to scan months of logs, driving up costs. Always filter by a reasonable interval first and widen only when needed.
You pay standard BigQuery charges for the bytes your audit queries scan. Storing logs in a separate project helps track and predict this spend.
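JOBS_BY_PROJECT can quantify that spend itself. A sketch that sums billed bytes for queries touching the logs dataset over the last 30 days; the dataset name is the placeholder used above:

SELECT
  user_email,
  ROUND(SUM(total_bytes_billed) / POW(1024, 4), 3) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
  AND EXISTS (
    SELECT 1 FROM UNNEST(referenced_tables) AS t
    WHERE t.dataset_id = 'logs'
  )
GROUP BY user_email
ORDER BY tib_billed DESC;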
Cloud Audit Logs show referenced columns only when column-level security is enabled. Otherwise, you see table-level granularity.
INFORMATION_SCHEMA keeps 180 days, while Cloud Logging keeps 30 days by default. Export logs to BigQuery or Cloud Storage for longer retention.