How to Migrate from BigQuery to Snowflake

Galaxy Glossary

How do I migrate from BigQuery to Snowflake?

Move tables, views, and pipelines from Google BigQuery into Snowflake using export-to-GCS + Snowflake COPY.


Description

How do I plan a BigQuery → Snowflake migration?

List every dataset, table, view, and scheduled query in BigQuery. Tag owners, data size, and refresh cadence. Map each object to a Snowflake database, schema, role, and warehouse. This inventory drives later automation scripts.
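
A quick way to build that inventory is to query BigQuery's INFORMATION_SCHEMA. A minimal sketch, assuming your datasets live in the us multi-region (adjust the region qualifier otherwise):

-- List tables with row counts and logical size, largest first
SELECT
  table_schema AS dataset,
  table_name,
  total_rows,
  total_logical_bytes / POW(1024, 3) AS size_gb
FROM `region-us`.INFORMATION_SCHEMA.TABLE_STORAGE
ORDER BY size_gb DESC;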

What are the end-to-end steps?

1) Export BigQuery tables to Google Cloud Storage (GCS). 2) Set up an external stage in Snowflake that points to the GCS bucket. 3) Create target tables in Snowflake. 4) COPY data in. 5) Re-create views, jobs, and permissions. 6) Validate row counts and checksums.

How do I export BigQuery data to GCS?

Use the EXPORT DATA statement or the bq extract CLI. Include a wildcard in the destination URI so BigQuery can split output across multiple files (a single exported file is capped at 1 GB), and enable compression for faster transfer.
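
For scripted exports, the bq extract equivalent of the EXPORT DATA example below looks roughly like this (the table and bucket names reuse the running example):

bq extract \
  --destination_format=PARQUET \
  --compression=SNAPPY \
  'ecom.Orders' \
  'gs://bq_to_sf/orders/orders-*.parquet'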

What is the Snowflake loading syntax?

Snowflake loads GCS files through CREATE STAGE and COPY INTO. Key options: FILE_FORMAT, ON_ERROR, MATCH_BY_COLUMN_NAME, FORCE, and PURGE. Set MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE to tolerate column order changes.
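
A skeleton COPY showing those options in place (angle-bracket names are placeholders; the full worked example follows):

COPY INTO <table>
FROM @<stage>
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE  -- tolerate column-order changes
ON_ERROR = ABORT_STATEMENT               -- or CONTINUE / SKIP_FILE
FORCE = FALSE                            -- TRUE reloads files already loaded
PURGE = TRUE;                            -- delete staged files after a successful load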

How can I migrate an ecommerce Orders table?

The example below exports Orders from BigQuery, defines a Snowflake stage, and copies data into the Snowflake ORDERS table.

Step 1 – BigQuery export

EXPORT DATA
  OPTIONS (
    uri='gs://bq_to_sf/orders/*.parquet',
    format='PARQUET',
    compression='SNAPPY')
AS
SELECT * FROM `ecom.Orders`;

Step 2 – Create stage in Snowflake

CREATE STAGE gcs_orders_stage
  url='gcs://bq_to_sf/orders/'
  storage_integration = gcs_int  -- GCS stages authenticate via a storage integration
  file_format=(type=parquet);
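
The stage authenticates through a storage integration rather than raw keys (Snowflake does not accept key/secret credentials for GCS stages). A minimal sketch, assuming the integration name gcs_int used above; run it as ACCOUNTADMIN, then grant the Snowflake-generated service account read access on the bucket:

CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://bq_to_sf/');

-- Shows the service account to authorize in GCS IAM
DESC STORAGE INTEGRATION gcs_int;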

Step 3 – Create target table

CREATE OR REPLACE TABLE Orders (
  id           BIGINT,
  customer_id  BIGINT,
  order_date   DATE,
  total_amount NUMBER(10,2)
);

Step 4 – Load data

COPY INTO Orders
FROM @gcs_orders_stage
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
ON_ERROR = ABORT_STATEMENT;

What validations should I run?

Compare row counts (SELECT COUNT(*)) and aggregate checksums such as SUM(total_amount) between BigQuery and Snowflake. Automate the checks in CI pipelines to catch truncation or type mismatches early.
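
For the Orders example, the same query can run on both sides and a script can diff the results:

-- BigQuery
SELECT COUNT(*) AS row_count, SUM(total_amount) AS amount_total
FROM `ecom.Orders`;

-- Snowflake
SELECT COUNT(*) AS row_count, SUM(total_amount) AS amount_total
FROM Orders;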

Best practices for large migrations

• Parallelize exports with partition decorators.
• Compress files (Parquet or GZIP CSV) to cut egress costs.
• Use Snowflake Snowpipe or Streams + Tasks for continuous loads (see the sketch below).
• Migrate views after base tables so dependencies resolve.
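
A sketch of the Snowpipe option over the stage defined earlier; the pipe name is an assumption, and the notification integration (plus its GCS Pub/Sub subscription) must be configured separately:

CREATE PIPE orders_pipe
  AUTO_INGEST = TRUE
  INTEGRATION = 'GCS_NOTIFY_INT'  -- hypothetical notification integration
AS
COPY INTO Orders
FROM @gcs_orders_stage
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;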


How to Migrate from BigQuery to Snowflake Example Usage


-- Complete migration script for Orders table
-- 1. BigQuery side (run in BQ)
EXPORT DATA OPTIONS (
  uri='gs://bq_to_sf/orders/*.parquet',
  format='PARQUET',
  compression='SNAPPY')
AS
SELECT * FROM `ecom.Orders`;

-- 2. Snowflake side (run in Snowflake)
CREATE STAGE gcs_orders_stage
  url='gcs://bq_to_sf/orders/'
  storage_integration = gcs_int  -- see the storage integration sketch above
  file_format=(type=parquet);

CREATE OR REPLACE TABLE Orders (
  id           BIGINT,
  customer_id  BIGINT,
  order_date   DATE,
  total_amount NUMBER(10,2)
);

COPY INTO Orders
FROM @gcs_orders_stage
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
ON_ERROR = CONTINUE;  -- skips bad rows; use ABORT_STATEMENT to fail fast



Frequently Asked Questions (FAQs)

Can I avoid manual exports for real-time data?

Yes. Schedule EXPORT DATA queries in BigQuery to land files in GCS on a cadence, and pair Snowpipe with GCS event notifications so Snowflake ingests new files in near real time.

Do I need to re-write SQL when moving views?

Most ANSI SQL carries over unchanged, but BigQuery ARRAY and STRUCT logic needs rewriting: UNNEST becomes LATERAL FLATTEN, and nested data typically maps to Snowflake VARIANT. Test each view and adjust.
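
As an illustration, a query over a hypothetical items ARRAY column (not part of the earlier Orders DDL) and its Snowflake rewrite:

-- BigQuery
SELECT o.id, item
FROM `ecom.Orders` AS o, UNNEST(o.items) AS item;

-- Snowflake: items loaded as a VARIANT array
SELECT o.id, f.value AS item
FROM Orders AS o, LATERAL FLATTEN(INPUT => o.items) AS f;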

How do I secure GCS credentials in Snowflake?

Create a scoped storage integration and reference it in CREATE STAGE so that secrets are managed by Snowflake, not in plain SQL.
