How to Migrate from Snowflake to BigQuery

Galaxy Glossary

How do I migrate data from Snowflake to BigQuery without data loss?

Exports Snowflake data, transfers it to Google Cloud Storage, then loads it into BigQuery while recreating schemas and preserving data types.


Description


Why migrate from Snowflake to BigQuery?

Lower cost, deeper GCP integration, and BigQuery’s serverless performance often motivate teams to switch.

Which migration paths exist?

Option 1: Snowflake → GCS (COPY INTO) → BigQuery (bq load).
Option 2: BigQuery Migration Service for automated transfers.
Option 3: Fivetran, Airbyte, or Dataflow for near-zero-downtime replication.

How do I export tables from Snowflake?

1 – CREATE STAGE pointing to a GCS bucket.
2 – COPY INTO stage in parallel, compressing files.
3 – LIST to verify objects.
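For step 3, a quick check looks like this (a minimal sketch, assuming the gcs_stage defined in the Syntax section below):

-- Snowflake: confirm the unloaded files landed in the GCS bucket
LIST @gcs_stage/customers/;
-- Returns one row per file with its name, size, MD5, and last_modified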

How do I load files into BigQuery?

Issue bq load or schedule the BigQuery Data Transfer Service. Define explicit schemas to avoid numeric/string drift.
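One way to pin the schema is a JSON schema file passed to bq load; customers_schema.json below is an illustrative file name, not something created earlier in this guide.

customers_schema.json:
[
  {"name": "id", "type": "INT64", "mode": "REQUIRED"},
  {"name": "name", "type": "STRING"},
  {"name": "email", "type": "STRING"},
  {"name": "created_at", "type": "DATETIME"}
]

bq load --source_format=CSV --skip_leading_rows=1 \
        --schema=customers_schema.json \
        my_dataset.Customers \
        'gs://ecom-migrations/customers/*.csv.gz'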

How is schema recreated?

Generate DDL with GET_DDL('TABLE', ...) in Snowflake, then convert data types (e.g., NUMBER → NUMERIC, VARIANT → JSON) using a simple Python script or the BigQuery Migration Service.
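A short sketch of the export side, using Snowflake's GET_DDL to pull the source definition (the substitutions in the comment are typical examples, not an exhaustive mapping):

-- Snowflake: emit the CREATE TABLE statement to feed the conversion step
SELECT GET_DDL('TABLE', 'Customers');
-- Typical substitutions before replaying the DDL in BigQuery:
--   NUMBER(38,0) -> INT64, NUMBER(p,s) -> NUMERIC, VARIANT -> JSON,
--   TIMESTAMP_NTZ -> DATETIME, TIMESTAMP_TZ -> TIMESTAMP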

How do I migrate incremental data?

Add a LAST_MODIFIED timestamp column and schedule an hourly COPY INTO whose SELECT filters on it. Use BigQuery's MERGE to upsert the new rows, as shown below.
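The upsert itself is a standard BigQuery MERGE. The sketch below assumes each hourly increment is first loaded into a staging table named my_dataset.Customers_staging (a name chosen here purely for illustration):

-- BigQuery: merge the hourly increment into the target table
MERGE `my_dataset.Customers` AS target
USING `my_dataset.Customers_staging` AS source
ON target.id = source.id
WHEN MATCHED THEN
  UPDATE SET name = source.name,
             email = source.email,
             created_at = source.created_at
WHEN NOT MATCHED THEN
  INSERT (id, name, email, created_at)
  VALUES (source.id, source.name, source.email, source.created_at);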

Example: migrating the Customers table

Snowflake COPY INTO writes CSV.gz files to @gcs_stage/customers/. BigQuery bq load ingests those into my_dataset.Customers, mapping id → INT64 and created_at → DATETIME.

What about constraints & indexes?

Re-create primary-key and foreign-key constraints as NOT ENFORCED in BigQuery (unique constraints and secondary indexes have no direct equivalent), or use dbt tests to validate them.
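For example, an unenforced primary key on the migrated table looks like this in BigQuery; it is metadata only and is not validated, which is why a dbt uniqueness test is still worth running:

-- BigQuery: declare the key without enforcement
ALTER TABLE `my_dataset.Customers`
  ADD PRIMARY KEY (id) NOT ENFORCED;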

How do I validate row counts?

Compare SELECT COUNT(*) results between Snowflake and BigQuery. Use EXCEPT and row-level hashes (such as MD5) for spot checks.

How to cut over production?

Run dual-writes, backfill lagging rows, freeze Snowflake, rerun final incremental, switch apps to BigQuery.

Best practices

Compress exports (GZIP), partition large tables by date in BigQuery, script every step with CI, and tag datasets with migration date.
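Creating the target table up front with date partitioning, rather than letting bq load create an unpartitioned one, is a small sketch of that practice:

-- BigQuery: partitioned target table, created before the first load
CREATE TABLE `my_dataset.Customers`
(
  id INT64,
  name STRING,
  email STRING,
  created_at DATETIME
)
PARTITION BY DATETIME_TRUNC(created_at, DAY);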

Common mistakes

Skipping type mapping leaves every column as a STRING blob. Setting --field_delimiter to a value that doesn't match the exported files produces broken rows.

Key takeaway

Automate, verify, and monitor each stage to ensure a predictable, lossless migration.



How to Migrate from Snowflake to BigQuery Example Usage


-- Validate data integrity between warehouses (runs in BigQuery).
-- Assumes the raw Snowflake export has also been loaded into a staging
-- table, my_dataset.Customers_sf_export, so both sides are queryable here.
WITH sf AS (
  SELECT id, MD5(CONCAT(CAST(id AS STRING), '|', name, '|', email)) AS hash
  FROM `my_dataset.Customers_sf_export`
),
bq AS (
  SELECT id, MD5(CONCAT(CAST(id AS STRING), '|', name, '|', email)) AS hash
  FROM `my_dataset.Customers`
)
SELECT 'Mismatched rows' AS issue, COUNT(*) AS total
FROM (
  SELECT id, hash FROM sf
  EXCEPT DISTINCT
  SELECT id, hash FROM bq
);

How to Migrate from Snowflake to BigQuery Syntax


-- Step 1: create stage in Snowflake
-- (authenticates to GCS using a storage integration)
CREATE OR REPLACE STAGE gcs_stage
  URL = 'gcs://ecom-migrations'
  STORAGE_INTEGRATION = gcs_int
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY='"' COMPRESSION=GZIP);

-- Step 2: export Customers table
-- a) Full export
COPY INTO @gcs_stage/customers/
FROM (SELECT id, name, email, created_at FROM Customers)
MAX_FILE_SIZE = 536870912
HEADER = TRUE;

-- b) Incremental export (last 1 hour)
COPY INTO @gcs_stage/customers/
FROM (
  SELECT * FROM Customers
  WHERE created_at >= DATEADD(hour, -1, CURRENT_TIMESTAMP())
);

-- Step 3: load into BigQuery
bq load --source_format=CSV \
        --field_delimiter="," \
        --skip_leading_rows=1 \
        my_dataset.Customers \
        'gs://ecom-migrations/customers/*.csv.gz' \
        id:INT64,name:STRING,email:STRING,created_at:DATETIME

-- Step 4: verify counts
-- In BigQuery
SELECT COUNT(*) FROM my_dataset.Customers;

-- In Snowflake
SELECT COUNT(*) FROM Customers;


Frequently Asked Questions (FAQs)

Is there a fully managed option?

Yes. BigQuery Migration Service automates Snowflake exports, schema conversion, and incremental syncs.

How long does a terabyte migration take?

With parallel COPY INTO (16 threads) and gsutil -m, expect ~45 minutes export and ~30 minutes load, network permitting.

Can I keep Snowflake as a backup?

Absolutely. Freeze the Snowflake warehouse in suspended mode to retain historical data at minimal cost.
