Move SQL Server schemas, data, and procedures into Snowflake using SnowSQL, external stages, and Snowpipe.
Snowflake separates storage from compute, auto-scales, and removes index management. This reduces operational work while improving concurrency and keeping costs pay-for-what-you-use.
1) Extract data from SQL Server.
2) Land files in cloud storage.
3) Create Snowflake objects.
4) Load data with COPY or Snowpipe.
5) Validate row counts & types.
6) Cut over applications.
Export each SQL Server table to compressed CSV or Parquet to minimize network I/O. Use bcp or Azure Data Factory for TB-scale exports.
bcp dbo.Customers out customers.csv -c -t"," -S sqlserver01 -d shop -U user -P pass
Choose the cloud provider your Snowflake account runs on (AWS S3, Azure Blob Storage, or GCS). Organize folders by schema/table so loads can run in parallel.
aws s3 cp customers.csv s3://shop-migration/customers/
Create databases, schemas, tables, and stages. Map SQL Server data types to Snowflake equivalents (e.g., VARCHAR → STRING, DATETIME → TIMESTAMP_NTZ).
CREATE TABLE Customers (id INT, name STRING, email STRING, created_at TIMESTAMP_NTZ);
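The enclosing database and schema need to exist before the table DDL runs; a minimal sketch, with names assumed to mirror the SQL Server side:

```sql
-- Hypothetical names; adjust to your environment.
CREATE DATABASE IF NOT EXISTS shop;
CREATE SCHEMA IF NOT EXISTS shop.dbo;  -- mirror the SQL Server schema name
USE SCHEMA shop.dbo;
```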
Use COPY INTO from an external or internal stage. Define the file format once and reuse it.
CREATE OR REPLACE FILE FORMAT csv_ff TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"';
CREATE OR REPLACE STAGE s3_stage URL = 's3://shop-migration/' CREDENTIALS = (AWS_KEY_ID='…' AWS_SECRET_KEY='…') FILE_FORMAT = (FORMAT_NAME = 'csv_ff');
COPY INTO Customers FROM @s3_stage/customers/ ON_ERROR = 'ABORT_STATEMENT';
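Before running the real load, COPY can dry-run against the staged files and report parse errors without loading any rows; a quick check using the stage above:

```sql
-- Returns rows describing any errors the load would hit; loads nothing.
COPY INTO Customers FROM @s3_stage/customers/ VALIDATION_MODE = 'RETURN_ERRORS';
```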
Configure Snowpipe to watch the storage path and auto-ingest new files. Use an AWS SNS, Azure Event Grid, or GCP Pub/Sub notification channel.
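A minimal Snowpipe sketch against the stage defined earlier (the pipe name is an assumption):

```sql
-- AUTO_INGEST = TRUE makes the pipe consume cloud storage event notifications.
CREATE OR REPLACE PIPE customers_pipe AUTO_INGEST = TRUE AS
  COPY INTO Customers FROM @s3_stage/customers/;

-- On AWS, note the notification_channel ARN from SHOW PIPES and point the
-- bucket's S3 event notifications at it.
SHOW PIPES;
```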
Rewrite T-SQL stored procedures as Snowflake stored procedures (JavaScript or Snowflake Scripting) or SQL UDFs. Favor set-based logic; avoid cursors where possible.
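As an illustration (the procedure name, table columns, and business rule are invented), a row-by-row T-SQL cursor loop can usually collapse into one set-based statement inside a Snowflake JavaScript procedure:

```sql
CREATE OR REPLACE PROCEDURE mark_inactive_customers()
RETURNS STRING
LANGUAGE JAVASCRIPT
AS
$$
  // One set-based UPDATE replaces the cursor loop.
  // Assumes a hypothetical status column on Customers.
  var stmt = snowflake.createStatement({
    sqlText: "UPDATE Customers SET status = 'inactive' " +
             "WHERE created_at < DATEADD(year, -2, CURRENT_TIMESTAMP())"
  });
  stmt.execute();
  return 'OK';
$$;
```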
Compare row counts and checksums between SQL Server and Snowflake. Sample query:
SELECT COUNT(*), SUM(total_amount) FROM Orders;
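Beyond counts and sums, Snowflake's HASH_AGG produces an order-independent fingerprint of a whole table, which is handy for spot-checking a load (the SQL Server side needs its own equivalent, such as CHECKSUM_AGG over a per-row checksum):

```sql
-- Run after the load; compare row_cnt against the source and keep
-- table_hash for before/after comparisons on the Snowflake side.
SELECT COUNT(*) AS row_cnt, HASH_AGG(*) AS table_hash FROM Orders;
```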
After validation, switch connection strings. Keep SQL Server in read-only mode until your Snowflake workload is stable.
Parallelize loads, compress files, run COPY statements for multiple tables concurrently, and defer clustering until after ingest to avoid extra compute cost during the load.
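Once the bulk load has finished, clustering can be added in a single statement (the table and key column here are assumptions; pick a column your queries filter on):

```sql
-- Defining the clustering key after ingest avoids reclustering cost
-- while the bulk data is still landing.
ALTER TABLE Orders CLUSTER BY (order_date);
```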
Snowflake partners like Fivetran and HVR provide near-real-time replication without manual extracts.
Incremental loads are supported: use Change Data Capture (CDC) on SQL Server to push deltas via Snowpipe or partner tools.
For IDENTITY columns, preserve values by defining the Snowflake column as NUMBER and loading the original IDs; use sequences only for new rows.
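A sketch of handing new-row ID generation to a sequence once the historical IDs are loaded (the sequence name and start value are assumptions; set the start above the maximum migrated ID):

```sql
CREATE OR REPLACE SEQUENCE customers_seq START = 1000000;
-- New inserts that omit id now draw from the sequence; migrated rows keep their original IDs.
ALTER TABLE Customers ALTER COLUMN id SET DEFAULT customers_seq.NEXTVAL;
```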