Creates a dbt profile that allows the dbt CLI to run transformations against Snowflake.

Run `pip install dbt-snowflake`. The adapter supplies the Snowflake connection and submits dbt's compiled SQL to a Snowflake warehouse for execution.
Create `~/.dbt/profiles.yml` and provide `account`, `user`, `password` or `authenticator`, `role`, `warehouse`, `database`, and `schema`. Keep secrets in environment variables for security.
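For example, the variables referenced in the profile below can be exported in the shell before invoking dbt (the values here are placeholders):

```bash
# Placeholders only; substitute your own account locator and credentials
export SNOWFLAKE_ACCOUNT=xy12345.us-east-1
export SNOWFLAKE_USER=dbt_user
export SNOWFLAKE_PASSWORD='change-me'
```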
A matching `profiles.yml` entry:

```yaml
snowflake:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: ANALYST
      warehouse: TRANSFORMING_WH
      database: ANALYTICS
      schema: PUBLIC
      threads: 4
```
Create a model file such as `models/orders_enriched.sql`. Use dbt's Jinja `ref()` function to stay database-agnostic.
```sql
{{ config(materialized='table') }}

select
    o.id as order_id,
    c.name as customer_name,
    o.order_date,
    o.total_amount,
    datediff('day', c.created_at, o.order_date) as customer_age_days
from {{ ref('orders') }} o
join {{ ref('customers') }} c
    on o.customer_id = c.id
```
Inside the project root, execute `dbt debug --target dev`. A green check confirms dbt can reach Snowflake with the supplied credentials.

Compile and execute the model with `dbt run --select orders_enriched`. The resulting table appears in `ANALYTICS.PUBLIC`, built using the `TRANSFORMING_WH` warehouse.
Store secrets in a password manager or CI variable store. Use separate warehouses for dev and prod. Apply role-based access so dbt can only write to its target schema, as sketched below.
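A minimal grant sketch for that last point, assuming the `ANALYST` role and the objects from the profile above (any source schemas dbt reads from would also need `select` grants):

```sql
-- Scope the dbt role to its compute and its target schema only
grant usage on warehouse TRANSFORMING_WH to role ANALYST;
grant usage on database ANALYTICS to role ANALYST;
grant usage on schema ANALYTICS.PUBLIC to role ANALYST;
grant create table on schema ANALYTICS.PUBLIC to role ANALYST;
grant create view on schema ANALYTICS.PUBLIC to role ANALYST;
```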
**Should dev and prod use separate warehouses?** Yes. Isolating dev workloads prevents long-running transformations from consuming production credits and makes cost attribution easier.
**Can dbt authenticate with a key pair instead of a password?** Absolutely. Replace `password` with `private_key_path` and `private_key_passphrase` in profiles.yml, then set `authenticator: snowflake_jwt`.
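A sketch of the resulting output block; the `SNOWFLAKE_PRIVATE_KEY_PATH` and `SNOWFLAKE_PRIVATE_KEY_PASSPHRASE` variable names are hypothetical:

```yaml
dev:
  type: snowflake
  account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
  user: "{{ env_var('SNOWFLAKE_USER') }}"
  authenticator: snowflake_jwt
  # Hypothetical variable names; point them at your key file and its passphrase
  private_key_path: "{{ env_var('SNOWFLAKE_PRIVATE_KEY_PATH') }}"
  private_key_passphrase: "{{ env_var('SNOWFLAKE_PRIVATE_KEY_PASSPHRASE') }}"
  role: ANALYST
  warehouse: TRANSFORMING_WH
  database: ANALYTICS
  schema: PUBLIC
```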
**How do I point dbt at different environments?** Create multiple outputs (dev, prod) inside the profile and change `target:`, or pass `--target prod` to dbt commands, as in the sketch below.
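A minimal sketch of such a profile; the `PROD` schema and the prod thread count are assumptions for illustration:

```yaml
snowflake:
  target: dev   # default; override with --target prod
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: ANALYST
      warehouse: TRANSFORMING_WH
      database: ANALYTICS
      schema: PUBLIC
      threads: 4
    prod:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: ANALYST               # a dedicated prod role is common
      warehouse: TRANSFORMING_WH  # ideally a separate prod warehouse, per the advice above
      database: ANALYTICS
      schema: PROD                # hypothetical production schema
      threads: 8                  # assumed value
```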