Keep Snowflake secrets safe in dbt Cloud

Protecting Snowflake usernames, passwords, and keys within dbt Cloud through secret storage, role-based access control, and least-privilege design. Learn step-by-step techniques—service accounts, OAuth, scoped roles, environment variables, and secret managers—to secure Snowflake credentials while developing, testing, and deploying dbt projects.
Securing Snowflake credentials in dbt Cloud is the practice of storing and transmitting usernames, passwords, OAuth tokens, or key-pair secrets so that they are never exposed to unauthorized users or systems during dbt development, testing, and production runs.
dbt Cloud automates model builds, tests, and deployments, and each run must log in to your Snowflake warehouse. If credentials leak, attackers can access data, rack up compute bills, or tamper with analytics. A single compromised password can cascade across environments because dbt Cloud often executes with elevated roles. Correct credential management therefore protects your data, your compute spend, and the integrity of your analytics.
dbt Cloud supports three authentication patterns for Snowflake: username/password, key-pair authentication, and OAuth.
Within the dbt Cloud UI you define an Environment. Each environment has a Connection object that stores credentials in an encrypted database using AWS KMS (for US-hosted accounts) or Google KMS (for EU). Secrets never appear in the REST API and can be viewed only by users with the Manage Environments permission.
For maximum control you can omit credentials from the UI entirely and load them at runtime through Environment Variables. dbt Cloud injects variables into the execution container, where profiles.yml can reference them via Jinja:
```yaml
outputs:
  prod:
    type: snowflake
    account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
    user: "{{ env_var('SNOWFLAKE_USER') }}"
    role: "{{ env_var('SNOWFLAKE_ROLE') }}"
    database: "ANALYTICS"
    warehouse: "TRANSFORMING"
    authenticator: externalbrowser  # if using OAuth
```
Variables are encrypted at rest and masked in logs.
Even a perfectly encrypted secret is dangerous if it unlocks a SYSADMIN role. Follow Snowflake's RBAC model:

- Create a dedicated DBT_CLOUD_ROLE granting only USAGE on the warehouse, database, and schemas, plus CREATE/MODIFY on the objects dbt owns.
- Use separate roles for development (DBT_DEV_ROLE) and production (DBT_PROD_ROLE) environments.

Automate security tasks with your platform's secrets manager (AWS Secrets Manager, Azure Key Vault, GCP Secret Manager), which can store credentials centrally and rotate them on a schedule.
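A minimal sketch of such a scoped role, assuming the ANALYTICS database and TRANSFORMING warehouse named elsewhere in this article (the schema and user names are illustrative):

```sql
-- Least-privilege role for dbt Cloud; object names are illustrative
CREATE ROLE IF NOT EXISTS DBT_CLOUD_ROLE;
GRANT USAGE ON WAREHOUSE TRANSFORMING TO ROLE DBT_CLOUD_ROLE;
GRANT USAGE ON DATABASE ANALYTICS TO ROLE DBT_CLOUD_ROLE;
GRANT USAGE ON ALL SCHEMAS IN DATABASE ANALYTICS TO ROLE DBT_CLOUD_ROLE;
-- dbt may create and modify objects only in its own schema
GRANT CREATE TABLE, CREATE VIEW ON SCHEMA ANALYTICS.DBT TO ROLE DBT_CLOUD_ROLE;
GRANT ROLE DBT_CLOUD_ROLE TO USER DBT_PROD;
```

Notably absent: no ACCOUNTADMIN, no ownership of databases, and no grants outside the schemas dbt writes to.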
Combine dbt Cloud job run logs with Snowflake's ACCOUNT_USAGE views (LOGIN_HISTORY, QUERY_HISTORY) to verify that only expected roles are logging in from dbt Cloud IP ranges. Enable Snowflake ALERTs or an external SIEM to trigger notifications on anomalies.
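For example, a quick audit of recent logins by the service user could look like this (DBT_PROD is the user name used in this article; substitute your own):

```sql
-- Recent Snowflake logins for the dbt service user (last 7 days)
-- Note: ACCOUNT_USAGE views can lag by up to ~2 hours
SELECT event_timestamp,
       user_name,
       client_ip,
       reported_client_type,
       is_success
FROM snowflake.account_usage.login_history
WHERE user_name = 'DBT_PROD'
  AND event_timestamp > DATEADD('day', -7, CURRENT_TIMESTAMP())
ORDER BY event_timestamp DESC;
```

Unexpected client IPs or client types in the result are a signal that the credential may be in use outside dbt Cloud.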
As a concrete end-to-end example, here is how to set up and rotate a key-pair credential for a DBT_PROD service user:

1. Create the service user:

   CREATE USER DBT_PROD PASSWORD = 'temporary' DEFAULT_ROLE = DBT_PROD_ROLE MUST_CHANGE_PASSWORD = TRUE;

2. Generate an RSA key pair:

   ssh-keygen -t rsa -b 2048 -m PEM -f dbt_prod_key -N ""

3. Upload dbt_prod_key.pub to Snowflake:

   ALTER USER DBT_PROD SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

4. Store the dbt_prod_key private key in AWS Secrets Manager.

5. Use the dbt Cloud API /accounts/<id>/jobs/<id> endpoint to update the environment variable SNOWFLAKE_PRIVATE_KEY.

6. Have profiles.yml reference the key:

   private_key_path: "{{ env_var('SNOWFLAKE_PRIVATE_KEY_PATH') }}"
   private_key_passphrase: "{{ env_var('SNOWFLAKE_PRIVATE_KEY_PASSPHRASE') }}"

7. Restrict these grants to the DBT_PROD user only.

Common mistakes to avoid:

profiles.yml committed to Git
Why it's wrong: Anyone with repo access can read secrets.
Fix: Git-ignore profiles.yml and inject values via environment variables.
Over-privileged service role
Why it's wrong: If dbt's role has ACCOUNTADMIN, a leak is catastrophic.
Fix: Create a scoped service role with only the grants dbt needs.
Credentials that are never rotated
Why it's wrong: Static keys accumulate risk over time.
Fix: Automate rotation with your cloud secrets manager and the dbt Cloud API.
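The push step of that rotation can be sketched in Python. Everything not named in this article is an assumption: the endpoint path, the payload shape, and the helper names (build_env_var_payload, push_env_var) are illustrative, so verify them against the dbt Cloud API documentation for your account's API version.

```python
"""Sketch: push a freshly rotated secret into a dbt Cloud environment
variable. Endpoint path and payload shape are assumptions; verify
against the dbt Cloud API docs before relying on this."""
import json
import urllib.request

DBT_API = "https://cloud.getdbt.com/api/v3"


def build_env_var_payload(name: str, value: str) -> dict:
    # Hypothetical payload shape for an environment-variable update.
    return {"env_var": {"name": name, "type": "environment", "raw_value": value}}


def push_env_var(account_id: int, project_id: int, token: str,
                 name: str, value: str) -> None:
    # Hypothetical endpoint; confirm the exact path in the dbt Cloud docs.
    url = f"{DBT_API}/accounts/{account_id}/projects/{project_id}/environment-variables/"
    req = urllib.request.Request(
        url,
        data=json.dumps(build_env_var_payload(name, value)).encode(),
        headers={"Authorization": f"Token {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:  # raises on HTTP errors
        resp.read()
```

In a scheduled rotation job you would first generate the new key, store it in your secrets manager, run ALTER USER ... SET RSA_PUBLIC_KEY in Snowflake, and only then push the new private key so dbt Cloud picks it up on the next run.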
While Galaxy is primarily a modern SQL editor rather than a deployment platform, many teams pair Galaxy for interactive query authoring with dbt Cloud for scheduled transformations. Galaxy never stores warehouse credentials on its servers; all connections remain local in the desktop app’s encrypted keychain. When you paste the same Snowflake service account details that dbt Cloud uses, Galaxy respects your existing RBAC model—so you can explore data quickly without creating a second set of secrets.
Securing Snowflake credentials in dbt Cloud is equal parts technical configuration and operational discipline. By combining secret managers, environment variables, least privilege roles, and routine auditing, you can reap the productivity benefits of dbt Cloud while keeping your Snowflake account locked down.
Why does credential security matter for dbt Cloud?
dbt Cloud automates model builds that directly touch production data in Snowflake. If the credentials embedded in those jobs leak, attackers can exfiltrate data, waste compute credits, or compromise analytic results. A robust security approach protects sensitive information, preserves compliance, and ensures that automated transformations run safely across environments.
Which authentication method is safest?
Key-pair authentication or external OAuth are the safest because they avoid static passwords and can be rotated or revoked quickly.
Should development and production use separate credentials?
Yes. Create separate dbt Cloud Environments, each with its own connection and Snowflake role. This enforces least privilege and reduces blast radius.
How often should credentials be rotated?
Security teams typically rotate every 30–90 days. Automate this with a secrets manager and the dbt Cloud REST API to avoid manual work.
Does Galaxy store my Snowflake credentials?
No. Galaxy keeps connection details in your local encrypted keychain and never transmits them to its servers, aligning with the same least-privilege principles you use in dbt Cloud.