Connect dbt to PostgreSQL so you can version-control, test, and document SQL transformations in a repeatable workflow.
Pairing dbt with PostgreSQL turns raw tables into governed, version-controlled models. You keep SQL in Git, schedule repeatable builds, and surface trusted data for dashboards without moving data out of Postgres.
Verify PostgreSQL is reachable, install the adapter with `pip install dbt-postgres`, and create a service account with CREATE and USAGE on your target schema.
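For example, the service account could be provisioned like this sketch, which assumes the `warehouse` database and `analytics` schema used in the profile below:

```sql
-- Run as a superuser or the schema owner.
CREATE ROLE analytics LOGIN PASSWORD 'change-me';      -- dbt's service account
GRANT CONNECT ON DATABASE warehouse TO analytics;      -- allow connections
GRANT USAGE, CREATE ON SCHEMA analytics TO analytics;  -- let dbt build tables and views
```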
How do you configure `profiles.yml` for Postgres? Create `~/.dbt/profiles.yml` with a Postgres target. The profile name (`ecommerce` below) must match the `profile:` key in `dbt_project.yml`.
```yaml
# ~/.dbt/profiles.yml
ecommerce:
  target: dev
  outputs:
    dev:
      type: postgres
      host: localhost
      user: analytics
      password: "{{ env_var('PG_PASSWORD') }}"  # quoted so the Jinja renders cleanly
      port: 5432
      dbname: warehouse
      schema: analytics
      threads: 4
      keepalives_idle: 0
```
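With the profile saved, `dbt debug` is the quickest way to confirm everything lines up; it validates both config files and opens a test connection:

```bash
export PG_PASSWORD='change-me'  # the secret referenced by env_var() above
dbt debug --target dev          # checks profiles.yml, dbt_project.yml, and connectivity
```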
Place a model file such as `models/orders_enriched.sql` in your dbt project, then run `dbt run`. dbt compiles the model and creates the target table or view as `analytics.orders_enriched`.
```sql
-- models/orders_enriched.sql
-- Note: no trailing semicolon; dbt wraps this query in its own DDL.
select
    o.id as order_id,
    c.name as customer_name,
    o.order_date,
    o.total_amount,
    date_part('dow', o.order_date) as weekday
from {{ ref('orders') }} o
join {{ ref('customers') }} c on c.id = o.customer_id
```
Add YAML tests and descriptions. Run `dbt test` to validate uniqueness or foreign keys inside Postgres, and generate docs with `dbt docs generate`.
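A minimal properties file covering those checks might look like this sketch; the column names come from the model above, while testing `order_id` against `orders.id` assumes `id` really is the primary key of the upstream `orders` model:

```yaml
# models/schema.yml
version: 2

models:
  - name: orders_enriched
    description: "Orders joined to customer names, one row per order."
    columns:
      - name: order_id
        description: "Primary key, carried over from orders.id."
        tests:
          - unique
          - not_null
          - relationships:  # foreign-key style check against the upstream model
              to: ref('orders')
              field: id
```

`dbt test` compiles each of these into a SELECT that runs inside Postgres and fails if any rows violate the rule.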
Use incremental models for large tables, apply `grants:` in `dbt_project.yml` for least privilege, and pin Postgres extensions in a `pre-hook` so deployments stay idempotent.
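In `dbt_project.yml`, that combination might look like the following sketch; the `reporting` role and the `pg_trgm` extension are placeholders for whatever your deployment actually uses:

```yaml
# dbt_project.yml (excerpt)
models:
  ecommerce:
    +grants:
      select: ["reporting"]  # least privilege: reapplied on every build
    +pre-hook:
      - "create extension if not exists pg_trgm"  # idempotent, but needs a role allowed to create extensions
```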
If you hit `psycopg2.OperationalError`, confirm the host, port, and SSL settings. If you see `permission denied for schema`, grant USAGE and CREATE on the target schema to the dbt role.
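To rule dbt out while debugging a `psycopg2.OperationalError`, you can test the same connection directly with `psql`, reusing the values from the profile above:

```bash
psql "host=localhost port=5432 dbname=warehouse user=analytics" -c "select 1"
```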
1) Seed reference data with `dbt seed`. 2) Build models with `dbt run --select orders_enriched+`. 3) Run tests. 4) Schedule nightly runs through cron or Airflow for fresh Postgres insights.
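Put together, the whole workflow is three commands; the crontab line is one hypothetical way to schedule the nightly run:

```bash
dbt seed                           # 1) load reference data from CSVs
dbt run --select orders_enriched+  # 2) build the model and everything downstream of it
dbt test                           # 3) validate uniqueness and relationships

# Example crontab entry: rebuild nightly at 02:00 (path is illustrative).
# 0 2 * * * cd /path/to/ecommerce && dbt run && dbt test
```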
Does dbt support SSL connections to Postgres? Yes. Add `sslmode: require` and related parameters under your Postgres output; dbt passes them through to psycopg2.
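Under the `dev` output from earlier, that might look like this (the host and certificate path are placeholders):

```yaml
dev:
  type: postgres
  host: db.example.com
  # ... other connection settings as above ...
  sslmode: require                    # refuse unencrypted connections
  sslrootcert: /etc/ssl/certs/ca.pem  # optional: verify the server's certificate chain
```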
Can you build incremental models on Postgres? Absolutely. Add `materialized='incremental'` to your model's config, define a `unique_key`, and dbt will merge only new or changed rows.
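A sketch of that pattern, reusing the orders example; the `updated_at` column is an assumption about your source table:

```sql
-- models/orders_incremental.sql
{{ config(materialized='incremental', unique_key='order_id') }}

select
    o.id as order_id,
    o.order_date,
    o.total_amount,
    o.updated_at
from {{ ref('orders') }} o

{% if is_incremental() %}
  -- Only scan rows newer than what is already in the target table.
  where o.updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```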
How do you deploy to production? Store `profiles.yml` variables in a CI secret store, run `dbt run` in a container, and point the profile to your production Postgres host.
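One way to wire that up is a separate `prod` target that reads everything from environment variables your CI injects (the variable names here are illustrative):

```yaml
# ~/.dbt/profiles.yml (excerpt)
ecommerce:
  outputs:
    prod:
      type: postgres
      host: "{{ env_var('PG_HOST') }}"  # production host from the CI secret store
      user: "{{ env_var('PG_USER') }}"
      password: "{{ env_var('PG_PASSWORD') }}"
      port: 5432
      dbname: warehouse
      schema: analytics
      threads: 8
```

Select it at deploy time with `dbt run --target prod`.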