Set up an Airflow connection and tasks that run Microsoft SQL Server queries inside your DAGs.
Automating SQL Server jobs in Airflow centralizes scheduling, logging, and monitoring, eliminating Windows Task Scheduler or SQL Agent silos.
You need Apache Airflow ≥2.6 on Python ≥3.8, network access to the SQL Server instance, and an account with least-privilege SELECT/INSERT rights on your ecommerce tables.
Run pip install apache-airflow-providers-microsoft-mssql, then restart the Airflow webserver and scheduler.
In the Airflow UI, navigate to Admin → Connections → +, choose "Microsoft SQL Server", fill in Host, Schema (the database name), Login, Password, and Port (default 1433), and save the connection as mssql_ecommerce.
Alternatively, define the connection as an environment variable for the scheduler and workers (environment variables, not airflow.cfg, are where AIRFLOW_CONN_* URIs belong): export AIRFLOW_CONN_MSSQL_ECOMMERCE='mssql+pyodbc://user:pass@server:1433/db?driver=ODBC+Driver+17+for+SQL+Server'
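To confirm the connection resolves, you can run a quick query through the provider's hook. A minimal sketch, assuming the mssql_ecommerce connection created above and a reachable server:

```python
# Smoke-test the Airflow connection (run in a Python task or a scratch script
# on an Airflow worker). Assumes the mssql_ecommerce connection exists.
from airflow.providers.microsoft.mssql.hooks.mssql import MsSqlHook

hook = MsSqlHook(mssql_conn_id="mssql_ecommerce")
print(hook.get_first("SELECT @@VERSION"))  # returns the first row as a tuple
```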
Use MsSqlOperator for ad-hoc SQL, MsSqlHook for Python logic, and the generic SqlSensor from the common SQL provider (pointed at your MSSQL connection) for data-arrival checks. Details below.
The snippet below extracts daily revenue from Orders and writes it to an audit table.
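A minimal sketch of that DAG; the table and column names (dbo.Orders, dbo.RevenueAudit, order_total, order_date) are placeholders to adapt to your schema:

```python
# Daily revenue extract: sums yesterday's orders and inserts an audit row.
from datetime import datetime

from airflow import DAG
from airflow.providers.microsoft.mssql.operators.mssql import MsSqlOperator

with DAG(
    dag_id="ecommerce_daily_revenue",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    write_daily_revenue = MsSqlOperator(
        task_id="write_daily_revenue",
        mssql_conn_id="mssql_ecommerce",
        sql="""
            INSERT INTO dbo.RevenueAudit (revenue_date, total_revenue)
            SELECT CAST('{{ ds }}' AS DATE), SUM(order_total)
            FROM dbo.Orders
            WHERE CAST(order_date AS DATE) = '{{ ds }}';
        """,
    )
```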
Parameterize queries with Jinja templates or the parameters argument to avoid string concatenation and SQL injection, as shown below.
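For example, a sketch that slots into the DAG above (dbo.Orders and the status column are placeholder names):

```python
    # Combine Jinja templating for the run date with a bound parameter for
    # the literal value, instead of concatenating strings into the SQL.
    count_shipped = MsSqlOperator(
        task_id="count_shipped_orders",
        mssql_conn_id="mssql_ecommerce",
        sql="""
            SELECT COUNT(*)
            FROM dbo.Orders
            WHERE CAST(order_date AS DATE) = '{{ ds }}'  -- Jinja-rendered
              AND status = %(status)s                    -- bound parameter
        """,
        parameters={"status": "shipped"},
    )
```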
Reuse one mssql_conn_id per environment, upgrade the provider in step with Airflow, enable retries, and store credentials in a secrets backend.
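Retries can be set once for the whole DAG; a sketch with illustrative values:

```python
# Shared retry settings so transient SQL Server failures are retried
# before a task fails and alerts.
from datetime import timedelta

default_args = {
    "retries": 2,                         # retry transient connection errors
    "retry_delay": timedelta(minutes=5),  # back off between attempts
}
# Pass default_args=default_args when constructing the DAG above.
```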
Do not use autocommit=True with multi-statement transactions unless you wrap them in an explicit BEGIN/COMMIT. Avoid ";" inside f-strings, as it breaks the operator's templating.
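A sketch of the explicit-transaction pattern inside the same DAG (dbo.Orders and dbo.OrdersArchive are placeholders):

```python
    # Wrap the multi-statement batch in an explicit transaction so
    # autocommit=True cannot leave it half-applied.
    archive_orders = MsSqlOperator(
        task_id="archive_orders",
        mssql_conn_id="mssql_ecommerce",
        sql="""
            BEGIN TRANSACTION;
            INSERT INTO dbo.OrdersArchive
                SELECT * FROM dbo.Orders WHERE order_date < '{{ ds }}';
            DELETE FROM dbo.Orders WHERE order_date < '{{ ds }}';
            COMMIT TRANSACTION;
        """,
        autocommit=True,
    )
```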
Yes. Pass EXEC dbo.MyProcedure {{ ds }} in the sql parameter and set autocommit=True if the procedure commits.
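For example (a sketch; dbo.MyProcedure is the placeholder procedure name from the question):

```python
    # Execute a stored procedure with the logical date as its argument.
    run_procedure = MsSqlOperator(
        task_id="run_daily_procedure",
        mssql_conn_id="mssql_ecommerce",
        sql="EXEC dbo.MyProcedure '{{ ds }}';",
        autocommit=True,
    )
```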
Use LocalFilesystemToMsSqlOperator from the same provider, or call BULK INSERT in an MsSqlOperator step.
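A BULK INSERT sketch using MsSqlOperator; the staging table and file path are placeholders, and the file must be readable by the SQL Server service, not the Airflow worker:

```python
    # Load a CSV that is visible to the SQL Server host into a staging table.
    bulk_load = MsSqlOperator(
        task_id="bulk_load_orders",
        mssql_conn_id="mssql_ecommerce",
        sql="""
            BULK INSERT dbo.OrdersStaging
            FROM 'C:\\data\\orders.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2);
        """,
        autocommit=True,
    )
```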
No. On Linux, Airflow connects to SQL Server over the TDS protocol via ODBC; just install the Microsoft ODBC driver in your Docker image or VM.