Context is everything an agent must know to turn a natural-language question into a correct query or action: table names, column meanings, data types, relationships, and approved business definitions like “active user.”
The agent automatically reads database catalogs to learn table structures, primary keys, and foreign keys. Embeddings of this metadata are stored so the model can reference them when generating SQL.
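As a minimal sketch of this cataloging step, assuming a Postgres database reachable via psycopg2 and a hypothetical `embed()` helper backed by whatever sentence-embedding model you prefer (neither is Galaxy's internal API):

```python
import psycopg2

def harvest_schema_docs(dsn: str) -> list[dict]:
    """Turn catalog metadata into one text snippet per column, ready for embedding."""
    sql = """
        SELECT table_name, column_name, data_type
        FROM information_schema.columns
        WHERE table_schema = 'public'
        ORDER BY table_name, ordinal_position
    """
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(sql)
        rows = cur.fetchall()
    return [
        {"text": f"{table}.{column} ({dtype})", "table": table, "column": column}
        for table, column, dtype in rows
    ]

# docs = harvest_schema_docs("postgresql://user:pass@host/db")  # connection string is illustrative
# vectors = [embed(d["text"]) for d in docs]                    # store next to the metadata
```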
When the user asks a question, a lightweight search pulls the most relevant schema snippets, query examples, and documentation into the model prompt. The LLM reasons over this retrieved material instead of guessing.
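The retrieval step can be as simple as cosine similarity over those stored vectors. The sketch below reuses the `docs`, `vectors`, and hypothetical `embed()` helper assumed above; the value of `k` and the prompt wording are illustrative, not a fixed recipe:

```python
import numpy as np

def top_k_snippets(question: str, docs: list[dict], vectors: list, embed, k: int = 5) -> list[dict]:
    """Rank stored schema/doc snippets by cosine similarity to the question."""
    q = np.asarray(embed(question), dtype=float)
    scores = [
        float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
        for v in vectors
    ]
    ranked = sorted(zip(scores, docs), key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in ranked[:k]]

def build_prompt(question: str, snippets: list[dict]) -> str:
    """Assemble the retrieved context and the user's question into one prompt."""
    context = "\n".join(s["text"] for s in snippets)
    return f"Schema context:\n{context}\n\nQuestion: {question}\nWrite the SQL:"
```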
A semantic layer translates business terms to vetted SQL snippets. By retrieving these snippets at runtime, the agent inherits your company’s logic without retraining.
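For illustration only, a semantic-layer entry can be as small as a term-to-SQL mapping; the "active user" definition and table names below are assumptions, stand-ins for your company's vetted logic:

```python
# Each entry pairs a business term with the endorsed SQL that defines it.
SEMANTIC_LAYER = {
    "active user": {
        "description": "A user with at least one session in the last 30 days.",
        "sql": (
            "SELECT DISTINCT user_id\n"
            "FROM sessions\n"
            "WHERE started_at >= CURRENT_DATE - INTERVAL '30 days'"
        ),
    },
}

def resolve_term(term: str) -> str | None:
    """Return the endorsed SQL for a business term, or None if it is not defined."""
    entry = SEMANTIC_LAYER.get(term.lower())
    return entry["sql"] if entry else None
```

At runtime the agent retrieves `resolve_term("active user")` and splices the vetted snippet into its generated query instead of improvising its own definition.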
Business logic lives in endorsed SQL queries, dbt models, data contracts, and documentation. Storing these assets in a version-controlled hub lets an agent reference them dynamically. No if-else tree is required because the logic is expressed once in SQL and reused through retrieval.
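One hedged way to make those version-controlled assets retrievable is to index the endorsed SQL files straight from the repository; the directory path below is an assumption about your layout:

```python
from pathlib import Path

def load_endorsed_queries(repo_dir: str) -> list[dict]:
    """Collect endorsed .sql files so they can be embedded alongside schema snippets."""
    return [
        {"text": path.read_text(), "source": str(path)}
        for path in sorted(Path(repo_dir).rglob("*.sql"))
    ]

# docs += load_endorsed_queries("analytics/endorsed/")  # merge with the catalog snippets above
```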
Galaxy centralizes schema metadata, endorsed queries, and a semantic layer in one workspace. Its context-aware AI copilot feeds these assets to the underlying LLM via RAG, so the model produces SQL that mirrors your actual definitions. Engineers can endorse or update a query and Galaxy instantly makes that new logic available to every AI session, eliminating stale rules.
- Store all trusted queries and metrics in a single searchable repository (Galaxy Collections or Git).
- Add rich column and table descriptions; the agent’s embeddings are only as good as the text you provide.
- Version and review every change so the semantic layer never drifts.
- Use role-based access to prevent the agent from surfacing sensitive tables to unauthorized users (a minimal filtering sketch follows this list).
- Log agent prompts and outputs to audit how business logic is applied in production.
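As a rough sketch of the access-control point above, retrieved snippets can be filtered against the requesting user's visible tables before they ever reach the prompt; `allowed_tables` would come from your own RBAC system, and the key names match the illustrative snippets built earlier:

```python
def filter_by_role(snippets: list[dict], allowed_tables: set[str]) -> list[dict]:
    """Drop any retrieved snippet that references a table the user cannot see."""
    # Snippets without a "table" key (e.g. endorsed query files) are also dropped here;
    # a real implementation would decide their visibility separately.
    return [s for s in snippets if s.get("table") in allowed_tables]
```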
You can fine-tune a model on your schema and business logic, but fine-tuning is costly and slow to keep current. Retrieval keeps the model lightweight and lets you ship new business rules in minutes instead of retraining a custom model.
AI agents do not “guess” your business rules. They reference them. By pairing retrieval-augmented generation with a centralized semantic layer inside Galaxy, you let the model learn context on demand and keep pace with every schema change, no sprawling rule engine required.