Postgres Database Chat Assistant – Full Workflow Documentation
What We’re Building
We’re creating a Postgres Database Chat Assistant that lets users ask natural-language questions and get instant answers from your Postgres database. Instead of writing SQL, team members simply chat (via a front end or an API call), and the assistant:
- Understands the user’s intent (e.g., “How many rows are in the users table?”)
- Generates the correct SQL query on the fly
- Executes that SQL against Postgres
- Formats and returns a friendly, human-readable response
All of this happens without users touching SQL or maintaining custom back-end code—just n8n nodes and an AI Agent.
How It Works
1. Chat Trigger → n8n
   - A user’s message (JSON with a `messages` array) arrives at an n8n webhook.
2. AI Agent (LangChain) Interprets Intent
   - The AI Agent examines the user’s text and decides which tool to call:
     - List schemas/tables
     - Show table definition (columns, types, keys)
     - Run a custom SQL query
3. Tool Nodes Interact with Postgres
   - Get DB Schema and Tables List: returns every `schema.table` pair
   - Get Table Definition: returns column names, data types, nullability, and foreign keys for a specific table
   - Execute SQL Query: runs the AI-generated SQL and returns raw rows
4. AI Agent Formats Results
   - Once a tool returns results, the AI Agent transforms the raw JSON into a readable summary or a markdown-style table and returns it.
5. Chat History (Memory Buffer)
   - Keeps the last few messages (default: 5) in context, so follow-up queries like “What about July?” work seamlessly.
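To make the flow concrete, here is a highly simplified sketch of that decide-then-format loop in JavaScript. The `llm` helper methods (`chooseTool`, `answerDirectly`, `formatResults`) are hypothetical stand-ins for what the LangChain agent does internally via OpenAI function calling:

```javascript
// Simplified sketch of the agent loop described above. The llm.* methods are
// hypothetical; in the workflow, the LangChain AI Agent node does this work.
function handleMessage(userText, tools, llm) {
  // 1. Ask the model whether a tool is needed, and which one.
  const decision = llm.chooseTool(userText, Object.keys(tools));
  if (!decision.tool) return llm.answerDirectly(userText);

  // 2. Run the chosen tool (schema list, table definition, or SQL query).
  const rows = tools[decision.tool](decision.args);

  // 3. Let the model turn the raw rows into a readable reply.
  return llm.formatResults(userText, rows);
}
```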
What This Unlocks
- Instant Insights
  Anyone can get data by simply asking in plain English—no SQL knowledge required.
- Always Up to Date
  Any schema changes are automatically reflected when the AI Agent calls the schema-listing or definition tools.
- Scalable & Safe
  The AI Agent only runs safe, read-only queries by default. No accidental `DROP TABLE` or destructive commands.
- Auditability
  Every query can be logged (via an optional Google Sheets or Airtable node), so you know who asked what and when.
- Conversational Experience
  Follow-up questions naturally refer back to earlier context, making the interaction feel like a real conversation.
Use Cases
- Data Exploration
  A product manager asks, “Which tables contain a column named ‘email’?” The assistant locates all tables with an `email` column by combining schema and column metadata.
- Ad-Hoc Reporting
  Sales asks, “Show me total revenue by month for the past year.” The assistant generates and runs:

  ```sql
  SELECT DATE_TRUNC('month', order_date) AS month,
         SUM(total_amount) AS revenue
  FROM sales.orders
  WHERE order_date >= NOW() - INTERVAL '1 year'
  GROUP BY month
  ORDER BY month;
  ```

  and returns a month-by-month revenue breakdown.
- Quick Troubleshooting
  A developer says, “Show me any rows in users where is_active = false, limit to 5.” The assistant runs:

  ```sql
  SELECT * FROM public.users WHERE is_active = false LIMIT 5;
  ```

  and displays a sample of inactive users.
- Schema Discovery & Onboarding
  A new analyst asks, “What tables do we have in the inventory schema?” The assistant lists all `inventory.*` tables without requiring the analyst to explore pgAdmin or psql.
Implementation Steps
1. Create a New Workflow in n8n
- Log into your n8n instance.
- Click Workflows → New Workflow.
- Name it Postgres Database Chat Assistant.
2. Add & Configure Nodes
2.1 Chat Trigger
- Node: Chat Trigger
- Purpose: Receives incoming HTTP POST requests containing chat messages.
- Configuration:
  - Webhook Path: e.g. `/postgres-chat`
  - Method: POST
2.2 AI Agent
- Node: AI Agent (LangChain Agent)
- Purpose:
  - Reads the user’s message.
  - Determines if it needs to call one of the Postgres tools (schema list, table definition, or custom SQL).
  - After getting results from a tool, formats them into readable text.
- Configuration:
  - Agent Type: `openAiFunctionsAgent`
  - System Prompt (copy exactly):

    ```
    You are DB assistant. You need to run queries in the database aligned with user requests.
    Run custom SQL queries to aggregate or fetch data and respond to the user.
    Make sure every table name uses its schema prefix (e.g., public.users) by using the ‘Get DB Schema and Tables List’ tool whenever necessary.
    If the user needs column info, use the ‘Get Table Definition’ tool by passing schema_name and table_name.
    Tools available:
    - Execute SQL Query: Runs any SQL query.
    - Get DB Schema and Tables List: Returns every schema.table pair.
    - Get Table Definition: Returns column names and metadata for a given table.
    ```
2.3 OpenAI Chat Model
- Node: OpenAI Chat Model (LM Chat OpenAI)
- Purpose: Provides the LLM (e.g., `gpt-4o-mini`) that the AI Agent uses to interpret user intent and generate SQL or tool calls.
- Configuration:
  - Model: `gpt-4o-mini` (or another chat-capable model)
  - API Credential: Link your OpenAI API key
2.4 Chat History (Memory Buffer)
- Node: Memory Buffer Window
- Purpose: Retains the last 5 messages for context, enabling follow-up queries.
- Configuration: Default settings (window size = 5)
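As a rough illustration, the sliding-window behavior this node provides can be sketched in plain JavaScript. This is not n8n’s implementation, only the idea: keep at most the N most recent messages.

```javascript
// Sketch of a sliding-window chat memory (window size 5, matching the node's
// default). Not n8n's actual code — just the behavior the node provides.
class MemoryBufferWindow {
  constructor(size = 5) {
    this.size = size;
    this.messages = [];
  }
  add(message) {
    this.messages.push(message);
    // Drop the oldest message once the window is full.
    if (this.messages.length > this.size) this.messages.shift();
  }
  context() {
    return [...this.messages];
  }
}
```

Because only the most recent messages survive, a follow-up like “What about July?” still sees the earlier revenue question it refers to.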
2.5 Execute SQL Query
- Node: Postgres Tool
- Purpose: Runs arbitrary SQL queries generated by the AI Agent.
- Configuration:
  - Operation: `executeQuery`
  - Query Field: `{{ $fromAI("sql_query", "SQL Query") }}`
  - Description: “Runs the SQL query returned by the AI Agent.”
  - Postgres Credential: Link your Postgres credential (host, port, database, user, password)
2.6 Get DB Schema and Tables List
- Node: Postgres Tool
- Purpose: Returns all `(schema, table_name)` pairs. The AI Agent uses this to know which tables exist.
- Configuration:
  - Operation: `executeQuery`
  - Query:

    ```sql
    SELECT table_schema, table_name
    FROM information_schema.tables
    WHERE table_type = 'BASE TABLE'
      AND table_schema NOT IN ('pg_catalog', 'information_schema')
    ORDER BY table_schema, table_name;
    ```

  - Description: “List all schemas and tables.”
  - Postgres Credential: Same as above
2.7 Get Table Definition
- Node: Postgres Tool
- Purpose: Returns columns, data types, nullability, defaults, and foreign keys for a specified table. The AI Agent calls this to answer “What columns does table X have?” queries.
- Configuration:
  - Operation: `executeQuery`
  - Query:

    ```sql
    SELECT c.column_name,
           c.data_type,
           c.is_nullable,
           c.column_default,
           tc.constraint_type,
           ccu.table_name  AS referenced_table,
           ccu.column_name AS referenced_column
    FROM information_schema.columns c
    LEFT JOIN information_schema.key_column_usage kcu
      ON c.table_name = kcu.table_name AND c.column_name = kcu.column_name
    LEFT JOIN information_schema.table_constraints tc
      ON kcu.constraint_name = tc.constraint_name AND tc.constraint_type = 'FOREIGN KEY'
    LEFT JOIN information_schema.constraint_column_usage ccu
      ON tc.constraint_name = ccu.constraint_name
    WHERE c.table_name = '{{ $fromAI("table_name") }}'
      AND c.table_schema = '{{ $fromAI("schema_name") }}'
    ORDER BY c.ordinal_position;
    ```

  - Description: “Get column metadata for the specified table.”
  - Postgres Credential: Same as above
2.8 Sticky Notes (Optional Documentation)
Place one or more Sticky Note nodes on the canvas to document workflow logic:
- Sticky Note A (Setup Instructions)

  ```
  ## 🚀 Postgres Database Chat Assistant
  ### 📝 Setup
  1. Enter your Postgres and OpenAI credentials
  2. Click “Chat” to query your database
  3. Activate the workflow to publish the chat
  ```

- Sticky Note B (Tools Overview)

  ```
  ### 🛠️ Tools Used
  - **Execute SQL Query**: Runs AI-generated SQL
  - **Get DB Schema & Tables**: Lists all schemas and tables
  - **Get Table Definition**: Returns column names and metadata
  ```

- Sticky Note C (Memory Buffer)

  ```
  👆 Use “Memory Buffer” to keep the last 5 messages in context.
  ```

- Sticky Note D (Example Models)

  ```
  ### 🤖 Chat Model
  Using gpt-4o-mini (fast, cost-effective)
  Other options: gpt-4, gpt-3.5-turbo, llama-2-13b-chat, etc.
  ```
3. Wire Up the Connections
- When Chat Message Received → AI Agent: connect the “main” output of When Chat Message Received to the “main” input of AI Agent.
- OpenAI Chat Model → AI Agent: connect the “ai_languageModel” output to the AI Agent “ai_languageModel” input.
- Chat History → AI Agent: connect the “ai_memory” output to the AI Agent “ai_memory” input.
- Get DB Schema and Tables List → AI Agent: connect the “ai_tool” output to the AI Agent “ai_tool” input.
- Get Table Definition → AI Agent: connect the “ai_tool” output to the AI Agent “ai_tool” input.
- Execute SQL Query → AI Agent: connect the “ai_tool” output to the AI Agent “ai_tool” input.
4. Configure Credentials
- Postgres Credential
  - Go to Credentials → New Credential → Postgres.
  - Name it (e.g., PG Chat Assistant), fill in host, port, database, username, password, and SSL details if needed. Save.
  - In each Postgres Tool node (Execute SQL Query, Get DB Schema and Tables List, Get Table Definition), select this credential.
- OpenAI Credential
  - Go to Credentials → New Credential → OpenAI.
  - Name it (e.g., OpenAI Chat), paste your API key, and save.
  - In the OpenAI Chat Model node, select this credential.
Testing the Chat Assistant
Once credentials and connections are set up, save and activate the workflow. Then test via a webhook call.
A. Using cURL or Postman
1. Get the Webhook URL
   - On the When Chat Message Received node, copy the generated webhook URL (e.g., `https://<your-n8n-domain>/webhook/<uuid>`).
2. Send a Test Request
   - The request body must be:

     ```json
     {
       "messages": [
         { "role": "user", "content": "List all tables in my database." }
       ]
     }
     ```

   - Example cURL (replace `<webhook-url>`):

     ```shell
     curl -X POST <webhook-url> \
       -H "Content-Type: application/json" \
       -d '{ "messages": [ { "role": "user", "content": "List all tables in my database." } ] }'
     ```

3. Inspect the Response
   - You should receive a JSON object with an “assistant” message listing all `schema.table` pairs.
B. Sample Queries & Expected Responses
- “List all tables.”
  - Agent calls Get DB Schema and Tables List.
  - Example response:

    ```
    Here are the tables in your database (schema.table):
    - public.users
    - public.orders
    - sales.transactions
    - inventory.products
    ```

- “What columns are in public.users?”
  - Agent calls Get Table Definition with `schema_name = "public"`, `table_name = "users"`.
  - Example response:

    ```
    Columns in public.users:
    • id (integer, not nullable, primary key)
    • email (text, not nullable, unique)
    • name (text, nullable)
    • created_at (timestamp, not nullable, default now())
    • is_active (boolean, not nullable, default true)
    ```

- “How many rows are in the orders table?”
  - Agent generates SQL:

    ```sql
    SELECT COUNT(*) AS total_orders FROM public.orders;
    ```

  - If the result is `{"total_orders": 3450}`, the response will be: “There are 3,450 rows in the public.orders table.”
- “Show total sales by month for the last year.”
  - Assuming `sales.transactions` has `amount` and `created_at` columns, the agent runs:

    ```sql
    SELECT DATE_TRUNC('month', created_at) AS month,
           SUM(amount) AS total_sales
    FROM sales.transactions
    WHERE created_at >= NOW() - INTERVAL '1 year'
    GROUP BY month
    ORDER BY month;
    ```

  - Example response:

    ```
    Total sales by month (last year):
    • 2024-06: $12,450
    • 2024-07: $15,320
    • 2024-08: $9,785
    • …
    ```

- “Show 5 inactive users from public.users.”
  - SQL:

    ```sql
    SELECT * FROM public.users WHERE is_active = false LIMIT 5;
    ```

  - Example response:

    ```
     id  | email              | name       | created_at       | is_active
    -----+--------------------+------------+------------------+-----------
     102 | test1@domain.com   | John Doe   | 2024-11-05 07:45 | false
     158 | example@domain.org | Jane Smith | 2025-01-12 15:30 | false
    … (3 more rows)
    ```
Next Steps & Extensions
- Add Query Logging
  - Insert a Google Sheets or Airtable node after each Postgres Tool node to append:
    - Timestamp
    - User message
    - Generated SQL
    - Row count / execution time
  - This provides an audit trail of all queries.
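A Function node placed before the Sheets/Airtable append could assemble the log row like this sketch. The field names are illustrative, not a fixed schema:

```javascript
// Hypothetical audit-log row for a Google Sheets / Airtable append step.
// Field names are illustrative; adapt them to your sheet's columns.
function buildAuditRow(userMessage, generatedSql, rowCount, executionMs) {
  return {
    timestamp: new Date().toISOString(),
    user_message: userMessage,
    generated_sql: generatedSql,
    row_count: rowCount,
    execution_ms: executionMs,
  };
}
```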
- Restrict Destructive Commands
  - To enforce read-only access, add a small validation step (e.g., a Function node) before Execute SQL Query that scans the SQL string and rejects it if it contains `INSERT`, `UPDATE`, `DELETE`, `DROP`, or `ALTER`.
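One way to implement that check in a Function node is a keyword screen like the sketch below. It blanks out string literals first so a word like “drop” inside quoted data doesn’t trigger a false positive; the extra keywords (`TRUNCATE`, `GRANT`, `REVOKE`) go beyond the list above and are optional. A keyword filter is a heuristic, not a substitute for connecting with a read-only database role:

```javascript
// Heuristic read-only guard for AI-generated SQL. TRUNCATE/GRANT/REVOKE are
// additions beyond the keywords listed above; a read-only DB role remains
// the safer backstop.
const FORBIDDEN = /\b(INSERT|UPDATE|DELETE|DROP|ALTER|TRUNCATE|GRANT|REVOKE)\b/i;

function isReadOnly(sql) {
  // Blank out single-quoted string literals so quoted data can't trip the check.
  const stripped = sql.replace(/'(?:[^']|'')*'/g, "''");
  return !FORBIDDEN.test(stripped);
}
```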
- Enable Charting & Visualization
  - After Execute SQL Query, add a Function node that transforms result rows into a JSON structure for a chart library (e.g., Chart.js).
  - Return a small HTML snippet with an embedded chart via a Respond to Webhook node set to HTML mode.
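The Function node’s transform might look like this sketch, assuming the query returns `month` and `total_sales` columns (as in the earlier revenue example):

```javascript
// Turn SQL result rows into a Chart.js-style data object. The month and
// total_sales field names are assumptions based on the revenue query above.
function toChartData(rows) {
  return {
    labels: rows.map((r) => r.month),
    datasets: [
      {
        label: "Total sales",
        // Postgres drivers often return numerics as strings, so coerce here.
        data: rows.map((r) => Number(r.total_sales)),
      },
    ],
  };
}
```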
- Role-Based Access & Multi-Tenant Support
  - Capture an API key or user ID in the incoming request.
  - Modify the SQL or apply Postgres row-level security based on that user’s access rights.
  - This ensures users only see data they’re permitted to view.
- Integrate with Your Front End
  - Build a simple chat UI (React, Vue, or plain HTML/JS) that sends user messages to the webhook, displays AI Agent responses, and supports streaming if you later enable streaming responses.
  - A front end can also offer optional SQL previews, letting power users inspect or tweak generated SQL before execution.
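A minimal browser-side call might look like the following, reusing the request body shape from the testing section. The webhook URL is a placeholder you would replace with your own:

```javascript
// Minimal front-end call to the chat webhook. The webhookUrl argument is a
// placeholder; the messages payload shape matches the testing section above.
async function askAssistant(webhookUrl, text) {
  const response = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      messages: [{ role: "user", content: text }],
    }),
  });
  if (!response.ok) throw new Error(`Webhook returned ${response.status}`);
  return response.json();
}
```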
Conclusion
You now have a complete Postgres Database Chat Assistant in n8n that:
- Receives natural-language queries via a webhook
- Uses an AI Agent to interpret intent and call the correct tool
- Runs safe, read-only SQL against your Postgres database
- Returns friendly, formatted answers to users
- Maintains chat history for follow-up context
This low-code approach empowers non-technical stakeholders to explore and analyze your Postgres data without writing SQL. From data discovery to ad-hoc reporting, your team can get insights in seconds—conversationally.
Feel free to extend this foundation with logging, visualization, multi-user controls, or any integration you need. The possibilities are limitless. Happy querying!