
PostgreSQL MCP Server: AI-Powered Database Queries & Analysis

Connect PostgreSQL to AI assistants. Learn how to query databases, explore schemas, and analyze data using natural language with Claude, Cursor, and more.

Rajesh Praharaj

Sep 7, 2025 · Updated Dec 28, 2025


TL;DR - PostgreSQL MCP Quick Start

Talk to your database in plain English - Generate and execute SQL queries with AI.

🆕 2025: PostgreSQL MCP is now part of the Agentic AI Foundation. Google’s MCP Toolbox and MindsDB offer enhanced database MCP features! For an introduction to MCP, see the MCP Introduction guide.

Quick Setup:

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@localhost:5432/mydb"
      ]
    }
  }
}

What you can do:

  • 🔍 Query: Execute SELECT queries via natural language
  • 📊 Analyze: Get insights from your data
  • 📋 Explore: Discover tables, columns, relationships
  • 📈 Report: Generate formatted reports

Example conversation:

You: Show me the top 10 customers by total order value

Claude: I'll query your database...

        | Customer       | Total Orders | Total Value |
        |----------------|--------------|-------------|
        | Acme Corp      | 47           | $125,430    |
        | TechStart Inc  | 32           | $98,200     |
        | Global Ltd     | 28           | $87,650     |
        ...

💡 Safety: Uses read-only mode by default. Your data is protected from accidental modifications.


Why Natural Language Database Access?

SQL is powerful but requires specific knowledge. PostgreSQL MCP bridges the gap:

| Traditional Approach     | With PostgreSQL MCP               |
|--------------------------|-----------------------------------|
| Know exact table names   | "What tables have customer data?" |
| Write complex JOINs      | "Show orders with customer names" |
| Remember SQL syntax      | Describe what you want in English |
| Debug query errors       | AI explains and fixes issues      |
| Format results manually  | Get markdown tables automatically |

Use Cases

| Who              | How They Use It                         |
|------------------|-----------------------------------------|
| Analysts         | Generate reports without SQL knowledge  |
| Developers       | Quick data exploration during debugging |
| Product Managers | Pull metrics for stakeholder meetings   |
| Support Teams    | Look up customer information quickly    |
| Data Scientists  | Exploratory data analysis               |

For more productivity tips, see the AI for Everyday Productivity guide.


Installation & Configuration

Prerequisites

  • Node.js v18+ (check with node --version)
  • PostgreSQL database (local or remote)
  • MCP-compatible client (Claude Desktop, ChatGPT, Cursor, Copilot, etc.)

💡 2025 Alternatives: Consider Google’s MCP Toolbox for managed solutions or MindsDB MCP for federated database queries.

Connection String Format

PostgreSQL connection strings follow this pattern:

postgresql://[user]:[password]@[host]:[port]/[database]?[options]

Examples:

# Local database
postgresql://postgres:mypassword@localhost:5432/myapp

# Remote database
postgresql://appuser:secret@db.example.com:5432/production

# With SSL required
postgresql://user:pass@host:5432/db?sslmode=require

# Multiple options
postgresql://user:pass@host:5432/db?sslmode=require&connect_timeout=10
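If you want to sanity-check a connection string before pasting it into your config, the standard library can parse it for you. A quick sketch (the DSN below reuses the example values from above; this is an illustration, not part of the MCP server itself):

```python
from urllib.parse import urlparse, parse_qs

def check_dsn(dsn: str) -> dict:
    """Parse a PostgreSQL connection string into its components."""
    parts = urlparse(dsn)
    if parts.scheme not in ("postgresql", "postgres"):
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,          # default PostgreSQL port
        "database": parts.path.lstrip("/"),  # path component minus leading slash
        "options": parse_qs(parts.query),    # e.g. {"sslmode": ["require"]}
    }

info = check_dsn("postgresql://appuser:secret@db.example.com:5432/production?sslmode=require")
print(info["host"], info["port"], info["database"])
# db.example.com 5432 production
```

Running this before editing your config catches the most common mistakes (wrong scheme, missing database name, typo'd options) without touching the database.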

Claude Desktop Setup

Add to claude_desktop_config.json:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:password@localhost:5432/mydb"
      ]
    }
  }
}

Using Environment Variables

For better security, use environment variables:

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres"
      ],
      "env": {
        "DATABASE_URL": "postgresql://user:password@localhost:5432/mydb"
      }
    }
  }
}

Multiple Databases

Configure separate servers for different databases:

{
  "mcpServers": {
    "postgres-dev": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": {
        "DATABASE_URL": "postgresql://dev:devpass@localhost:5432/dev_db"
      }
    },
    "postgres-analytics": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres"],
      "env": {
        "DATABASE_URL": "postgresql://analytics:pass@analytics.example.com:5432/warehouse"
      }
    }
  }
}

Cursor Setup

Add to your project’s .cursor/mcp.json:

{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://localhost:5432/myapp"
      ]
    }
  }
}

Verify Installation

Restart your AI client and test:

You: What tables are in the database?

Claude: I found the following tables:
        
        | Table Name | Row Count (approx) |
        |------------|-------------------|
        | users      | 15,420            |
        | orders     | 89,312            |
        | products   | 2,847             |
        | categories | 45                |
        | reviews    | 156,089           |

Available Tools

The PostgreSQL MCP server provides these capabilities:

Schema Inspection

| Tool           | Description                    | Example Prompt                        |
|----------------|--------------------------------|---------------------------------------|
| list_tables    | List all tables in the database| "What tables exist?"                  |
| describe_table | Show columns and types         | "Describe the users table"            |
| get_schema     | Full database schema           | "Show the complete database schema"   |

Query Execution

| Tool  | Description            | Example Prompt          |
|-------|------------------------|-------------------------|
| query | Execute SELECT queries | "Show all active users" |

Example Schema Output

You: Describe the orders table

Claude: **Table: orders**
        
        | Column       | Type         | Nullable | Default          |
        |--------------|--------------|----------|------------------|
        | id           | integer      | NO       | nextval(...)     |
        | customer_id  | integer      | NO       |                  |
        | order_date   | timestamp    | NO       | CURRENT_TIMESTAMP|
        | status       | varchar(20)  | NO       | 'pending'        |
        | total_amount | decimal(10,2)| NO       |                  |
        | notes        | text         | YES      |                  |
        
        **Indexes:**
        - PRIMARY KEY (id)
        - INDEX idx_orders_customer (customer_id)
        - INDEX idx_orders_date (order_date)
        
        **Foreign Keys:**
        - customer_id → customers(id)

Query Examples

Basic Queries

You: Show me all users who signed up this month

Claude: I'll query users registered in December 2024:

        SELECT * FROM users 
        WHERE created_at >= '2024-12-01' 
        ORDER BY created_at DESC;
        
        | ID   | Name          | Email                | Signed Up  |
        |------|---------------|----------------------|------------|
        | 1542 | Alice Johnson | alice@example.com    | Dec 23     |
        | 1541 | Bob Smith     | bob@company.com      | Dec 22     |
        | 1540 | Carol White   | carol@startup.io     | Dec 22     |
        ...
        
        Found 47 new users this month.

Aggregations

You: What's our total revenue per month for this year?

Claude: Here's your monthly revenue for 2024:

        SELECT 
          DATE_TRUNC('month', order_date) as month,
          COUNT(*) as order_count,
          SUM(total_amount) as revenue
        FROM orders
        WHERE order_date >= '2024-01-01'
        GROUP BY DATE_TRUNC('month', order_date)
        ORDER BY month;
        
        | Month    | Orders | Revenue     |
        |----------|--------|-------------|
        | Jan 2024 | 1,245  | $98,432.50  |
        | Feb 2024 | 1,389  | $112,891.00 |
        | Mar 2024 | 1,567  | $134,250.75 |
        ...
        
        **Summary:** Revenue is up 12% YoY.

JOINs and Relationships

You: Show me orders from the last week with customer and product details

Claude: I'll join the relevant tables:

        SELECT 
          o.id as order_id,
          c.name as customer,
          o.order_date,
          p.name as product,
          oi.quantity,
          oi.unit_price
        FROM orders o
        JOIN customers c ON o.customer_id = c.id
        JOIN order_items oi ON o.id = oi.order_id
        JOIN products p ON oi.product_id = p.id
        WHERE o.order_date >= CURRENT_DATE - INTERVAL '7 days'
        ORDER BY o.order_date DESC;
        
        | Order | Customer    | Date  | Product        | Qty | Price  |
        |-------|-------------|-------|----------------|-----|--------|
        | 8921  | Acme Corp   | Dec 23| Widget Pro     | 5   | $49.99 |
        | 8921  | Acme Corp   | Dec 23| Gadget Plus    | 2   | $79.99 |
        | 8920  | TechStart   | Dec 22| Widget Basic   | 10  | $29.99 |
        ...

Data Analysis Workflows

Workflow 1: Ad-Hoc Reporting

Generate reports without extensive SQL knowledge. For more on AI-powered workflows, see the AI-Powered Workflows guide.

You: I need a report of our top 10 selling products this quarter, 
     including units sold, revenue, and average order size

Claude: Generating Q4 2024 product performance report...

        **Top 10 Products - Q4 2024**
        
        | Rank | Product           | Units | Revenue    | Avg Order |
        |------|-------------------|-------|------------|-----------|
        | 1    | Widget Pro        | 2,847 | $142,350   | 3.2 units |
        | 2    | Gadget Plus       | 2,134 | $128,040   | 2.8 units |
        | 3    | Super Service     | 1,989 | $99,450    | 1.0 units |
        | 4    | Basic Bundle      | 1,756 | $87,800    | 4.1 units |
        ...
        
        **Insights:**
        - Widget Pro dominates with 18% of total revenue
        - Service products have lower average order qty (1.0)
        - Bundle products show highest items per order

Workflow 2: Data Quality Check

You: Check my database for data quality issues:
     - NULL values where shouldn't be
     - Duplicate records
     - Orphaned foreign keys

Claude: Running data quality analysis...

        **Data Quality Report**
        
        ⚠️ **NULL Values Found:**
        | Table    | Column    | NULL Count | % of Total |
        |----------|-----------|------------|------------|
        | orders   | ship_date | 234        | 2.1%       |
        | customers| phone     | 1,847      | 12.3%      |
        
        ❌ **Duplicate Records:**
        | Table    | Duplicates | Based On           |
        |----------|------------|-------------------|
        | products | 3          | name + category   |
        | customers| 12         | email             |
        
        🔗 **Orphaned Records:**
        | Table       | Orphans | Missing FK    |
        |-------------|---------|---------------|
        | order_items | 7       | product_id    |
        | reviews     | 23      | customer_id   |
        
        Would you like SQL to fix these issues?

Workflow 3: Trend Analysis

You: Analyze our customer signup trends. 
     Show weekly signups for the last 3 months with week-over-week change

Claude: Analyzing signup trends...

        **Weekly Signup Trends (Last 3 Months)**
        
        | Week       | Signups | Change  | Trend |
        |------------|---------|---------|-------|
        | Dec 16-22  | 342     | +12.5%  | 📈    |
        | Dec 9-15   | 304     | +8.2%   | 📈    |
        | Dec 2-8    | 281     | -3.1%   | 📉    |
        | Nov 25-Dec1| 290     | +15.0%  | 📈    |
        | Nov 18-24  | 252     | +4.1%   | 📈    |
        ...
        
        **Key Insights:**
        - Average weekly signups: 287
        - Best week: Dec 16-22 (342 signups)
        - Growth trend: Positive overall (+8% avg)
        - Notable spike: Black Friday week (+15%)

Workflow 4: Customer Segmentation

You: Segment our customers by purchase behavior:
     - One-time buyers
     - Repeat customers (2-5 orders)
     - Loyal customers (6+ orders)
     Include average order value for each segment

Claude: Segmenting customers by purchase behavior...

        **Customer Segmentation Analysis**
        
        | Segment          | Customers | % Total | Avg Orders | Avg Value |
        |------------------|-----------|---------|------------|-----------|
        | One-time buyers  | 8,432     | 58%     | 1.0        | $45.20    |
        | Repeat (2-5)     | 4,891     | 34%     | 3.2        | $67.80    |
        | Loyal (6+)       | 1,156     | 8%      | 12.4       | $89.50    |
        
        **Revenue Contribution:**
        - One-time: $380,886 (25%)
        - Repeat: $1,065,021 (45%)  
        - Loyal: $1,283,334 (30%)
        
        **Insight:** Loyal customers are 8% of users but 30% of revenue.
        Focus on converting one-time buyers to repeat customers.

Example Prompts

Schema Exploration

# Basic structure
"What tables are in this database?"
"Describe the structure of the orders table"
"What columns does the users table have?"

# Relationships
"Show me all foreign key relationships"
"How are customers and orders related?"
"What tables reference the products table?"

# Indexes
"What indexes exist on the orders table?"
"Are there indexes on commonly queried columns?"

Simple Queries

# Filtering
"Show users who signed up in the last 7 days"
"Find orders with status 'pending'"
"Get products with price over $100"

# Sorting
"List the 10 most recent orders"
"Show products sorted by popularity"
"Get oldest unresolved support tickets"

# Counting
"How many customers do we have?"
"Count orders per status"
"How many products in each category?"

Complex Analysis

# Aggregations
"Calculate average order value by customer segment"
"What's the monthly recurring revenue trend?"
"Show revenue by product category"

# Comparisons
"Compare this month's sales to last month"
"Which products perform better on weekends?"
"Compare customer acquisition by source"

# Cohort analysis
"Show retention rates by signup month"
"How does first-purchase value affect lifetime value?"
"Track customer behavior by acquisition channel"

Data Investigation

# Finding issues
"Find orders with negative amounts"
"Are there customers with no email addresses?"
"Find products that have never been ordered"

# Debugging
"Show the last 10 failed transactions"
"What was the sequence of events for order #12345?"
"Find all orders affected by the bug on December 15th"

# Auditing
"Who modified the pricing table this week?"
"Show login attempts for user admin@company.com"
"List all records changed in the last hour"

Report Generation

# Regular reports
"Generate the daily sales summary"
"Create a weekly inventory report"
"Prepare the monthly executive dashboard data"

# Custom reports
"Build a customer lifetime value report"
"Create a cohort analysis by signup source"
"Generate an inventory aging report"

# Export-ready
"Format results as CSV for export"
"Create a pivot table of sales by region and product"
"Generate data for a chart showing daily active users"

Read-Only Safety

PostgreSQL MCP operates in read-only mode by default for safety.

What’s Allowed

| ✅ Allowed         | ❌ Blocked         |
|--------------------|-------------------|
| SELECT queries     | INSERT statements |
| Schema inspection  | UPDATE statements |
| EXPLAIN queries    | DELETE statements |
| View definitions   | DROP commands     |
| Index information  | CREATE commands   |
|                    | TRUNCATE          |
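The allowed/blocked split above can be approximated by classifying a statement by its first keyword. This is a naive sketch of the idea for illustration only; it is not the server's actual implementation, and real protection should come from the database itself (a read-only user or transaction), since tricks like a WITH clause wrapping an INSERT would slip past a string check:

```python
# Statement types blocked in read-only mode (per the table above)
BLOCKED = {"insert", "update", "delete", "drop", "create", "truncate", "alter", "grant"}

def first_keyword(sql: str) -> str:
    """Return the first SQL keyword, skipping blank lines and -- comments."""
    for line in sql.splitlines():
        line = line.strip()
        if line and not line.startswith("--"):
            return line.split()[0].lower()
    return ""

def is_read_only(sql: str) -> bool:
    """True if the statement's leading keyword is not a write operation."""
    return first_keyword(sql) not in BLOCKED

print(is_read_only("SELECT * FROM orders"))   # True
print(is_read_only("DELETE FROM orders"))     # False
```

Treat this as a first line of defense at best; the read-only database user shown later in this guide is what actually guarantees safety.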

Why Read-Only?

You: Delete all orders from 2020

Claude: I can't execute DELETE statements. The PostgreSQL MCP 
        server operates in read-only mode to protect your data.
        
        If you need to delete records, please use:
        1. Your database admin tool (pgAdmin, DBeaver)
        2. Direct psql connection with write permissions
        3. Your application's admin interface
        
        I can help you identify which records would be affected:
        
        SELECT COUNT(*) FROM orders 
        WHERE order_date < '2021-01-01';
        
        Result: 12,847 orders would be affected.

If You Need Write Access

For advanced use cases requiring writes, you can:

  1. Use a dedicated write server (not recommended for safety)
  2. Generate SQL for manual execution
  3. Use application-level tools with proper safeguards

You: I need to update customer emails

Claude: I can't execute UPDATE directly, but I can generate 
        the SQL for you to review and run:
        
        -- Update customer emails (REVIEW BEFORE RUNNING)
        UPDATE customers 
        SET email = LOWER(TRIM(email))
        WHERE email != LOWER(TRIM(email));
        
        -- Affected rows estimate:
        -- SELECT COUNT(*) = 234 customers would be updated
        
        Run this in your database tool after reviewing.

Performance & Optimization

Query Optimization

You: My query is running slowly. Can you optimize it?

Claude: Let me analyze your query performance...

        **Original Query:**
        SELECT * FROM orders WHERE customer_id = 123;
        -- Execution time: 2.4 seconds
        
        **Analysis:**
        EXPLAIN ANALYZE shows:
        - Sequential scan on orders (slow)
        - No index on customer_id
        
        **Recommendations:**
        1. Create index:
           CREATE INDEX idx_orders_customer ON orders(customer_id);
        
        2. Only select needed columns:
           SELECT id, order_date, total FROM orders...
        
        3. Add LIMIT for previews:
           ... LIMIT 100;
        
        After index, estimated time: ~10ms

Best Practices

| Practice                       | Why                              |
|--------------------------------|----------------------------------|
| Use LIMIT for exploration      | Prevents loading millions of rows|
| Select specific columns        | Faster than SELECT *             |
| Add indexes for common queries | Speeds up filtered queries       |
| Use read replicas              | Avoids impacting production      |
| Set query timeouts             | Prevents runaway queries         |
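For the "use LIMIT" habit, a tiny helper can cap exploratory queries that lack one. This is a naive sketch for illustration (it only inspects the last two words of the statement, so subqueries and comments can fool it); the function name and default cap are my own:

```python
def with_limit(sql: str, cap: int = 100) -> str:
    """Append a LIMIT clause unless the query already ends with one."""
    stripped = sql.rstrip().rstrip(";")
    # Look at the last two words so "LIMIT 100" at the tail is detected
    if "limit" in [w.lower() for w in stripped.split()[-2:]]:
        return sql
    return f"{stripped} LIMIT {cap};"

print(with_limit("SELECT * FROM orders"))
# SELECT * FROM orders LIMIT 100;
```

For production workloads, proper pagination and the statement_timeout setting below are the more robust tools.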

Connection Settings

Add timeout options for safety:

{
  "env": {
    "DATABASE_URL": "postgresql://user:pass@host:5432/db?connect_timeout=10&statement_timeout=30000"
  }
}

| Option                  | Description                                                   |
|-------------------------|---------------------------------------------------------------|
| connect_timeout=10      | Fail if a connection can't be established within 10 seconds   |
| statement_timeout=30000 | Abort queries running longer than 30 seconds (value in ms)    |

Security Best Practices

Credential Security

| ✅ Do                        | ❌ Don't                     |
|------------------------------|------------------------------|
| Use environment variables    | Hardcode passwords in config |
| Create a read-only DB user   | Use admin credentials        |
| Restrict DB permissions      | Grant full access            |
| Use a connection pooler      | Expose the DB directly       |
| Rotate credentials regularly | Share credentials            |
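Connection strings have a way of ending up in logs and error messages, so it helps to mask the password before printing one. A small helper, assuming the standard postgresql:// URL form (the function name is my own):

```python
import re

def redact_dsn(dsn: str) -> str:
    """Mask the password in a postgresql:// connection URL before logging."""
    # Matches "scheme://user:password@" and keeps everything but the password
    return re.sub(r"(postgres(?:ql)?://[^:/@]+):[^@]+@", r"\1:***@", dsn)

print(redact_dsn("postgresql://appuser:secret@db.example.com:5432/production"))
# postgresql://appuser:***@db.example.com:5432/production
```

Strings without embedded credentials pass through unchanged, so the helper is safe to apply unconditionally in logging code.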

For more on AI security best practices, see the Understanding AI Safety, Ethics, and Limitations guide.

Creating a Read-Only User

-- Create read-only user for MCP
CREATE USER mcp_readonly PASSWORD 'secure_password';

-- Grant connect permission
GRANT CONNECT ON DATABASE myapp TO mcp_readonly;

-- Grant read access to all tables
GRANT SELECT ON ALL TABLES IN SCHEMA public TO mcp_readonly;

-- Grant for future tables
ALTER DEFAULT PRIVILEGES IN SCHEMA public
GRANT SELECT ON TABLES TO mcp_readonly;

Connection Security

{
  "env": {
    "DATABASE_URL": "postgresql://mcp_readonly:pass@host:5432/db?sslmode=require"
  }
}

| SSL Mode    | Description                          |
|-------------|--------------------------------------|
| disable     | No SSL (not recommended)             |
| require     | SSL required                         |
| verify-ca   | SSL + verify certificate             |
| verify-full | SSL + verify certificate + hostname  |

Troubleshooting

Issue: “Connection refused”

Symptoms: Can’t connect to database

Solutions:

| Cause                  | Solution                  |
|------------------------|---------------------------|
| Wrong host/port        | Verify connection details |
| Firewall blocking      | Allow port 5432           |
| PostgreSQL not running | Start the service         |
| Wrong credentials      | Check username/password   |

Issue: “Authentication failed”

Symptoms: “password authentication failed”

Solutions:

| Cause                | Solution                         |
|----------------------|----------------------------------|
| Wrong password       | Verify credentials               |
| User doesn't exist   | Create the database user         |
| No login privilege   | Run ALTER ROLE mcp_readonly LOGIN|
| pg_hba.conf blocking | Check authentication config      |

Issue: “Query timeout”

Symptoms: Queries taking too long

Solutions:

You: My query keeps timing out

Claude: Let's optimize:

        1. Check query plan:
           EXPLAIN ANALYZE [your query]
        
        2. Add missing indexes:
           CREATE INDEX ON table(column);
        
        3. Add LIMIT for testing:
           SELECT * FROM large_table LIMIT 100;
        
        4. Increase timeout (if needed):
           ?statement_timeout=60000

Issue: “SSL connection required”

Symptoms: Connection fails with SSL error

Solutions:

{
  "env": {
    "DATABASE_URL": "postgresql://user:pass@host:5432/db?sslmode=require"
  }
}

Combining with Other MCPs

PostgreSQL + Filesystem

Export query results to files:

You: Query the monthly sales data and save it as a CSV 
     in my reports folder

Claude: [Uses PostgreSQL to query data]
        [Uses Filesystem to write CSV]
        
        Exported 12 months of sales data to:
        /Users/me/reports/sales_2024.csv
        
        File includes: month, revenue, orders, customers

PostgreSQL + Memory

Remember database context:

You: Remember this database schema for future questions

Claude: [Uses PostgreSQL to inspect schema]
        [Uses Memory to store context]
        
        I've memorized your database structure:
        - 15 tables
        - Key relationships
        - Common query patterns
        
        I'll reference this in our future conversations.

PostgreSQL + Slack

Share reports with team:

You: Run the weekly sales report and post it to #sales-team

Claude: [Uses PostgreSQL to generate report]
        [Uses Slack to post message]
        
        Posted weekly sales report to #sales-team:
        - Total revenue: $47,230
        - vs last week: +8.3%
        - Top product: Widget Pro

| Server         | Complements PostgreSQL By…       |
|----------------|----------------------------------|
| Filesystem MCP | Saving query results             |
| Memory MCP     | Remembering analysis context     |
| Slack MCP      | Sharing reports with team        |
| GitHub MCP     | Version-controlling SQL queries  |

Summary

The PostgreSQL MCP Server enables natural language database access:

  • Read-only by default for safety
  • Natural language queries without SQL knowledge
  • Schema exploration to understand your data
  • Complex analysis with JOINs and aggregations
  • Formatted results in markdown tables
  • Universal support - Works with Claude, ChatGPT, Copilot, Gemini

2025 Ecosystem:

  • PostgreSQL MCP - Core reference server (Agentic AI Foundation)
  • Google MCP Toolbox - Managed database MCP with connection pooling
  • MindsDB MCP - Federated queries across multiple databases
  • pg_ai_query - Direct PostgreSQL extension for AI queries

Best use cases:

  • Ad-hoc reporting and analysis
  • Data exploration and discovery
  • Quick metrics lookups
  • Data quality investigations
  • Generating SQL for review

Security checklist:

  • ☐ Create dedicated read-only database user
  • ☐ Use environment variables for credentials
  • ☐ Enable SSL for remote connections
  • ☐ Set query timeouts
  • ☐ Never commit credentials to version control

Next: Learn about Notion MCP Server → for workspace automation.


Questions about PostgreSQL MCP? Check the source code on GitHub or join the MCP Discord.
