Testing Guide

This document describes the test coverage for the MCP server and Natural Language Agents, with a focus on multi-LLM provider support.

Test Structure

Configuration Tests (internal/config/config_test.go)

Comprehensive tests for configuration loading and provider selection:

  • Provider Configuration: Tests for both Anthropic and Ollama providers
  • Validation: Tests for missing required configuration (API keys, models, connection strings)
  • Configuration Priority: Tests for CLI flags > Environment variables > Config file > Defaults
  • Invalid Provider: Tests for unsupported LLM providers
  • Partial Configuration: Tests for config files with only some values set

Coverage: 80.7% of config package
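
As an illustration, the configuration-priority behavior can be exercised with t.Setenv and a temporary config file. This is only a sketch: the Load function, the llm_provider key, and the LLMProvider field are assumed names, not necessarily the package's real API.

package config

import (
    "os"
    "path/filepath"
    "testing"
)

// Sketch only: Load, the YAML key, and the LLMProvider field are illustrative.
func TestConfigPriority_EnvOverridesFile(t *testing.T) {
    // Write a config file that selects the Anthropic provider.
    path := filepath.Join(t.TempDir(), "config.yaml")
    if err := os.WriteFile(path, []byte("llm_provider: anthropic\n"), 0o644); err != nil {
        t.Fatal(err)
    }

    // An environment variable should take precedence over the file.
    t.Setenv("LLM_PROVIDER", "ollama")

    cfg, err := Load(path) // hypothetical loader
    if err != nil {
        t.Fatalf("Load: %v", err)
    }
    if cfg.LLMProvider != "ollama" {
        t.Errorf("provider = %q, want %q (env var should override config file)", cfg.LLMProvider, "ollama")
    }
}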

LLM Client Tests (internal/llm/client_test.go)

Anthropic Provider Tests

  • Basic client creation and configuration
  • Successful SQL conversion with mocked API responses
  • API error handling (400, 500 status codes)
  • Empty response handling
  • SQL cleaning (removing markdown, comments, semicolons)
  • Complex query handling with JOINs, GROUP BY, ORDER BY

Ollama Provider Tests

  • Client creation with Ollama provider
  • Configuration validation (requires baseURL and model)
  • Successful SQL conversion using OpenAI-compatible API format
  • API error handling (404, 500 status codes)
  • Empty response handling (no choices in response)
  • SQL cleaning for Ollama responses
  • Complex query handling
  • Unsupported provider error handling

Coverage: 88.7% of llm package
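
The SQL-cleaning cases lend themselves to a table-driven test. The sketch below assumes a cleanSQL helper; the real function name and exact cleaning rules in internal/llm may differ.

package llm

import "testing"

// Sketch: cleanSQL is an assumed helper name; the real cleaning rules may differ.
func TestCleanSQL_Sketch(t *testing.T) {
    cases := []struct {
        name, in, want string
    }{
        {"markdown fences", "```sql\nSELECT 1\n```", "SELECT 1"},
        {"trailing semicolon", "SELECT 1;", "SELECT 1"},
        {"line comment", "-- pick a constant\nSELECT 1", "SELECT 1"},
    }
    for _, tc := range cases {
        t.Run(tc.name, func(t *testing.T) {
            if got := cleanSQL(tc.in); got != tc.want {
                t.Errorf("cleanSQL(%q) = %q, want %q", tc.in, got, tc.want)
            }
        })
    }
}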

Model Management Tests (cmd/pgedge-pg-mcp-svr/models_test.go)

Tests for Ollama model download functionality:

  • Successful Model Pull: Tests streaming download with progress updates
  • API Errors: Tests 404 and other API error responses
  • Network Errors: Tests connection failures
  • Default URL: Tests that default Ollama URL (http://localhost:11434) is used when not specified
  • Invalid JSON: Tests handling of malformed API responses
  • Streaming Progress: Tests progress bar updates during download

All tests use mocked HTTP servers, so they don't require an actual Ollama installation or large model downloads.

Coverage: 11.8% of cmd/pgedge-pg-mcp-svr (focused on testable units)
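
A minimal sketch of one such test, assuming a pullModel(baseURL, model) helper and newline-delimited JSON progress lines similar to Ollama's pull API:

package main

import (
    "fmt"
    "net/http"
    "net/http/httptest"
    "testing"
)

// Sketch: pullModel and its signature are assumed; the JSON lines approximate
// Ollama's streaming progress format.
func TestPullModel_Sketch(t *testing.T) {
    server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        // Stream a few progress updates, then a final status.
        fmt.Fprintln(w, `{"status":"pulling manifest"}`)
        fmt.Fprintln(w, `{"status":"downloading","total":1000,"completed":500}`)
        fmt.Fprintln(w, `{"status":"success"}`)
    }))
    defer server.Close()

    if err := pullModel(server.URL, "tinyllama:1.1b"); err != nil {
        t.Fatalf("pullModel: %v", err)
    }
}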

Running Tests

Run All Tests

go test ./...

Run Specific Package Tests

# Configuration tests
go test ./internal/config -v

# LLM client tests
go test ./internal/llm -v

# Model management tests
go test ./cmd/pgedge-pg-mcp-svr -v

Run Ollama-Specific Tests

# All Ollama tests
go test ./internal/llm -v -run ".*Ollama"

# Model pull tests
go test ./cmd/pgedge-pg-mcp-svr -v -run "TestPullModel"

Run with Coverage

go test ./... -coverprofile=coverage.out
go tool cover -html=coverage.out  # View in browser
go tool cover -func=coverage.out  # View in terminal

Test Design Philosophy

Mock-Based Testing

All LLM and model-pull tests use httptest.NewServer() to create mock HTTP servers. This approach has several benefits:

  • No External Dependencies: Tests run without requiring an Ollama installation or actual API keys
  • Fast Execution: No network requests or large downloads
  • Deterministic: Tests produce consistent results regardless of external service availability
  • CI/CD Friendly: No need to download multi-gigabyte models in CI pipelines
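
In miniature, the pattern looks like the following. The canned Anthropic-style response body and the Client fields are assumptions for illustration; the real tests follow the same shape with provider-specific payloads.

package llm

import (
    "fmt"
    "net/http"
    "net/http/httptest"
    "testing"
)

// Sketch: the response body, Client fields, and ConvertNLToSQL behavior with a
// custom baseURL are assumptions for illustration.
func TestConvertNLToSQL_MockPattern(t *testing.T) {
    // The handler stands in for the real LLM API and returns a canned response.
    server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        w.Header().Set("Content-Type", "application/json")
        fmt.Fprint(w, `{"content":[{"type":"text","text":"SELECT 1"}]}`)
    }))
    defer server.Close()

    // Point the client at the mock server instead of the real API endpoint.
    client := &Client{provider: "anthropic", apiKey: "test-key", baseURL: server.URL, model: "test-model"}
    sql, err := client.ConvertNLToSQL("return the number one", "")
    if err != nil {
        t.Fatalf("ConvertNLToSQL: %v", err)
    }
    if sql != "SELECT 1" {
        t.Errorf("sql = %q, want %q", sql, "SELECT 1")
    }
}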

Provider Abstraction Testing

Tests verify that:

  1. Both providers (Anthropic and Ollama) work through the same interface
  2. Configuration correctly routes to the appropriate provider
  3. Error handling is consistent across providers
  4. SQL cleaning works for both provider response formats
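
A hedged sketch of the routing check, assuming a NewClient(Config) constructor (the package may construct clients differently):

package llm

import "testing"

// Sketch: NewClient and the Config fields are illustrative names.
func TestNewClient_ProviderRouting(t *testing.T) {
    cases := []struct {
        provider string
        wantErr  bool
    }{
        {"anthropic", false},
        {"ollama", false},
        {"unsupported", true},
    }
    for _, tc := range cases {
        _, err := NewClient(Config{
            Provider: tc.provider,
            APIKey:   "test-key",
            BaseURL:  "http://localhost:11434",
            Model:    "test-model",
        })
        if (err != nil) != tc.wantErr {
            t.Errorf("provider %q: err = %v, wantErr %v", tc.provider, err, tc.wantErr)
        }
    }
}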

Edge Cases Covered

  • Missing configuration (API keys, models, URLs)
  • Network failures and timeouts
  • Invalid JSON responses
  • Empty or malformed API responses
  • SQL with markdown formatting, comments, and extra characters
  • Complex multi-line SQL queries

Integration Testing (Optional)

PostgreSQL Resource Integration Tests

Comprehensive integration tests for MCP resources are available in internal/resources/integration_test.go. These tests verify:

  • Version Compatibility: Tests against PostgreSQL 14-17
  • Query Execution: All resources execute successfully
  • NULL Handling: TOAST columns handle NULL values correctly
  • Response Structure: All resources return consistent JSON structure
  • Schema Changes: Version-specific behavior (e.g., pg_stat_bgwriter vs pg_stat_checkpointer)
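
The tests follow the usual skip-if-unconfigured pattern. The sketch below assumes the pgx driver and a simple version query; the actual integration tests may differ in detail.

package resources

import (
    "context"
    "os"
    "testing"

    "github.com/jackc/pgx/v5"
)

// Sketch: the pgx driver and the version query are assumptions.
func TestResources_IntegrationSketch(t *testing.T) {
    connStr := os.Getenv("TEST_PGEDGE_POSTGRES_CONNECTION_STRING")
    if connStr == "" {
        t.Skip("TEST_PGEDGE_POSTGRES_CONNECTION_STRING not set; skipping integration test")
    }

    ctx := context.Background()
    conn, err := pgx.Connect(ctx, connStr)
    if err != nil {
        t.Fatalf("connect: %v", err)
    }
    defer conn.Close(ctx)

    // Log the server version so results can be correlated with PostgreSQL 14-17.
    var version string
    if err := conn.QueryRow(ctx, "SHOW server_version").Scan(&version); err != nil {
        t.Fatalf("query version: %v", err)
    }
    t.Logf("Testing against PostgreSQL version %s", version)
}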

Running Resource Integration Tests

# 1. Set up PostgreSQL connection for tests
export TEST_PGEDGE_POSTGRES_CONNECTION_STRING="postgres://user:pass@localhost/dbname"

# 2. Run all integration tests
go test ./internal/resources -v -run "Integration"

# 3. Run specific compatibility tests
go test ./internal/resources -v -run "TestVersionAwareResources_Compatibility"

# 4. Test NULL handling
go test ./internal/resources -v -run "TestNullHandling_TOAST"

# 5. Test response structure
go test ./internal/resources -v -run "TestResourceResponseStructure"

Testing Against Multiple PostgreSQL Versions

To ensure compatibility, run the suite against each major version in turn. Each container binds port 5432, so stop and remove the previous container before starting the next:

# PostgreSQL 14
docker run -d -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:14
export TEST_PGEDGE_POSTGRES_CONNECTION_STRING="postgres://postgres:test@localhost/postgres"
go test ./internal/resources -v -run "Integration"

# PostgreSQL 15
docker run -d -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:15
export TEST_PGEDGE_POSTGRES_CONNECTION_STRING="postgres://postgres:test@localhost/postgres"
go test ./internal/resources -v -run "Integration"

# PostgreSQL 16
docker run -d -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:16
export TEST_PGEDGE_POSTGRES_CONNECTION_STRING="postgres://postgres:test@localhost/postgres"
go test ./internal/resources -v -run "Integration"

# PostgreSQL 17
docker run -d -p 5432:5432 -e POSTGRES_PASSWORD=test postgres:17
export TEST_PGEDGE_POSTGRES_CONNECTION_STRING="postgres://postgres:test@localhost/postgres"
go test ./internal/resources -v -run "Integration"

What's Tested

The integration test suite covers:

  1. Resources:

    • pg://settings
    • pg://system_info
    • pg://stat/activity
    • pg://stat/replication
  2. Edge Cases:

    • Empty result sets
    • Missing or unavailable views

Expected Output

=== RUN   TestAllResources_Integration
Testing against PostgreSQL version 17
=== RUN   TestAllResources_Integration/pg://settings
    integration_test.go:XX: ✓ pg://settings: Successfully executed and returned valid JSON
=== RUN   TestAllResources_Integration/pg://stat/activity
    integration_test.go:XX: ✓ pg://stat/activity: Successfully executed and returned valid JSON
...
PASS

LLM Provider Integration Tests

For testing with actual Ollama models (not run in CI):

# 1. Install Ollama
# Visit https://ollama.ai/

# 2. Start Ollama
ollama serve

# 3. Pull a small model (optional, ~600MB)
ollama pull tinyllama:1.1b

# 4. Test with a real database connection
export LLM_PROVIDER="ollama"
export OLLAMA_MODEL="tinyllama:1.1b"
# Configure database connection via environment variables before starting
export PGHOST="localhost"
export PGPORT="5432"
export PGDATABASE="mydb"
export PGUSER="myuser"
export PGPASSWORD="mypass"
./bin/pgedge-postgres-mcp

Note: Integration tests with actual models are not automated in the test suite to avoid large downloads and external dependencies.

Test Coverage Summary

Package            Coverage   Focus Areas
internal/config    80.7%      Provider config, validation, priority
internal/llm       88.7%      Both providers, SQL conversion, error handling
internal/auth      57.3%      Token management
internal/tools     48.5%      MCP tools
internal/database  38.4%      DB connections
Overall            36.9%      Core business logic well-covered

Adding New Tests

For New LLM Providers

If adding a new provider (e.g., "openai"), add tests in internal/llm/client_test.go:

func TestConvertNLToSQL_NewProvider_Success(t *testing.T) {
    server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        // Mock provider's API response
    }))
    defer server.Close()

    client := &Client{
        provider: "newprovider",
        apiKey:   "test-key",
        baseURL:  server.URL,
        model:    "test-model",
    }

    result, err := client.ConvertNLToSQL("test query", "schema")
    // Assertions...
}

For Configuration Changes

Add tests in internal/config/config_test.go to verify:

  • New configuration fields load correctly
  • Validation catches missing required fields
  • Priority system works with new fields
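
For example, a validation test for a missing required field might look like the following sketch (Validate and the field names are illustrative):

package config

import "testing"

// Sketch: Validate, Config, and the field names are illustrative.
func TestValidate_MissingRequiredField(t *testing.T) {
    cfg := Config{LLMProvider: "ollama"} // required Ollama fields (BaseURL, Model) intentionally left empty
    if err := cfg.Validate(); err == nil {
        t.Error("expected a validation error for missing Ollama configuration, got nil")
    }
}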

Continuous Integration

All tests run automatically in CI/CD pipelines. The mock-based design ensures:

  • No external service dependencies
  • Fast execution (< 5 seconds for full suite)
  • No large downloads or storage requirements
  • Consistent results across environments