Mirror of https://github.com/coleam00/context-engineering-intro.git, synced 2025-12-17 09:45:23 +00:00

MCP Server Example with PRPs

This commit is contained in:
parent 73d08d2236
commit 8445f05b67
100
use-cases/mcp-server/.claude/commands/prp-mcp-create.md
Normal file
@@ -0,0 +1,100 @@
|
||||
---
name: "prp-mcp-create"
description: This command creates a comprehensive Product Requirement Prompt (PRP) for building Model Context Protocol (MCP) servers, referencing this codebase's patterns and mirroring its tool setups for the user's specific requirements.
Usage: /prp-mcp-create <server-name> "<description of the MCP server>"
Example usage: /prp-mcp-create weather-server "MCP server for weather data with API integration"
Example usage: /prp-mcp-create file-manager "MCP server mirroring task master mcp"
---

# Create MCP Server PRP
|
||||
|
||||
Create a comprehensive Product Requirement Prompt (PRP) for building Model Context Protocol (MCP) servers with authentication, database integration, and Cloudflare Workers deployment.
|
||||
|
||||
Before you start, ensure that you read these key files to understand the goal of the PRP:

PRPs/README.md

PRPs/templates/prp_mcp_base.md (this base PRP is already partially filled out based on the project structure; complete it for the user's specific MCP server use case)

## User's MCP use case: $ARGUMENTS
|
||||
|
||||
## Purpose
|
||||
|
||||
Generate context-rich PRPs specifically designed for MCP server development, using the proven patterns in this codebase, which is a scaffold of an MCP server setup the user can build upon, including GitHub OAuth and production-ready Cloudflare Workers deployment.

The existing tools are unlikely to be reused; new tools should be created and tailored specifically to the user's use case.
|
||||
|
||||
## Execution Process
|
||||
|
||||
1. **Research & Context Gathering**
|
||||
- Create clear todos and spawn subagents to search the codebase for similar features/patterns. Think hard and plan your approach
- Gather relevant documentation about MCP tools, resources, and authentication flows
- Research existing tool patterns to understand how to build the user's specified use case
|
||||
- Study existing integration patterns in the codebase
|
||||
|
||||
2. **Generate Comprehensive PRP**
|
||||
- Use the specialized `PRPs/templates/prp_mcp_base.md` template as the foundation
|
||||
- Customize the template with specific server requirements and functionality
|
||||
- Include all necessary context from the codebase patterns and ai_docs
|
||||
- Add specific validation loops for MCP server development
|
||||
- Include database integration patterns and security considerations
|
||||
|
||||
3. **Enhance with AI docs**
|
||||
- The user might have added docs in the PRPs/ai_docs/ directory that you should read
- If there are docs in PRPs/ai_docs/, review them and take them into context as you build the PRP
|
||||
|
||||
## Implementation Details
|
||||
|
||||
### PRP Structure for MCP Servers
|
||||
|
||||
The generated PRP uses the specialized template `PRPs/templates/prp_mcp_base.md` and includes:
|
||||
|
||||
- **Goal**: Clear description of the MCP server to be built with authentication and database integration
|
||||
- **Context**: All necessary documentation including PRPs/ai_docs/ references and existing codebase patterns
|
||||
- **Implementation Blueprint**: Step-by-step TypeScript tasks following Cloudflare Workers patterns
|
||||
- **Validation Loop**: Comprehensive MCP-specific testing from compilation to production deployment
|
||||
- **Security Considerations**: GitHub OAuth flows, database access patterns, and SQL injection protection
|
||||
|
||||
### Key Features
|
||||
|
||||
- **Context-Rich**: Includes all patterns and references using relative paths from this proven codebase
|
||||
- **Validation-Driven**: Multi-level validation from syntax to production deployment
|
||||
- **Security-First**: Built-in authentication and authorization patterns
|
||||
- **Production-Ready**: Cloudflare Workers deployment and monitoring
|
||||
|
||||
### Research Areas
|
||||
|
||||
1. **MCP Protocol Patterns**
|
||||
- Tool registration and validation
|
||||
- Resource serving and caching
|
||||
- Error handling and logging
|
||||
- Client communication patterns
|
||||
|
||||
2. **Authentication Integration**
|
||||
- GitHub OAuth implementation
|
||||
- User permission systems
|
||||
- Token management and validation
|
||||
- Session handling patterns
|
||||
|
||||
## Output
|
||||
|
||||
Creates a comprehensive PRP file in the PRPs/ directory with:
|
||||
|
||||
- All necessary context and code patterns
|
||||
- Step-by-step implementation tasks
|
||||
- Validation loops for MCP server development
|
||||
|
||||
## Validation
|
||||
|
||||
The command ensures:
|
||||
|
||||
- All referenced code patterns exist in the codebase
|
||||
- Documentation links are valid and accessible
|
||||
- Implementation tasks are specific and actionable
|
||||
- Validation loops are comprehensive and executable by Claude Code (IMPORTANT)
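
For example, a typical validation loop for this codebase might include commands like these (drawn from the development workflow in CLAUDE.md; the exact loop in a generated PRP may differ):

```bash
# Level 1: TypeScript compiles
npm run type-check

# Level 2: Local server starts (connect with MCP Inspector to exercise tools)
wrangler dev
npx @modelcontextprotocol/inspector@latest

# Level 3: Deployment configuration is valid
wrangler deploy --dry-run
```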
|
||||
|
||||
## Integration with Existing Patterns
|
||||
|
||||
- Uses specialized MCP template from `PRPs/templates/prp_mcp_base.md`
|
||||
- Follows the established directory structure and naming conventions
|
||||
- Integrates with existing validation patterns and tools
|
||||
- Leverages proven patterns from the current MCP server implementation in `src/`
|
||||
44
use-cases/mcp-server/.claude/commands/prp-mcp-execute.md
Normal file
@@ -0,0 +1,44 @@
|
||||
---
|
||||
name: "prp-mcp-execute"
|
||||
description: This command builds a comprehensive Model Context Protocol (MCP) server by following the specific Product Requirement Prompt (PRP) passed as an argument, referencing this codebase's patterns and mirroring its tool setups for the user's specific requirements.
|
||||
Usage: /prp-mcp-execute path/to/prp.md
|
||||
---
|
||||
|
||||
# Execute MCP Server PRP
|
||||
|
||||
Execute a comprehensive Product Requirement Prompt (PRP) for building Model Context Protocol (MCP) servers with authentication, database integration, and Cloudflare Workers deployment.
|
||||
|
||||
PRP to execute: $ARGUMENTS
|
||||
|
||||
## Purpose
|
||||
|
||||
Execute MCP server PRPs with comprehensive validation, testing, and deployment verification following the proven patterns from this codebase.
|
||||
|
||||
## Execution Process
|
||||
|
||||
1. **Load & Analyze PRP**
|
||||
- Read the specified PRP file completely
|
||||
- Understand all context, requirements, and validation criteria
|
||||
- Create comprehensive todo list using TodoWrite tool
|
||||
- Identify all dependencies and integration points
|
||||
|
||||
2. **Context Gathering & Research**
|
||||
- Use Task agents to research existing MCP server patterns
|
||||
- Study authentication flows and database integration patterns
|
||||
- Research Cloudflare Workers deployment and environment setup
|
||||
- Gather all necessary documentation and code examples
|
||||
|
||||
3. **Implementation Phase**
|
||||
- Execute all implementation tasks in the correct order
|
||||
- Follow TypeScript patterns from the existing codebase
|
||||
- Implement MCP tools, resources, and authentication flows
|
||||
- Add comprehensive error handling and logging
|
||||
|
||||
## Notes
|
||||
|
||||
- Uses TodoWrite tool for comprehensive task management
|
||||
- Follows all patterns from the proven codebase implementation
|
||||
- Includes comprehensive error handling and recovery
|
||||
- Optimized for Claude Code's validation loops
|
||||
- Production-ready with monitoring and logging
|
||||
- Compatible with MCP Inspector and Claude Desktop
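
As a quick reference, a local smoke test with MCP Inspector might look like this (commands taken from CLAUDE.md; the local URL depends on your wrangler configuration):

```bash
# Start the MCP server locally (serves the MCP endpoint, e.g. http://localhost:8792/mcp)
wrangler dev

# In a second terminal, launch MCP Inspector and connect to the local endpoint from its UI
npx @modelcontextprotocol/inspector@latest
```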
|
||||
12
use-cases/mcp-server/.claude/settings.local.json
Normal file
@@ -0,0 +1,12 @@
|
||||
{
|
||||
"permissions": {
|
||||
"allow": [
|
||||
"Bash(mkdir:*)",
|
||||
"Bash(mv:*)",
|
||||
"Bash(npm run type-check:*)",
|
||||
"Bash(npx tsc:*)",
|
||||
"Bash(npm test)"
|
||||
],
|
||||
"deny": []
|
||||
}
|
||||
}
|
||||
22
use-cases/mcp-server/.dev.vars.example
Normal file
@@ -0,0 +1,22 @@
|
||||
GITHUB_CLIENT_ID=<your github client id>
|
||||
GITHUB_CLIENT_SECRET=<your github client secret>
|
||||
COOKIE_ENCRYPTION_KEY=<your cookie encryption key>
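# A random key can be generated with, for example: openssl rand -hex 32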
|
||||
|
||||
# Add your Anthropic API key below for PRP parsing functionality
|
||||
# ANTHROPIC_API_KEY=<your Anthropic API key>
|
||||
|
||||
# Optional: Override the default Anthropic model (defaults to claude-3-5-haiku-latest)
|
||||
# ANTHROPIC_MODEL=claude-3-5-haiku-latest
|
||||
|
||||
# Database Connection String
|
||||
# This should be a PostgreSQL connection string with full read/write permissions
|
||||
# Format: postgresql://username:password@hostname:port/database_name
|
||||
# Example: postgresql://user:password@localhost:5432/mydb
|
||||
# For production, use Hyperdrive: https://developers.cloudflare.com/hyperdrive/
|
||||
DATABASE_URL=postgresql://username:password@localhost:5432/database_name
|
||||
|
||||
# Optional: Add Sentry DSN for local development monitoring
|
||||
# Get your DSN from https://sentry.io/settings/projects/your-project/keys/
|
||||
# Create a new project in Sentry, then for the platform pick Cloudflare Workers (search in the top right)
|
||||
SENTRY_DSN=https://your-sentry-dsn@sentry.io/project-id
|
||||
NODE_ENV=development
|
||||
174
use-cases/mcp-server/.gitignore
vendored
Normal file
@@ -0,0 +1,174 @@
|
||||
# Logs
|
||||
|
||||
logs
|
||||
*.log
npm-debug.log*
|
||||
yarn-debug.log*
|
||||
yarn-error.log*
|
||||
lerna-debug.log*
|
||||
.pnpm-debug.log*
|
||||
|
||||
# Diagnostic reports (https://nodejs.org/api/report.html)
|
||||
|
||||
report.[0-9]*.[0-9]*.[0-9]*.[0-9]*.json
|
||||
|
||||
# Runtime data
|
||||
|
||||
pids
|
||||
*.pid
*.seed
*.pid.lock
|
||||
|
||||
# Directory for instrumented libs generated by jscoverage/JSCover
|
||||
|
||||
lib-cov
|
||||
|
||||
# Coverage directory used by tools like istanbul
|
||||
|
||||
coverage
|
||||
*.lcov
|
||||
|
||||
# nyc test coverage
|
||||
|
||||
.nyc_output
|
||||
|
||||
# Grunt intermediate storage (https://gruntjs.com/creating-plugins#storing-task-files)
|
||||
|
||||
.grunt
|
||||
|
||||
# Bower dependency directory (https://bower.io/)
|
||||
|
||||
bower_components
|
||||
|
||||
# node-waf configuration
|
||||
|
||||
.lock-wscript
|
||||
|
||||
# Compiled binary addons (https://nodejs.org/api/addons.html)
|
||||
|
||||
build/Release
|
||||
|
||||
# Dependency directories
|
||||
|
||||
node_modules/
|
||||
jspm_packages/
|
||||
|
||||
# Snowpack dependency directory (https://snowpack.dev/)
|
||||
|
||||
web_modules/
|
||||
|
||||
# TypeScript cache
|
||||
|
||||
*.tsbuildinfo
|
||||
|
||||
# Optional npm cache directory
|
||||
|
||||
.npm
|
||||
|
||||
# Optional eslint cache
|
||||
|
||||
.eslintcache
|
||||
|
||||
# Optional stylelint cache
|
||||
|
||||
.stylelintcache
|
||||
|
||||
# Microbundle cache
|
||||
|
||||
.rpt2_cache/
|
||||
.rts2_cache_cjs/
|
||||
.rts2_cache_es/
|
||||
.rts2_cache_umd/
|
||||
|
||||
# Optional REPL history
|
||||
|
||||
.node_repl_history
|
||||
|
||||
# Output of 'npm pack'
|
||||
|
||||
*.tgz
|
||||
|
||||
# Yarn Integrity file
|
||||
|
||||
.yarn-integrity
|
||||
|
||||
# dotenv environment variable files
|
||||
|
||||
.env
|
||||
.env.development.local
|
||||
.env.test.local
|
||||
.env.production.local
|
||||
.env.local
|
||||
|
||||
# parcel-bundler cache (https://parceljs.org/)
|
||||
|
||||
.cache
|
||||
.parcel-cache
|
||||
|
||||
# Next.js build output
|
||||
|
||||
.next
|
||||
out
|
||||
|
||||
# Nuxt.js build / generate output
|
||||
|
||||
.nuxt
|
||||
dist
|
||||
|
||||
# Gatsby files
|
||||
|
||||
.cache/
|
||||
|
||||
# Comment in the public line in if your project uses Gatsby and not Next.js
|
||||
|
||||
# https://nextjs.org/blog/next-9-1#public-directory-support
|
||||
|
||||
# public
|
||||
|
||||
# vuepress build output
|
||||
|
||||
.vuepress/dist
|
||||
|
||||
# vuepress v2.x temp and cache directory
|
||||
|
||||
.temp
|
||||
.cache
|
||||
|
||||
# Docusaurus cache and generated files
|
||||
|
||||
.docusaurus
|
||||
|
||||
# Serverless directories
|
||||
|
||||
.serverless/
|
||||
|
||||
# FuseBox cache
|
||||
|
||||
.fusebox/
|
||||
|
||||
# DynamoDB Local files
|
||||
|
||||
.dynamodb/
|
||||
|
||||
# TernJS port file
|
||||
|
||||
.tern-port
|
||||
|
||||
# Stores VSCode versions used for testing VSCode extensions
|
||||
|
||||
.vscode-test
|
||||
|
||||
# yarn v2
|
||||
|
||||
.yarn/cache
|
||||
.yarn/unplugged
|
||||
.yarn/build-state.yml
|
||||
.yarn/install-state.gz
|
||||
.pnp.*
|
||||
|
||||
# wrangler project
|
||||
|
||||
.dev.vars
|
||||
.wrangler/
|
||||
|
||||
# Extra folders
|
||||
14
use-cases/mcp-server/.prettierrc
Normal file
@@ -0,0 +1,14 @@
|
||||
{
|
||||
"printWidth": 140,
|
||||
"singleQuote": false,
|
||||
"semi": true,
|
||||
"useTabs": false,
|
||||
"overrides": [
|
||||
{
|
||||
"files": ["*.jsonc"],
|
||||
"options": {
|
||||
"trailingComma": "none"
|
||||
}
|
||||
}
|
||||
]
|
||||
}
|
||||
835
use-cases/mcp-server/CLAUDE.md
Normal file
@@ -0,0 +1,835 @@
|
||||
# MCP Server with GitHub OAuth - Implementation Guide
|
||||
|
||||
This guide provides implementation patterns and standards for building MCP (Model Context Protocol) servers with GitHub OAuth authentication using Node.js, TypeScript, and Cloudflare Workers. For WHAT to build, see the PRP (Product Requirement Prompt) documents.
|
||||
|
||||
## Core Principles
|
||||
|
||||
**IMPORTANT: You MUST follow these principles in all code changes and PRP generations:**
|
||||
|
||||
### KISS (Keep It Simple, Stupid)
|
||||
|
||||
- Simplicity should be a key goal in design
|
||||
- Choose straightforward solutions over complex ones whenever possible
|
||||
- Simple solutions are easier to understand, maintain, and debug
|
||||
|
||||
### YAGNI (You Aren't Gonna Need It)
|
||||
|
||||
- Avoid building functionality on speculation
|
||||
- Implement features only when they are needed, not when you anticipate they might be useful in the future
|
||||
|
||||
### Open/Closed Principle
|
||||
|
||||
- Software entities should be open for extension but closed for modification
|
||||
- Design systems so that new functionality can be added with minimal changes to existing code
|
||||
|
||||
## Package Management & Tooling
|
||||
|
||||
**CRITICAL: This project uses npm for Node.js package management and Wrangler CLI for Cloudflare Workers development.**
|
||||
|
||||
### Essential npm Commands
|
||||
|
||||
```bash
|
||||
# Install dependencies from package.json
|
||||
npm install
|
||||
|
||||
# Add a dependency
|
||||
npm install package-name
|
||||
|
||||
# Add a development dependency
|
||||
npm install --save-dev package-name
|
||||
|
||||
# Remove a package
|
||||
npm uninstall package-name
|
||||
|
||||
# Update dependencies
|
||||
npm update
|
||||
|
||||
# Run scripts defined in package.json
|
||||
npm run dev
|
||||
npm run deploy
|
||||
npm run type-check
|
||||
```
|
||||
|
||||
### Essential Wrangler CLI Commands
|
||||
|
||||
**CRITICAL: Use Wrangler CLI for all Cloudflare Workers development, testing, and deployment.**
|
||||
|
||||
```bash
|
||||
# Authentication
|
||||
wrangler login # Login to Cloudflare account
|
||||
wrangler logout # Logout from Cloudflare
|
||||
wrangler whoami # Check current user
|
||||
|
||||
# Development & Testing
|
||||
wrangler dev # Start local development server (default port 8787)
|
||||
|
||||
# Deployment
|
||||
wrangler deploy # Deploy Worker to Cloudflare
|
||||
wrangler deploy --dry-run # Test deployment without actually deploying
|
||||
|
||||
# Configuration & Types
|
||||
wrangler types # Generate TypeScript types from Worker configuration
|
||||
```
|
||||
|
||||
## Project Architecture
|
||||
|
||||
**IMPORTANT: This is a Cloudflare Workers MCP server with GitHub OAuth authentication for secure database access.**
|
||||
|
||||
### Current Project Structure
|
||||
|
||||
```
|
||||
/
|
||||
├── src/ # TypeScript source code
|
||||
│ ├── index.ts # Main MCP server (standard)
|
||||
│ ├── index_sentry.ts # Sentry-enabled MCP server
|
||||
│ ├── simple-math.ts # Basic MCP example (no auth)
|
||||
│ ├── github-handler.ts # GitHub OAuth flow implementation
|
||||
│ ├── database.ts # PostgreSQL connection & utilities
|
||||
│ ├── utils.ts # OAuth helper functions
|
||||
│ ├── workers-oauth-utils.ts # Cookie-based approval system
|
||||
│ └── tools/ # Tool registration system
|
||||
│ └── register-tools.ts # Centralized tool registration
|
||||
├── PRPs/ # Product Requirement Prompts
|
||||
│ ├── README.md
|
||||
│ └── templates/
|
||||
│ └── prp_mcp_base.md
|
||||
├── examples/ # Example tool creation + registration - NEVER edit or import from this folder
|
||||
│ ├── database-tools.ts # Example tools for a Postgres MCP server showing best practices for tool creation and registration
|
||||
│ └── database-tools-sentry.ts # Example tools for the Postgres MCP server but with the Sentry integration for production monitoring
|
||||
├── wrangler.jsonc # Main Cloudflare Workers configuration
|
||||
├── wrangler-simple.jsonc # Simple math example configuration
|
||||
├── package.json # npm dependencies & scripts
|
||||
├── tsconfig.json # TypeScript configuration
|
||||
├── worker-configuration.d.ts # Generated Cloudflare types
|
||||
└── CLAUDE.md # This implementation guide
|
||||
```
|
||||
|
||||
### Key File Purposes (ALWAYS ADD NEW FILES HERE)
|
||||
|
||||
**Main Implementation Files:**
|
||||
|
||||
- `src/index.ts` - Production MCP server with GitHub OAuth + PostgreSQL
|
||||
- `src/index_sentry.ts` - Same as above with Sentry monitoring integration
|
||||
|
||||
**Authentication & Security:**
|
||||
|
||||
- `src/github-handler.ts` - Complete GitHub OAuth 2.0 flow
|
||||
- `src/workers-oauth-utils.ts` - HMAC-signed cookie approval system
|
||||
- `src/utils.ts` - OAuth token exchange and URL construction helpers
|
||||
|
||||
**Database Integration:**
|
||||
|
||||
- `src/database.ts` - PostgreSQL connection pooling, SQL validation, security
|
||||
|
||||
**Tool Registration:**
|
||||
|
||||
- `src/tools/register-tools.ts` - Centralized tool registration system that imports and registers all tools
|
||||
|
||||
**Configuration Files:**
|
||||
|
||||
- `wrangler.jsonc` - Main Worker config with Durable Objects, KV, AI bindings
|
||||
- `wrangler-simple.jsonc` - Simple example configuration
|
||||
- `tsconfig.json` - TypeScript compiler settings for Cloudflare Workers
|
||||
|
||||
## Development Commands
|
||||
|
||||
### Core Workflow Commands
|
||||
|
||||
```bash
|
||||
# Setup & Dependencies
|
||||
npm install # Install all dependencies
|
||||
npm install --save-dev @types/package # Add dev dependency with types
|
||||
|
||||
# Development
|
||||
wrangler dev # Start local development server
|
||||
npm run dev # Alternative via npm script
|
||||
|
||||
# Type Checking & Validation
|
||||
npm run type-check # Run TypeScript compiler check
|
||||
wrangler types # Generate Cloudflare Worker types
|
||||
npx tsc --noEmit # Type check without compiling
|
||||
|
||||
# Testing
|
||||
npx vitest # Run unit tests (if configured)
|
||||
|
||||
# Code Quality
|
||||
npx prettier --write . # Format code
|
||||
npx eslint src/ # Lint TypeScript code
|
||||
```
|
||||
|
||||
### Environment Configuration
|
||||
|
||||
**Environment Variables Setup:**
|
||||
|
||||
```bash
|
||||
# Create .dev.vars file for local development based on .dev.vars.example
|
||||
cp .dev.vars.example .dev.vars
|
||||
|
||||
# Production secrets (via Wrangler)
|
||||
wrangler secret put GITHUB_CLIENT_ID
|
||||
wrangler secret put GITHUB_CLIENT_SECRET
|
||||
wrangler secret put COOKIE_ENCRYPTION_KEY
|
||||
wrangler secret put DATABASE_URL
|
||||
wrangler secret put SENTRY_DSN
|
||||
```
|
||||
|
||||
## MCP Development Context
|
||||
|
||||
**IMPORTANT: This project builds production-ready MCP servers using Node.js/TypeScript on Cloudflare Workers with GitHub OAuth authentication.**
|
||||
|
||||
### MCP Technology Stack
|
||||
|
||||
**Core Technologies:**
|
||||
|
||||
- **@modelcontextprotocol/sdk** - Official MCP TypeScript SDK
|
||||
- **agents/mcp** - Cloudflare Workers MCP agent framework
|
||||
- **workers-mcp** - MCP transport layer for Workers
|
||||
- **@cloudflare/workers-oauth-provider** - OAuth 2.1 server implementation
|
||||
|
||||
**Cloudflare Platform:**
|
||||
|
||||
- **Cloudflare Workers** - Serverless runtime (V8 isolates)
|
||||
- **Durable Objects** - Stateful objects for MCP agent persistence
|
||||
- **KV Storage** - OAuth state and session management
|
||||
|
||||
### MCP Server Architecture
|
||||
|
||||
This project implements MCP servers as Cloudflare Workers with three main patterns:
|
||||
|
||||
**1. Authenticated Database MCP Server (`src/index.ts`):**
|
||||
|
||||
```typescript
|
||||
export class MyMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
server = new McpServer({
|
||||
name: "PostgreSQL Database MCP Server",
|
||||
version: "1.0.0",
|
||||
});
|
||||
|
||||
// MCP Tools available based on user permissions
|
||||
// - listTables (all users)
|
||||
// - queryDatabase (all users, read-only)
|
||||
// - executeDatabase (privileged users only)
|
||||
}
|
||||
```
|
||||
|
||||
**2. Monitored MCP Server (`src/index_sentry.ts`):**
|
||||
|
||||
- Same functionality as above with Sentry instrumentation
|
||||
- Distributed tracing for MCP tool calls
|
||||
- Error tracking with event IDs
|
||||
- Performance monitoring
|
||||
|
||||
### MCP Development Commands
|
||||
|
||||
**Local Development & Testing:**
|
||||
|
||||
```bash
|
||||
# Start main MCP server (with OAuth)
|
||||
wrangler dev # Available at http://localhost:8792/mcp
|
||||
```
|
||||
|
||||
### Claude Desktop Integration
|
||||
|
||||
**For Local Development:**
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"database-mcp": {
|
||||
"command": "npx",
|
||||
"args": ["mcp-remote", "http://localhost:8792/mcp"],
|
||||
"env": {}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**For Production Deployment:**
|
||||
|
||||
```json
|
||||
{
|
||||
"mcpServers": {
|
||||
"database-mcp": {
|
||||
"command": "npx",
|
||||
"args": ["mcp-remote", "https://your-worker.workers.dev/mcp"],
|
||||
"env": {}
|
||||
}
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### MCP Key Concepts for This Project
|
||||
|
||||
- **Tools**: Database operations (listTables, queryDatabase, executeDatabase)
|
||||
- **Authentication**: GitHub OAuth with role-based access control
|
||||
- **Transport**: Dual support for HTTP (`/mcp`) and SSE (`/sse`) protocols
|
||||
- **State**: Durable Objects maintain authenticated user context
|
||||
- **Security**: SQL injection protection, permission validation, error sanitization
|
||||
|
||||
## Database Integration & Security
|
||||
|
||||
**CRITICAL: This project provides secure PostgreSQL database access through MCP tools with role-based permissions.**
|
||||
|
||||
### Database Architecture
|
||||
|
||||
**Connection Management (`src/database.ts`):**
|
||||
|
||||
```typescript
|
||||
// Singleton connection pool with Cloudflare Workers limits
|
||||
export function getDb(databaseUrl: string): postgres.Sql {
|
||||
if (!dbInstance) {
|
||||
dbInstance = postgres(databaseUrl, {
|
||||
max: 5, // Max 5 connections for Workers
|
||||
idle_timeout: 20,
|
||||
connect_timeout: 10,
|
||||
prepare: true, // Enable prepared statements
|
||||
});
|
||||
}
|
||||
return dbInstance;
|
||||
}
|
||||
|
||||
// Connection wrapper with error handling
|
||||
export async function withDatabase<T>(databaseUrl: string, operation: (db: postgres.Sql) => Promise<T>): Promise<T> {
|
||||
const db = getDb(databaseUrl);
|
||||
// Execute operation with timing and error handling
|
||||
}
|
||||
```
|
||||
|
||||
### Security Implementation
|
||||
|
||||
**SQL Injection Protection:**
|
||||
|
||||
```typescript
|
||||
export function validateSqlQuery(sql: string): { isValid: boolean; error?: string } {
|
||||
const dangerousPatterns = [
|
||||
/;\s*drop\s+/i,
|
||||
/;\s*delete\s+.*\s+where\s+1\s*=\s*1/i,
|
||||
/;\s*truncate\s+/i,
|
||||
// ... more patterns
|
||||
];
|
||||
// Pattern-based validation for safety
|
||||
}
|
||||
|
||||
export function isWriteOperation(sql: string): boolean {
|
||||
const writeKeywords = ["insert", "update", "delete", "create", "drop", "alter"];
|
||||
return writeKeywords.some((keyword) => sql.trim().toLowerCase().startsWith(keyword));
|
||||
}
|
||||
```
|
||||
|
||||
**Access Control (`src/index.ts`):**
|
||||
|
||||
```typescript
|
||||
const ALLOWED_USERNAMES = new Set<string>([
|
||||
'coleam00' // Only these GitHub usernames can execute write operations
|
||||
]);
|
||||
|
||||
// Tool availability based on user permissions
|
||||
if (ALLOWED_USERNAMES.has(this.props.login)) {
|
||||
// Register executeDatabase tool for privileged users
|
||||
this.server.tool("executeDatabase", ...);
|
||||
}
|
||||
```
|
||||
|
||||
### MCP Tools Implementation
|
||||
|
||||
**Tool Registration System:**
|
||||
|
||||
Tools are now organized in a modular way with centralized registration:
|
||||
|
||||
1. **Tool Registration (`src/tools/register-tools.ts`):**
|
||||
- Central registry that imports all tool modules
|
||||
- Calls individual registration functions
|
||||
- Passes server, environment, and user props to each module
|
||||
|
||||
2. **Tool Implementation Pattern:**
|
||||
- Each feature/domain gets its own tool file (e.g., `database-tools.ts`)
|
||||
- Tools are exported as registration functions
|
||||
- Registration functions receive server instance, environment, and user props
|
||||
- Permission checking happens during registration
|
||||
|
||||
**Example Tool Registration:**
|
||||
|
||||
```typescript
|
||||
// src/tools/register-tools.ts
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import { Props } from "../types";
|
||||
import { registerDatabaseTools } from "../../examples/database-tools";
|
||||
|
||||
export function registerAllTools(server: McpServer, env: Env, props: Props) {
|
||||
// Register database tools
|
||||
registerDatabaseTools(server, env, props);
|
||||
|
||||
// Future tools can be registered here
|
||||
// registerAnalyticsTools(server, env, props);
|
||||
// registerReportingTools(server, env, props);
|
||||
}
|
||||
```
|
||||
|
||||
**Example Tool Module (`examples/database-tools.ts`):**
|
||||
|
||||
```typescript
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import { Props } from "../types";
|
||||
|
||||
const ALLOWED_USERNAMES = new Set<string>(['coleam00']);
|
||||
|
||||
export function registerDatabaseTools(server: McpServer, env: Env, props: Props) {
|
||||
// Tool 1: Available to all authenticated users
|
||||
server.tool(
|
||||
"listTables",
|
||||
"Get a list of all tables in the database",
|
||||
ListTablesSchema,
|
||||
async () => {
|
||||
// Implementation
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 2: Available to all authenticated users
|
||||
server.tool(
|
||||
"queryDatabase",
|
||||
"Execute a read-only SQL query",
|
||||
QueryDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
// Implementation with validation
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 3: Only for privileged users
|
||||
if (ALLOWED_USERNAMES.has(props.login)) {
|
||||
server.tool(
|
||||
"executeDatabase",
|
||||
"Execute any SQL statement (privileged)",
|
||||
ExecuteDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
// Implementation
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Available Database Tools in examples:**
|
||||
|
||||
1. **`listTables`** - Schema discovery (all authenticated users)
|
||||
2. **`queryDatabase`** - Read-only SQL queries (all authenticated users)
|
||||
3. **`executeDatabase`** - Write operations (privileged users only)
|
||||
|
||||
## GitHub OAuth Implementation
|
||||
|
||||
**CRITICAL: This project implements secure GitHub OAuth 2.0 flow with signed cookie-based approval system.**
|
||||
|
||||
### OAuth Flow Architecture
|
||||
|
||||
**Authentication Flow (`src/github-handler.ts`):**
|
||||
|
||||
```typescript
|
||||
// 1. Authorization Request
|
||||
app.get("/authorize", async (c) => {
|
||||
const oauthReqInfo = await c.env.OAUTH_PROVIDER.parseAuthRequest(c.req.raw);
|
||||
|
||||
// Check if client already approved via signed cookie
|
||||
if (await clientIdAlreadyApproved(c.req.raw, oauthReqInfo.clientId, c.env.COOKIE_ENCRYPTION_KEY)) {
|
||||
return redirectToGithub(c.req.raw, oauthReqInfo, c.env, {});
|
||||
}
|
||||
|
||||
// Show approval dialog
|
||||
return renderApprovalDialog(c.req.raw, { client, server, state });
|
||||
});
|
||||
|
||||
// 2. GitHub Callback
|
||||
app.get("/callback", async (c) => {
|
||||
// Exchange code for access token
|
||||
const [accessToken, errResponse] = await fetchUpstreamAuthToken({
|
||||
client_id: c.env.GITHUB_CLIENT_ID,
|
||||
client_secret: c.env.GITHUB_CLIENT_SECRET,
|
||||
code: c.req.query("code"),
|
||||
redirect_uri: new URL("/callback", c.req.url).href,
|
||||
});
|
||||
|
||||
// Get GitHub user info
|
||||
const user = await new Octokit({ auth: accessToken }).rest.users.getAuthenticated();
|
||||
|
||||
// Complete authorization with user props
|
||||
return c.env.OAUTH_PROVIDER.completeAuthorization({
|
||||
props: { accessToken, email, login, name } as Props,
|
||||
userId: login,
|
||||
});
|
||||
});
|
||||
```
|
||||
|
||||
### Cookie Security System
|
||||
|
||||
**HMAC-Signed Approval Cookies (`src/workers-oauth-utils.ts`):**
|
||||
|
||||
```typescript
|
||||
// Generate signed cookie for client approval
|
||||
async function signData(key: CryptoKey, data: string): Promise<string> {
|
||||
const signatureBuffer = await crypto.subtle.sign("HMAC", key, enc.encode(data));
|
||||
return Array.from(new Uint8Array(signatureBuffer))
|
||||
.map((b) => b.toString(16).padStart(2, "0"))
|
||||
.join("");
|
||||
}
|
||||
|
||||
// Verify cookie integrity
|
||||
async function verifySignature(key: CryptoKey, signatureHex: string, data: string): Promise<boolean> {
|
||||
const signatureBytes = new Uint8Array(signatureHex.match(/.{1,2}/g)!.map((byte) => parseInt(byte, 16)));
|
||||
return await crypto.subtle.verify("HMAC", key, signatureBytes.buffer, enc.encode(data));
|
||||
}
|
||||
```
|
||||
|
||||
### User Context & Permissions
|
||||
|
||||
**Authenticated User Props:**
|
||||
|
||||
```typescript
|
||||
type Props = {
|
||||
login: string; // GitHub username
|
||||
name: string; // Display name
|
||||
email: string; // Email address
|
||||
accessToken: string; // GitHub access token
|
||||
};
|
||||
|
||||
// Available in MCP tools via this.props
|
||||
class MyMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
async init() {
|
||||
// Access user context in any tool
|
||||
const username = this.props.login;
|
||||
const hasWriteAccess = ALLOWED_USERNAMES.has(username);
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Monitoring & Observability
|
||||
|
||||
**CRITICAL: This project supports optional Sentry integration for production monitoring and includes built-in console logging.**
|
||||
|
||||
### Logging Architecture
|
||||
|
||||
**Two Deployment Options:**
|
||||
|
||||
1. **Standard Version (`src/index.ts`)**: Console logging only
|
||||
2. **Sentry Version (`src/index_sentry.ts`)**: Full Sentry instrumentation
|
||||
|
||||
### Sentry Integration (Optional)
|
||||
|
||||
**Enable Sentry Monitoring:**
|
||||
|
||||
```typescript
|
||||
// src/index_sentry.ts - Production-ready with monitoring
|
||||
import * as Sentry from "@sentry/cloudflare";
|
||||
|
||||
// Sentry configuration
|
||||
function getSentryConfig(env: Env) {
|
||||
return {
|
||||
dsn: env.SENTRY_DSN,
|
||||
tracesSampleRate: 1, // 100% trace sampling
|
||||
};
|
||||
}
|
||||
|
||||
// Instrument MCP tools with tracing
|
||||
private registerTool(name: string, description: string, schema: any, handler: any) {
|
||||
this.server.tool(name, description, schema, async (args: any) => {
|
||||
return await Sentry.startNewTrace(async () => {
|
||||
return await Sentry.startSpan({
|
||||
name: `mcp.tool/${name}`,
|
||||
attributes: extractMcpParameters(args),
|
||||
}, async (span) => {
|
||||
// Set user context
|
||||
Sentry.setUser({
|
||||
username: this.props.login,
|
||||
email: this.props.email,
|
||||
});
|
||||
|
||||
try {
|
||||
return await handler(args);
|
||||
} catch (error) {
|
||||
span.setStatus({ code: 2 }); // error
|
||||
return handleError(error); // Returns user-friendly error with event ID
|
||||
}
|
||||
});
|
||||
});
|
||||
});
|
||||
}
|
||||
```
|
||||
|
||||
**Sentry Features Enabled:**
|
||||
|
||||
- **Error Tracking**: Automatic exception capture with context
|
||||
- **Performance Monitoring**: Full request tracing with 100% sample rate
|
||||
- **User Context**: GitHub user information bound to events
|
||||
- **Tool Tracing**: Each MCP tool call traced with parameters
|
||||
- **Distributed Tracing**: Request flow across Cloudflare Workers
|
||||
|
||||
### Production Logging Patterns
|
||||
|
||||
**Console Logging (Standard):**
|
||||
|
||||
```typescript
|
||||
// Database operations
|
||||
console.log(`Database operation completed successfully in ${duration}ms`);
|
||||
console.error(`Database operation failed after ${duration}ms:`, error);
|
||||
|
||||
// Authentication events
|
||||
console.log(`User authenticated: ${this.props.login} (${this.props.name})`);
|
||||
|
||||
// Tool execution
|
||||
console.log(`Tool called: ${toolName} by ${this.props.login}`);
|
||||
console.error(`Tool failed: ${toolName}`, error);
|
||||
```
|
||||
|
||||
**Structured Error Handling:**
|
||||
|
||||
```typescript
|
||||
// Error sanitization for security
|
||||
export function formatDatabaseError(error: unknown): string {
|
||||
if (error instanceof Error) {
|
||||
if (error.message.includes("password")) {
|
||||
return "Database authentication failed. Please check credentials.";
|
||||
}
|
||||
if (error.message.includes("timeout")) {
|
||||
return "Database connection timed out. Please try again.";
|
||||
}
|
||||
return `Database error: ${error.message}`;
|
||||
}
|
||||
return "Unknown database error occurred.";
|
||||
}
|
||||
```
|
||||
|
||||
### Monitoring Configuration
|
||||
|
||||
**Development Monitoring:**
|
||||
|
||||
```bash
|
||||
# Enable Sentry in development
|
||||
echo 'SENTRY_DSN=https://your-dsn@sentry.io/project' >> .dev.vars
|
||||
echo 'NODE_ENV=development' >> .dev.vars
|
||||
|
||||
# Use Sentry-enabled version
|
||||
wrangler dev --config wrangler.jsonc # Ensure main = "src/index_sentry.ts"
|
||||
```
|
||||
|
||||
**Production Monitoring:**
|
||||
|
||||
```bash
|
||||
# Set production secrets
|
||||
wrangler secret put SENTRY_DSN
|
||||
wrangler secret put NODE_ENV # Set to "production"
|
||||
|
||||
# Deploy with monitoring
|
||||
wrangler deploy
|
||||
```
|
||||
|
||||
## TypeScript Development Standards
|
||||
|
||||
**CRITICAL: All MCP tools MUST follow TypeScript best practices with Zod validation and proper error handling.**
|
||||
|
||||
### Standard Response Format
|
||||
|
||||
**ALL tools MUST return MCP-compatible response objects:**
|
||||
|
||||
```typescript
|
||||
import { z } from "zod";
|
||||
|
||||
// Tool with proper TypeScript patterns
|
||||
this.server.tool(
|
||||
"standardizedTool",
|
||||
"Tool following standard response format",
|
||||
{
|
||||
name: z.string().min(1, "Name cannot be empty"),
|
||||
options: z.object({}).optional(),
|
||||
},
|
||||
async ({ name, options }) => {
|
||||
try {
|
||||
// Input already validated by Zod schema
|
||||
const result = await processName(name, options);
|
||||
|
||||
// Return standardized success response
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Success**\n\nProcessed: ${name}\n\n**Result:**\n\`\`\`json\n${JSON.stringify(result, null, 2)}\n\`\`\`\n\n**Processing time:** 0.5s`,
|
||||
},
|
||||
],
|
||||
};
|
||||
} catch (error) {
|
||||
// Return standardized error response
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Error**\n\nProcessing failed: ${error instanceof Error ? error.message : String(error)}`,
|
||||
isError: true,
|
||||
},
|
||||
],
|
||||
};
|
||||
}
|
||||
},
|
||||
);
|
||||
```
|
||||
|
||||
### Input Validation with Zod
|
||||
|
||||
**ALL tool inputs MUST be validated using Zod schemas:**
|
||||
|
||||
```typescript
|
||||
import { z } from "zod";
|
||||
|
||||
// Define validation schemas
|
||||
const DatabaseQuerySchema = z.object({
|
||||
sql: z
|
||||
.string()
|
||||
.min(1, "SQL query cannot be empty")
|
||||
.refine((sql) => sql.trim().toLowerCase().startsWith("select"), {
|
||||
message: "Only SELECT queries are allowed",
|
||||
}),
|
||||
limit: z.number().int().positive().max(1000).optional(),
|
||||
});
|
||||
|
||||
// Use in tool definition
|
||||
this.server.tool(
|
||||
"queryDatabase",
|
||||
"Execute a read-only SQL query",
|
||||
DatabaseQuerySchema, // Zod schema provides automatic validation
|
||||
async ({ sql, limit }) => {
|
||||
// sql and limit are already validated and properly typed
|
||||
const results = await db.unsafe(sql);
|
||||
return { content: [{ type: "text", text: JSON.stringify(results, null, 2) }] };
|
||||
},
|
||||
);
|
||||
```
|
||||
|
||||
### Error Handling Patterns
|
||||
|
||||
**Standardized error responses:**
|
||||
|
||||
```typescript
|
||||
// Error handling utility
|
||||
function createErrorResponse(message: string, details?: any): any {
|
||||
return {
|
||||
content: [{
|
||||
type: "text",
|
||||
text: `**Error**\n\n${message}${details ? `\n\n**Details:**\n\`\`\`json\n${JSON.stringify(details, null, 2)}\n\`\`\`` : ''}`,
|
||||
isError: true
|
||||
}]
|
||||
};
|
||||
}
|
||||
|
||||
// Permission error
|
||||
if (!ALLOWED_USERNAMES.has(this.props.login)) {
|
||||
return createErrorResponse(
|
||||
"Insufficient permissions for this operation",
|
||||
{ requiredRole: "privileged", userRole: "standard" }
|
||||
);
|
||||
}
|
||||
|
||||
// Validation error
|
||||
if (isWriteOperation(sql)) {
|
||||
return createErrorResponse(
|
||||
"Write operations not allowed with this tool",
|
||||
{ operation: "write", allowedOperations: ["select", "show", "describe"] }
|
||||
);
|
||||
}
|
||||
|
||||
// Database error
|
||||
catch (error) {
|
||||
return createErrorResponse(
|
||||
"Database operation failed",
|
||||
{ error: formatDatabaseError(error) }
|
||||
);
|
||||
}
|
||||
```
|
||||
|
||||
### Type Safety Rules
|
||||
|
||||
**MANDATORY TypeScript patterns:**
|
||||
|
||||
1. **Strict Types**: All parameters and return types explicitly typed
|
||||
2. **Zod Validation**: All inputs validated with Zod schemas
|
||||
3. **Error Handling**: All async operations wrapped in try/catch
|
||||
4. **User Context**: Props typed with GitHub user information
|
||||
5. **Environment**: Cloudflare Workers types generated with `wrangler types`
|
||||
|
||||
## Code Style Preferences
|
||||
|
||||
### TypeScript Style
|
||||
|
||||
- Use explicit type annotations for all function parameters and return types
|
||||
- Use JSDoc comments for all exported functions and classes
|
||||
- Prefer async/await for all asynchronous operations
|
||||
- **MANDATORY**: Use Zod schemas for all input validation
|
||||
- **MANDATORY**: Use proper error handling with try/catch blocks
|
||||
- Keep functions small and focused (single responsibility principle)
|
||||
|
||||
### File Organization
|
||||
|
||||
- Each MCP server should be self-contained in a single TypeScript file
|
||||
- Import statements organized: Node.js built-ins, third-party packages, local imports
|
||||
- Use relative imports within the src/ directory
|
||||
- **Import Zod for validation and proper types for all modules**
|
||||
|
||||
### Testing Conventions
|
||||
|
||||
- Use MCP Inspector for integration testing: `npx @modelcontextprotocol/inspector@latest`
|
||||
- Test with local development server: `wrangler dev`
|
||||
- Use descriptive tool names and descriptions
|
||||
- **Test both authentication and permission scenarios**
|
||||
- **Test input validation with invalid data**
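
A minimal sketch of such an input-validation test, assuming vitest as mentioned above (the schema here is a hypothetical stand-in mirroring the DatabaseQuerySchema pattern shown earlier):

```typescript
import { describe, expect, it } from "vitest";
import { z } from "zod";

// Hypothetical schema following the validation pattern used in this guide
const QuerySchema = z.object({
  sql: z
    .string()
    .min(1)
    .refine((s) => s.trim().toLowerCase().startsWith("select"), {
      message: "Only SELECT queries are allowed",
    }),
});

describe("input validation", () => {
  it("rejects non-SELECT statements", () => {
    const result = QuerySchema.safeParse({ sql: "DROP TABLE users" });
    expect(result.success).toBe(false);
  });

  it("accepts a simple SELECT", () => {
    expect(QuerySchema.safeParse({ sql: "SELECT 1" }).success).toBe(true);
  });
});
```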
|
||||
|
||||
## Important Notes
|
||||
|
||||
### What NOT to do
|
||||
|
||||
- **NEVER** commit secrets or environment variables to the repository
|
||||
- **NEVER** build complex solutions when simple ones will work
|
||||
- **NEVER** skip input validation with Zod schemas
|
||||
|
||||
### What TO do
|
||||
|
||||
- **ALWAYS** use TypeScript strict mode and proper typing
|
||||
- **ALWAYS** validate inputs with Zod schemas
|
||||
- **ALWAYS** follow the core principles (KISS, YAGNI, etc.)
|
||||
- **ALWAYS** use Wrangler CLI for all development and deployment
|
||||
|
||||
## Git Workflow
|
||||
|
||||
```bash
|
||||
# Before committing, always run:
|
||||
npm run type-check # Ensure TypeScript compiles
|
||||
wrangler deploy --dry-run  # Test deployment configuration
|
||||
|
||||
# Commit with descriptive messages
|
||||
git add .
|
||||
git commit -m "feat: add new MCP tool for database queries"
|
||||
```
|
||||
|
||||
## Quick Reference
|
||||
|
||||
### Adding New MCP Tools
|
||||
|
||||
1. **Create a new tool module** in your project (following the pattern in `examples/`):
|
||||
```typescript
|
||||
// src/tools/your-feature-tools.ts
|
||||
export function registerYourFeatureTools(server: McpServer, env: Env, props: Props) {
|
||||
// Register your tools here
|
||||
}
|
||||
```
|
||||
|
||||
2. **Define Zod schemas** for input validation in your types file (see the minimal sketch after this list)
|
||||
|
||||
3. **Implement tool handlers** with proper error handling using the patterns from examples
|
||||
|
||||
4. **Register your tools** in `src/tools/register-tools.ts`:
|
||||
```typescript
|
||||
import { registerYourFeatureTools } from "./your-feature-tools";
|
||||
|
||||
export function registerAllTools(server: McpServer, env: Env, props: Props) {
|
||||
// Existing registrations
|
||||
registerDatabaseTools(server, env, props);
|
||||
|
||||
// Add your new registration
|
||||
registerYourFeatureTools(server, env, props);
|
||||
}
|
||||
```
|
||||
|
||||
5. **Update documentation** if needed
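
As referenced in step 2, a minimal sketch of such a Zod schema (the schema name and fields here are illustrative, not part of this codebase):

```typescript
import { z } from "zod";

// Hypothetical input schema for a "your-feature" tool; adjust fields to your use case
export const YourFeatureInputSchema = z.object({
  name: z.string().min(1, "Name cannot be empty").describe("Human-readable identifier"),
  limit: z.number().int().positive().max(1000).optional().describe("Optional result limit"),
});

// Inferred TypeScript type for use in the tool handler
export type YourFeatureInput = z.infer<typeof YourFeatureInputSchema>;
```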
|
||||
32
use-cases/mcp-server/PRPs/INITIAL.md
Normal file
@@ -0,0 +1,32 @@
|
||||
## FEATURE:
|
||||
|
||||
We want to create an MCP server using this repo's template.

The goal of the MCP server is to create a simple version of taskmaster mcp that parses PRPs instead of PRDs.

Additional features:

- LLM-powered PRP information extraction using Anthropic
- CRUD operations on tasks, documentation, tags, etc. to and from the DB

We need a tool for parsing PRPs: it should take a filled PRP, use Anthropic to extract the tasks, and save them to the DB, including surrounding documentation from the PRP such as the goals, whats and whys, target users, etc.
|
||||
|
||||
We need:
|
||||
|
||||
- To be able to perform CRUD operations on tasks, documentation, tags, etc.
- A task fetch tool to get the tasks from the DB
- To be able to list all tasks
- To be able to add information to a task
- To be able to fetch the additional documentation from the DB
- To be able to modify the additional documentation
- DB tables need to be updated to match our new data models
|
||||
|
||||
## EXAMPLES & DOCUMENTATION:
|
||||
|
||||
All examples are already referenced in prp_mcp_base.md - do any additional research as needed.
|
||||
|
||||
## OTHER CONSIDERATIONS:
|
||||
|
||||
- Do not use complex regex or complex parsing patterns; we use an LLM to parse PRPs.
|
||||
- Model and API key for Anthropic both need to be environment variables - these are set up in .dev.vars.example
|
||||
- It's very important that we create one task per file to keep concerns separate
|
||||
44
use-cases/mcp-server/PRPs/README.md
Normal file
@@ -0,0 +1,44 @@
|
||||
# Product Requirement Prompt (PRP) Concept
|
||||
|
||||
"Over-specifying what to build while under-specifying the context, and how to build it, is why so many AI-driven coding attempts stall at 80%. A Product Requirement Prompt (PRP) fixes that by fusing the disciplined scope of a classic Product Requirements Document (PRD) with the “context-is-king” mindset of modern prompt engineering."
|
||||
|
||||
## What is a PRP?
|
||||
|
||||
Product Requirement Prompt (PRP)
|
||||
A PRP is a structured prompt that supplies an AI coding agent with everything it needs to deliver a vertical slice of working software—no more, no less.
|
||||
|
||||
### How it differs from a PRD
|
||||
|
||||
A traditional PRD clarifies what the product must do and why customers need it, but deliberately avoids how it will be built.
|
||||
|
||||
A PRP keeps the goal and justification sections of a PRD yet adds three AI-critical layers:
|
||||
|
||||
### Context
|
||||
|
||||
- Precise file paths and content, library versions and library context, and code snippet examples. LLMs generate higher-quality code when given direct, in-prompt references instead of broad descriptions. Use of an ai_docs/ directory to pipe in library and other docs.
|
||||
|
||||
### Implementation Details and Strategy
|
||||
|
||||
- In contrast to a traditional PRD, a PRP explicitly states how the product will be built. This includes which API endpoints, test runners, or agent patterns (ReAct, Plan-and-Execute) to use, along with type hints, dependencies, architectural patterns, and other tools to ensure the code is built correctly.
|
||||
|
||||
### Validation Gates
|
||||
|
||||
- Deterministic checks such as pytest, ruff, or static type passes. “Shift-left” quality controls catch defects early and are cheaper than late rework.
  Example: each new function should be individually tested; validation gate = all tests pass.
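
For a TypeScript/Cloudflare Workers codebase like this template, the equivalent deterministic gates might be (a sketch only; each PRP defines its own gates):

```bash
npm run type-check          # static type pass
npx vitest                  # unit tests (if configured)
wrangler deploy --dry-run   # deployment configuration check
```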
|
||||
|
||||
### Why the PRP Layer Exists
|
||||
|
||||
- The PRP folder is used to prepare and pipe PRPs to the agentic coder.
|
||||
|
||||
## Why context is non-negotiable
|
||||
|
||||
Large language model outputs are bounded by their context window; irrelevant or missing context literally squeezes out useful tokens.

The industry mantra “Garbage In → Garbage Out” applies doubly to prompt engineering, and especially to agentic engineering: sloppy input yields brittle code.
|
||||
|
||||
## In short
|
||||
|
||||
A PRP is PRD + curated codebase intelligence + agent/runbook—the minimum viable packet an AI needs to plausibly ship production-ready code on the first pass.
|
||||
|
||||
The PRP can be small, focusing on a single task, or large, covering multiple tasks.
|
||||
The true power of PRP is in the ability to chain tasks together in a PRP to build, self-validate and ship complex features.
|
||||
28
use-cases/mcp-server/PRPs/ai_docs/claude_api_usage.md
Normal file
@@ -0,0 +1,28 @@
|
||||
### Example usage of the Anthropic API for Claude (model and API key are both environment variables)

```typescript
const response = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': this.apiKey,
    'anthropic-version': '2023-06-01'
  },
  body: JSON.stringify({
    model: this.model,
    max_tokens: 3000,
    messages: [{
      role: 'user',
      content: this.buildPRPParsingPrompt(prpContent, projectContext, config)
    }]
  })
});

if (!response.ok) {
  throw new Error(`Anthropic API error: ${response.status} ${response.statusText}`);
}

const result = await response.json();
const content = (result as any).content[0].text;

// Parse the JSON response
const aiTasks = JSON.parse(content);
```
491
use-cases/mcp-server/PRPs/ai_docs/mcp_patterns.md
Normal file
@@ -0,0 +1,491 @@
|
||||
# MCP Server Development Patterns
|
||||
|
||||
This document contains proven patterns for developing Model Context Protocol (MCP) servers using TypeScript and Cloudflare Workers, based on the implementation in this codebase.
|
||||
|
||||
## Core MCP Server Architecture
|
||||
|
||||
### Base Server Class Pattern
|
||||
|
||||
```typescript
|
||||
import { McpAgent } from "agents/mcp";
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import { z } from "zod";
|
||||
|
||||
// Authentication props from OAuth flow
|
||||
type Props = {
|
||||
login: string;
|
||||
name: string;
|
||||
email: string;
|
||||
accessToken: string;
|
||||
};
|
||||
|
||||
export class CustomMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
server = new McpServer({
|
||||
name: "Your MCP Server Name",
|
||||
version: "1.0.0",
|
||||
});
|
||||
|
||||
// CRITICAL: Implement cleanup for Durable Objects
|
||||
async cleanup(): Promise<void> {
|
||||
try {
|
||||
// Close database connections
|
||||
await closeDb();
|
||||
console.log('Database connections closed successfully');
|
||||
} catch (error) {
|
||||
console.error('Error during database cleanup:', error);
|
||||
}
|
||||
}
|
||||
|
||||
// CRITICAL: Durable Objects alarm handler
|
||||
async alarm(): Promise<void> {
|
||||
await this.cleanup();
|
||||
}
|
||||
|
||||
// Initialize all tools and resources
|
||||
async init() {
|
||||
// Register tools here
|
||||
this.registerTools();
|
||||
|
||||
// Register resources if needed
|
||||
this.registerResources();
|
||||
}
|
||||
|
||||
private registerTools() {
|
||||
// Tool registration logic
|
||||
}
|
||||
|
||||
private registerResources() {
|
||||
// Resource registration logic
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Tool Registration Pattern
|
||||
|
||||
```typescript
|
||||
// Basic tool registration
|
||||
this.server.tool(
|
||||
"toolName",
|
||||
"Tool description for the LLM",
|
||||
{
|
||||
param1: z.string().describe("Parameter description"),
|
||||
param2: z.number().optional().describe("Optional parameter"),
|
||||
},
|
||||
async ({ param1, param2 }) => {
|
||||
try {
|
||||
// Tool implementation
|
||||
const result = await performOperation(param1, param2);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Success: ${JSON.stringify(result, null, 2)}`
|
||||
}
|
||||
]
|
||||
};
|
||||
} catch (error) {
|
||||
console.error('Tool error:', error);
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Error: ${error.message}`,
|
||||
isError: true
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
}
|
||||
);
|
||||
```
|
||||
|
||||
### Conditional Tool Registration (Based on Permissions)
|
||||
|
||||
```typescript
|
||||
// Permission-based tool availability
|
||||
const ALLOWED_USERNAMES = new Set<string>([
|
||||
'admin1',
|
||||
'admin2'
|
||||
]);
|
||||
|
||||
// Register privileged tools only for authorized users
|
||||
if (ALLOWED_USERNAMES.has(this.props.login)) {
|
||||
this.server.tool(
|
||||
"privilegedTool",
|
||||
"Tool only available to authorized users",
|
||||
{ /* parameters */ },
|
||||
async (params) => {
|
||||
// Privileged operation
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Privileged operation executed by: ${this.props.login}`
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
);
|
||||
}
|
||||
```
|
||||
|
||||
## Database Integration Patterns
|
||||
|
||||
### Database Connection Pattern
|
||||
|
||||
```typescript
|
||||
import { withDatabase, validateSqlQuery, isWriteOperation, formatDatabaseError } from "./database";
|
||||
|
||||
// Database operation with connection management
|
||||
async function performDatabaseOperation(sql: string) {
|
||||
try {
|
||||
// Validate SQL query
|
||||
const validation = validateSqlQuery(sql);
|
||||
if (!validation.isValid) {
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Invalid SQL query: ${validation.error}`,
|
||||
isError: true
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
|
||||
// Execute with automatic connection management
|
||||
return await withDatabase(this.env.DATABASE_URL, async (db) => {
|
||||
const results = await db.unsafe(sql);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Query Results**\n\`\`\`sql\n${sql}\n\`\`\`\n\n**Results:**\n\`\`\`json\n${JSON.stringify(results, null, 2)}\n\`\`\`\n\n**Rows returned:** ${Array.isArray(results) ? results.length : 1}`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Database operation error:', error);
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Database error: ${formatDatabaseError(error)}`,
|
||||
isError: true
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
### Read vs Write Operation Handling
|
||||
|
||||
```typescript
|
||||
// Check if operation is read-only
|
||||
if (isWriteOperation(sql)) {
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: "Write operations are not allowed with this tool. Use the privileged tool if you have write permissions.",
|
||||
isError: true
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
```
|
||||
|
||||
## Authentication & Authorization Patterns
|
||||
|
||||
### OAuth Integration Pattern
|
||||
|
||||
```typescript
|
||||
import OAuthProvider from "@cloudflare/workers-oauth-provider";
|
||||
import { GitHubHandler } from "./github-handler";
|
||||
|
||||
// OAuth configuration
|
||||
export default new OAuthProvider({
|
||||
apiHandlers: {
|
||||
'/sse': MyMCP.serveSSE('/sse') as any,
|
||||
'/mcp': MyMCP.serve('/mcp') as any,
|
||||
},
|
||||
authorizeEndpoint: "/authorize",
|
||||
clientRegistrationEndpoint: "/register",
|
||||
defaultHandler: GitHubHandler as any,
|
||||
tokenEndpoint: "/token",
|
||||
});
|
||||
```
|
||||
|
||||
### User Permission Checking
|
||||
|
||||
```typescript
|
||||
// Permission validation pattern
|
||||
function hasPermission(username: string, operation: string): boolean {
|
||||
const WRITE_PERMISSIONS = new Set(['admin1', 'admin2']);
|
||||
const READ_PERMISSIONS = new Set(['user1', 'user2', ...WRITE_PERMISSIONS]);
|
||||
|
||||
switch (operation) {
|
||||
case 'read':
|
||||
return READ_PERMISSIONS.has(username);
|
||||
case 'write':
|
||||
return WRITE_PERMISSIONS.has(username);
|
||||
default:
|
||||
return false;
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
## Error Handling Patterns
|
||||
|
||||
### Standardized Error Response
|
||||
|
||||
```typescript
|
||||
// Error response pattern
|
||||
function createErrorResponse(error: Error, operation: string) {
|
||||
console.error(`${operation} error:`, error);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `${operation} failed: ${error.message}`,
|
||||
isError: true
|
||||
}
|
||||
]
|
||||
};
|
||||
}
|
||||
```

### Database Error Formatting

```typescript
// Use the built-in database error formatter
import { formatDatabaseError } from "./database";

try {
  // Database operation
} catch (error) {
  return {
    content: [
      {
        type: "text",
        text: `Database error: ${formatDatabaseError(error)}`,
        isError: true
      }
    ]
  };
}
```

## Resource Registration Patterns

### Basic Resource Pattern

```typescript
// Resource registration
this.server.resource(
  "resource://example/{id}",
  "Resource description",
  async (uri) => {
    const id = uri.pathname.split('/').pop();

    try {
      const data = await fetchResourceData(id);

      return {
        contents: [
          {
            uri: uri.href,
            mimeType: "application/json",
            text: JSON.stringify(data, null, 2)
          }
        ]
      };
    } catch (error) {
      throw new Error(`Failed to fetch resource: ${error.message}`);
    }
  }
);
```

## Testing Patterns

### Tool Testing Pattern

```typescript
// Test tool functionality
async function testTool(toolName: string, params: any) {
  try {
    const result = await server.callTool(toolName, params);
    console.log(`${toolName} test passed:`, result);
    return true;
  } catch (error) {
    console.error(`${toolName} test failed:`, error);
    return false;
  }
}
```

### Database Connection Testing

```typescript
// Test database connectivity
async function testDatabaseConnection() {
  try {
    await withDatabase(process.env.DATABASE_URL, async (db) => {
      const result = await db`SELECT 1 as test`;
      console.log('Database connection test passed:', result);
    });
    return true;
  } catch (error) {
    console.error('Database connection test failed:', error);
    return false;
  }
}
```
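
Because the project already includes Vitest (`npm run test`), the ad-hoc checks above can also be written as regular unit tests. A minimal sketch, assuming a pure helper such as `isWriteOperation` is exported from `./database` (adjust the import path to your layout):

```typescript
// Hypothetical Vitest unit test for a pure helper function.
import { describe, expect, it } from "vitest";
import { isWriteOperation } from "./database";

describe("isWriteOperation", () => {
  it("flags mutating statements", () => {
    expect(isWriteOperation("DELETE FROM users")).toBe(true);
  });

  it("allows read-only queries", () => {
    expect(isWriteOperation("SELECT * FROM users")).toBe(false);
  });
});
```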

## Security Best Practices

### Input Validation

```typescript
// Always validate inputs with Zod
const inputSchema = z.object({
  query: z.string().min(1).max(1000),
  parameters: z.array(z.string()).optional()
});

// In tool handler
try {
  const validated = inputSchema.parse(params);
  // Use validated data
} catch (error) {
  return createErrorResponse(error, "Input validation");
}
```

### SQL Injection Prevention

```typescript
// Use the built-in SQL validation
import { validateSqlQuery } from "./database";

const validation = validateSqlQuery(sql);
if (!validation.isValid) {
  return createErrorResponse(new Error(validation.error), "SQL validation");
}
```
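
`validateSqlQuery()` is provided by the codebase and its exact rules are not documented here. A simplified, hypothetical sketch of what such a validator might check (non-empty input, a length cap, and a blocklist of obviously dangerous constructs):

```typescript
// Simplified sketch only - the real validator in src/database.ts may apply stricter rules.
interface SqlValidationResult {
  isValid: boolean;
  error?: string;
}

const DANGEROUS_PATTERNS = [/;\s*--/, /\bdrop\s+database\b/i, /\bpg_sleep\b/i];

function validateSqlQuery(sql: string): SqlValidationResult {
  if (!sql || sql.trim().length === 0) {
    return { isValid: false, error: "SQL query cannot be empty" };
  }
  if (sql.length > 10_000) {
    return { isValid: false, error: "SQL query is too long" };
  }
  for (const pattern of DANGEROUS_PATTERNS) {
    if (pattern.test(sql)) {
      return { isValid: false, error: "SQL query contains a disallowed pattern" };
    }
  }
  return { isValid: true };
}
```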

### Access Control

```typescript
// Always check permissions before executing sensitive operations
if (!hasPermission(this.props.login, 'write')) {
  return {
    content: [
      {
        type: "text",
        text: "Access denied: insufficient permissions",
        isError: true
      }
    ]
  };
}
```

## Performance Patterns

### Connection Pooling

```typescript
// Use the built-in connection pooling
import { withDatabase } from "./database";

// The withDatabase function handles connection pooling automatically
await withDatabase(databaseUrl, async (db) => {
  // Database operations
});
```
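
The internals of `withDatabase()` are not shown in this document. As an illustration of the idea (acquire a client, run the callback, always release the connection), a sketch built on the `postgres` package might look like the following; the real helper in `src/database.ts` may cache and reuse a client instead:

```typescript
// Illustrative sketch of a withDatabase-style wrapper using the `postgres` package.
import postgres from "postgres";

async function withDatabase<T>(
  databaseUrl: string,
  operation: (db: postgres.Sql) => Promise<T>,
): Promise<T> {
  const db = postgres(databaseUrl, { max: 5 }); // small pool suited to Workers
  try {
    return await operation(db);
  } finally {
    await db.end({ timeout: 5 }); // release connections even if the operation throws
  }
}
```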

### Resource Cleanup

```typescript
// Implement proper cleanup in Durable Objects
async cleanup(): Promise<void> {
  try {
    // Close database connections
    await closeDb();

    // Clean up other resources
    await cleanupResources();

    console.log('Cleanup completed successfully');
  } catch (error) {
    console.error('Cleanup error:', error);
  }
}
```
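
Cleanup is typically triggered by the Durable Object alarm, so it still runs when a session is abandoned. A minimal sketch of scheduling that alarm, assuming you can reach the Durable Object storage (the exact property depends on the agent base class you extend):

```typescript
// Sketch only: schedule an alarm so the alarm() handler can call cleanup() later.
const ONE_HOUR_MS = 60 * 60 * 1000;

async function scheduleCleanup(storage: DurableObjectStorage): Promise<void> {
  const existing = await storage.getAlarm();
  if (existing === null) {
    await storage.setAlarm(Date.now() + ONE_HOUR_MS);
  }
}
```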

## Common Gotchas

### 1. Missing Cleanup Implementation
- Always implement the `cleanup()` method in Durable Objects
- Handle database connection cleanup properly
- Set up an alarm handler for automatic cleanup

### 2. SQL Injection Vulnerabilities
- Always use `validateSqlQuery()` before executing SQL
- Never concatenate user input directly into SQL strings
- Use parameterized queries when possible

### 3. Permission Bypasses
- Check permissions for every sensitive operation
- Don't rely on tool registration alone for security
- Always validate user identity from props

### 4. Error Information Leakage
- Use `formatDatabaseError()` to sanitize error messages
- Don't expose internal system details in error responses
- Log detailed errors server-side, return generic messages to the client

### 5. Resource Leaks
- Always use `withDatabase()` for database operations
- Implement proper error handling in async operations
- Clean up resources in `finally` blocks (see the sketch below)
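
A minimal sketch of the `finally`-block pattern from gotcha 5, using a hypothetical resource handle purely for illustration:

```typescript
// The Resource interface here is hypothetical; it only illustrates the cleanup pattern.
interface Resource {
  use(): Promise<string>;
  release(): Promise<void>;
}

async function withResource(acquire: () => Promise<Resource>): Promise<string> {
  const resource = await acquire();
  try {
    return await resource.use();
  } finally {
    await resource.release(); // runs whether use() resolves or throws
  }
}
```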

## Environment Configuration

### Required Environment Variables

```typescript
// Environment type definition
interface Env {
  DATABASE_URL: string;
  GITHUB_CLIENT_ID: string;
  GITHUB_CLIENT_SECRET: string;
  OAUTH_KV: KVNamespace;
  // Add other bindings as needed
}
```
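
A small startup check can make a missing variable or binding fail fast instead of surfacing as a confusing error deep inside a tool call. A sketch, assuming the `Env` interface above:

```typescript
// Sketch: fail fast when a required environment variable or binding is missing.
const REQUIRED_VARS = ["DATABASE_URL", "GITHUB_CLIENT_ID", "GITHUB_CLIENT_SECRET"] as const;

function assertEnv(env: Env): void {
  for (const name of REQUIRED_VARS) {
    if (!env[name]) {
      throw new Error(`Missing required environment variable: ${name}`);
    }
  }
  if (!env.OAUTH_KV) {
    throw new Error("Missing required KV binding: OAUTH_KV");
  }
}
```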

### Wrangler Configuration Pattern

```toml
# wrangler.toml
name = "mcp-server"
main = "src/index.ts"
compatibility_date = "2024-01-01"

[[kv_namespaces]]
binding = "OAUTH_KV"
id = "your-kv-namespace-id"

[env.production]
# Production-specific configuration
```

This document provides the core patterns for building secure, scalable MCP servers using the proven architecture in this codebase.

538
use-cases/mcp-server/PRPs/templates/prp_mcp_base.md
Normal file
@ -0,0 +1,538 @@
|
||||
---
|
||||
name: "MCP Server PRP Template"
|
||||
description: This template is designed to provide a production-ready Model Context Protocol (MCP) server using the proven patterns from this codebase.
|
||||
---
|
||||
|
||||
## Purpose
|
||||
|
||||
Template optimized for AI agents to implement production-ready Model Context Protocol (MCP) servers with GitHub OAuth authentication, database integration, and Cloudflare Workers deployment using the proven patterns from this codebase.
|
||||
|
||||
## Core Principles
|
||||
|
||||
1. **Context is King**: Include ALL necessary MCP patterns, authentication flows, and deployment configurations
|
||||
2. **Validation Loops**: Provide executable tests from TypeScript compilation to production deployment
|
||||
3. **Security First**: Build in authentication, authorization, and SQL injection protection
|
||||
4. **Production Ready**: Include monitoring, error handling, and deployment automation
|
||||
|
||||
---
|
||||
|
||||
## Goal
|
||||
|
||||
Build a production-ready MCP (Model Context Protocol) server with:
|
||||
|
||||
- [SPECIFIC MCP FUNCTIONALITY] - describe the specific tools and resources to implement
|
||||
- GitHub OAuth authentication with role-based access control
|
||||
- Cloudflare Workers deployment with monitoring
|
||||
- [ADDITIONAL FEATURES] - any specific features beyond the base authentication/database
|
||||
|
||||
## Why
|
||||
|
||||
- **Developer Productivity**: Enable secure AI assistant access to [SPECIFIC DATA/OPERATIONS]
|
||||
- **Enterprise Security**: GitHub OAuth with granular permission system
|
||||
- **Scalability**: Cloudflare Workers global edge deployment
|
||||
- **Integration**: [HOW THIS FITS WITH EXISTING SYSTEMS]
|
||||
- **User Value**: [SPECIFIC BENEFITS TO END USERS]
|
||||
|
||||
## What
|
||||
|
||||
### MCP Server Features
|
||||
|
||||
**Core MCP Tools:**
|
||||
|
||||
- Tools are organized in modular files and registered via `src/tools/register-tools.ts`
|
||||
- Each feature/domain gets its own tool registration file (e.g., `database-tools.ts`, `analytics-tools.ts`)
|
||||
- [LIST SPECIFIC TOOLS] - e.g., "queryDatabase", "listTables", "executeOperations"
|
||||
- User authentication and permission validation happens during tool registration
|
||||
- Comprehensive error handling and logging
|
||||
- [DOMAIN-SPECIFIC TOOLS] - tools specific to your use case
|
||||
|
||||
**Authentication & Authorization:**
|
||||
|
||||
- GitHub OAuth 2.0 integration with signed cookie approval system
|
||||
- Role-based access control (read-only vs privileged users)
|
||||
- User context propagation to all MCP tools
|
||||
- Secure session management with HMAC-signed cookies
|
||||
|
||||
**Database Integration:**
|
||||
|
||||
- PostgreSQL connection pooling with automatic cleanup
|
||||
- SQL injection protection and query validation
|
||||
- Read/write operation separation based on user permissions
|
||||
- Error sanitization to prevent information leakage
|
||||
|
||||
**Deployment & Monitoring:**
|
||||
|
||||
- Cloudflare Workers with Durable Objects for state management
|
||||
- Optional Sentry integration for error tracking and performance monitoring
|
||||
- Environment-based configuration (development vs production)
|
||||
- Real-time logging and alerting
|
||||
|
||||
### Success Criteria
|
||||
|
||||
- [ ] MCP server passes validation with MCP Inspector
|
||||
- [ ] GitHub OAuth flow works end-to-end (authorization → callback → MCP access)
|
||||
- [ ] TypeScript compilation succeeds with no errors
|
||||
- [ ] Local development server starts and responds correctly
|
||||
- [ ] Production deployment to Cloudflare Workers succeeds
|
||||
- [ ] Authentication prevents unauthorized access to sensitive operations
|
||||
- [ ] Error handling provides user-friendly messages without leaking system details
|
||||
- [ ] [DOMAIN-SPECIFIC SUCCESS CRITERIA]
|
||||
|
||||
## All Needed Context
|
||||
|
||||
### Documentation & References (MUST READ)
|
||||
|
||||
```yaml
|
||||
# CRITICAL MCP PATTERNS - Read these first
|
||||
- docfile: PRPs/ai_docs/mcp_patterns.md
|
||||
why: Core MCP development patterns, security practices, and error handling
|
||||
|
||||
# Critical code examples
|
||||
- docfile: PRPs/ai_docs/claude_api_usage.md
|
||||
why: How to use the Anthropic API to get a response from an LLM
|
||||
|
||||
# TOOL REGISTRATION SYSTEM - Understand the modular approach
|
||||
- file: src/tools/register-tools.ts
|
||||
why: Central registry showing how all tools are imported and registered - STUDY this pattern
|
||||
|
||||
# EXAMPLE MCP TOOLS - Look here how to create and register new tools
|
||||
- file: examples/database-tools.ts
|
||||
why: Example tools for a Postgres MCP server showing best practices for tool creation and registration
|
||||
|
||||
- file: examples/database-tools-sentry.ts
|
||||
why: Example tools for the Postgres MCP server but with the Sentry integration for production monitoring
|
||||
|
||||
# EXISTING CODEBASE PATTERNS - Study these implementations
|
||||
- file: src/index.ts
|
||||
why: Complete MCP server with authentication, database, and tools - MIRROR this pattern
|
||||
|
||||
- file: src/github-handler.ts
|
||||
why: OAuth flow implementation - USE this exact pattern for authentication
|
||||
|
||||
- file: src/database.ts
|
||||
why: Database security, connection pooling, SQL validation - FOLLOW these patterns
|
||||
|
||||
- file: wrangler.jsonc
|
||||
why: Cloudflare Workers configuration - COPY this pattern for deployment
|
||||
|
||||
# OFFICIAL MCP DOCUMENTATION
|
||||
- url: https://modelcontextprotocol.io/docs/concepts/tools
|
||||
why: MCP tool registration and schema definition patterns
|
||||
|
||||
- url: https://modelcontextprotocol.io/docs/concepts/resources
|
||||
why: MCP resource implementation if needed
|
||||
|
||||
# Add any documentation related to the user's use case as needed below
|
||||
```
|
||||
|
||||
### Current Codebase Tree (Run `tree -I node_modules` in project root)
|
||||
|
||||
```bash
|
||||
# INSERT ACTUAL TREE OUTPUT HERE
|
||||
/
|
||||
├── src/
|
||||
│ ├── index.ts # Main authenticated MCP server ← STUDY THIS
|
||||
│ ├── index_sentry.ts # Sentry monitoring version
|
||||
│ ├── simple-math.ts # Basic MCP example ← GOOD STARTING POINT
|
||||
│ ├── github-handler.ts # OAuth implementation ← USE THIS PATTERN
|
||||
│ ├── database.ts # Database utilities ← SECURITY PATTERNS
|
||||
│ ├── utils.ts # OAuth helpers
|
||||
│ ├── workers-oauth-utils.ts # Cookie security system
|
||||
│ └── tools/ # Tool registration system
|
||||
│ └── register-tools.ts # Central tool registry ← UNDERSTAND THIS
|
||||
├── PRPs/
|
||||
│ ├── templates/prp_mcp_base.md # This template
|
||||
│ └── ai_docs/ # Implementation guides ← READ ALL
|
||||
├── examples/ # Example tool implementations
|
||||
│ ├── database-tools.ts # Database tools example ← FOLLOW PATTERN
|
||||
│ └── database-tools-sentry.ts # With Sentry monitoring
|
||||
├── wrangler.jsonc # Cloudflare config ← COPY PATTERNS
|
||||
├── package.json # Dependencies
|
||||
└── tsconfig.json # TypeScript config
|
||||
```
|
||||
|
||||
### Desired Codebase Tree (files to add or modify for the user's use case)
|
||||
|
||||
```bash
|
||||
|
||||
```
|
||||
|
||||
### Known Gotchas & Critical MCP/Cloudflare Patterns
|
||||
|
||||
```typescript
|
||||
// CRITICAL: Cloudflare Workers require specific patterns
|
||||
// 1. ALWAYS implement cleanup for Durable Objects
|
||||
export class YourMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
async cleanup(): Promise<void> {
|
||||
await closeDb(); // CRITICAL: Close database connections
|
||||
}
|
||||
|
||||
async alarm(): Promise<void> {
|
||||
await this.cleanup(); // CRITICAL: Handle Durable Object alarms
|
||||
}
|
||||
}
|
||||
|
||||
// 2. ALWAYS validate SQL to prevent injection (use existing patterns)
|
||||
const validation = validateSqlQuery(sql); // from src/database.ts
|
||||
if (!validation.isValid) {
|
||||
return createErrorResponse(validation.error);
|
||||
}
|
||||
|
||||
// 3. ALWAYS check permissions before sensitive operations
|
||||
const ALLOWED_USERNAMES = new Set(["admin1", "admin2"]);
|
||||
if (!ALLOWED_USERNAMES.has(this.props.login)) {
|
||||
return createErrorResponse("Insufficient permissions");
|
||||
}
|
||||
|
||||
// 4. ALWAYS use withDatabase wrapper for connection management
|
||||
return await withDatabase(this.env.DATABASE_URL, async (db) => {
|
||||
// Database operations here
|
||||
});
|
||||
|
||||
// 5. ALWAYS use Zod for input validation
|
||||
import { z } from "zod";
|
||||
const schema = z.object({
|
||||
param: z.string().min(1).max(100),
|
||||
});
|
||||
|
||||
// 6. TypeScript compilation requires exact interface matching
|
||||
interface Env {
|
||||
DATABASE_URL: string;
|
||||
GITHUB_CLIENT_ID: string;
|
||||
GITHUB_CLIENT_SECRET: string;
|
||||
OAUTH_KV: KVNamespace;
|
||||
// Add your environment variables here
|
||||
}
|
||||
```
|
||||
|
||||
## Implementation Blueprint
|
||||
|
||||
### Data Models & Types
|
||||
|
||||
Define TypeScript interfaces and Zod schemas for type safety and validation.
|
||||
|
||||
```typescript
|
||||
// User authentication props (inherited from OAuth)
|
||||
type Props = {
|
||||
login: string; // GitHub username
|
||||
name: string; // Display name
|
||||
email: string; // Email address
|
||||
accessToken: string; // GitHub access token
|
||||
};
|
||||
|
||||
// MCP tool input schemas (customize for your tools)
|
||||
const YourToolSchema = z.object({
|
||||
param1: z.string().min(1, "Parameter cannot be empty"),
|
||||
param2: z.number().int().positive().optional(),
|
||||
options: z.object({}).optional(),
|
||||
});
|
||||
|
||||
// Environment interface (add your variables)
|
||||
interface Env {
|
||||
DATABASE_URL: string;
|
||||
GITHUB_CLIENT_ID: string;
|
||||
GITHUB_CLIENT_SECRET: string;
|
||||
OAUTH_KV: KVNamespace;
|
||||
// YOUR_SPECIFIC_ENV_VAR: string;
|
||||
}
|
||||
|
||||
// Permission levels (customize for your use case)
|
||||
enum Permission {
|
||||
READ = "read",
|
||||
WRITE = "write",
|
||||
ADMIN = "admin",
|
||||
}
|
||||
```
|
||||
|
||||
### List of Tasks (Complete in order)
|
||||
|
||||
```yaml
|
||||
Task 1 - Project Setup:
|
||||
COPY wrangler.jsonc to wrangler-[server-name].jsonc:
|
||||
- MODIFY name field to "[server-name]"
|
||||
- ADD any new environment variables to vars section
|
||||
- KEEP existing OAuth and database configuration
|
||||
|
||||
CREATE .dev.vars file (if not exists):
|
||||
- ADD GITHUB_CLIENT_ID=your_client_id
|
||||
- ADD GITHUB_CLIENT_SECRET=your_client_secret
|
||||
- ADD DATABASE_URL=postgresql://...
|
||||
- ADD COOKIE_ENCRYPTION_KEY=your_32_byte_key
|
||||
- ADD any domain-specific environment variables
|
||||
|
||||
Task 2 - GitHub OAuth App:
|
||||
CREATE new GitHub OAuth app:
|
||||
- SET homepage URL: https://your-worker.workers.dev
|
||||
- SET callback URL: https://your-worker.workers.dev/callback
|
||||
- COPY client ID and secret to .dev.vars
|
||||
|
||||
OR REUSE existing OAuth app:
|
||||
- UPDATE callback URL if using different subdomain
|
||||
- VERIFY client ID and secret in environment
|
||||
|
||||
Task 3 - MCP Server Implementation:
|
||||
CREATE src/[server-name].ts OR MODIFY src/index.ts:
|
||||
- COPY class structure from src/index.ts
|
||||
- MODIFY server name and version in McpServer constructor
|
||||
- CALL registerAllTools(server, env, props) in init() method
|
||||
- KEEP authentication and database patterns identical
|
||||
|
||||
CREATE tool modules:
|
||||
- CREATE new tool files following examples/database-tools.ts pattern
|
||||
- EXPORT registration functions that accept (server, env, props)
|
||||
- USE Zod schemas for input validation
|
||||
- IMPLEMENT proper error handling with createErrorResponse
|
||||
- ADD permission checking during tool registration
|
||||
|
||||
UPDATE tool registry:
|
||||
- MODIFY src/tools/register-tools.ts to import your new tools
|
||||
- ADD your registration function call in registerAllTools()
|
||||
|
||||
Task 4 - Database Integration (if needed):
|
||||
USE existing database patterns from src/database.ts:
|
||||
- IMPORT withDatabase, validateSqlQuery, isWriteOperation
|
||||
- IMPLEMENT database operations with security validation
|
||||
- SEPARATE read vs write operations based on user permissions
|
||||
- USE formatDatabaseError for user-friendly error messages
|
||||
|
||||
Task 5 - Environment Configuration:
|
||||
SETUP Cloudflare KV namespace:
|
||||
- RUN: wrangler kv namespace create "OAUTH_KV"
|
||||
- UPDATE wrangler.jsonc with returned namespace ID
|
||||
|
||||
SET production secrets:
|
||||
- RUN: wrangler secret put GITHUB_CLIENT_ID
|
||||
- RUN: wrangler secret put GITHUB_CLIENT_SECRET
|
||||
- RUN: wrangler secret put DATABASE_URL
|
||||
- RUN: wrangler secret put COOKIE_ENCRYPTION_KEY
|
||||
|
||||
Task 6 - Local Testing:
|
||||
TEST basic functionality:
|
||||
- RUN: wrangler dev
|
||||
- VERIFY server starts without errors
|
||||
- TEST OAuth flow: http://localhost:8792/authorize
|
||||
- VERIFY MCP endpoint: http://localhost:8792/mcp
|
||||
|
||||
Task 7 - Production Deployment:
|
||||
DEPLOY to Cloudflare Workers:
|
||||
- RUN: wrangler deploy
|
||||
- VERIFY deployment success
|
||||
- TEST production OAuth flow
|
||||
- VERIFY MCP endpoint accessibility
|
||||
```
|
||||
|
||||
### Per Task Implementation Details
|
||||
|
||||
```typescript
|
||||
// Task 3 - MCP Server Implementation Pattern
|
||||
export class YourMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
server = new McpServer({
|
||||
name: "Your MCP Server Name",
|
||||
version: "1.0.0",
|
||||
});
|
||||
|
||||
// CRITICAL: Always implement cleanup
|
||||
async cleanup(): Promise<void> {
|
||||
try {
|
||||
await closeDb();
|
||||
console.log("Database connections closed successfully");
|
||||
} catch (error) {
|
||||
console.error("Error during database cleanup:", error);
|
||||
}
|
||||
}
|
||||
|
||||
async alarm(): Promise<void> {
|
||||
await this.cleanup();
|
||||
}
|
||||
|
||||
async init() {
|
||||
// PATTERN: Use centralized tool registration
|
||||
registerAllTools(this.server, this.env, this.props);
|
||||
}
|
||||
}
|
||||
|
||||
// Task 3 - Tool Module Pattern (e.g., src/tools/your-feature-tools.ts)
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import { Props } from "../types";
|
||||
import { z } from "zod";
|
||||
|
||||
const PRIVILEGED_USERS = new Set(["admin1", "admin2"]);
|
||||
|
||||
export function registerYourFeatureTools(server: McpServer, env: Env, props: Props) {
|
||||
// Tool 1: Available to all authenticated users
|
||||
server.tool(
|
||||
"yourBasicTool",
|
||||
"Description of your basic tool",
|
||||
YourToolSchema, // Zod validation schema
|
||||
async ({ param1, param2, options }) => {
|
||||
try {
|
||||
// PATTERN: Tool implementation with error handling
|
||||
const result = await performOperation(param1, param2, options);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Success**\n\nOperation completed\n\n**Result:**\n\`\`\`json\n${JSON.stringify(result, null, 2)}\n\`\`\``,
|
||||
},
|
||||
],
|
||||
};
|
||||
} catch (error) {
|
||||
return createErrorResponse(`Operation failed: ${error.message}`);
|
||||
}
|
||||
},
|
||||
);
|
||||
|
||||
// Tool 2: Only for privileged users
|
||||
if (PRIVILEGED_USERS.has(props.login)) {
|
||||
server.tool(
|
||||
"privilegedTool",
|
||||
"Administrative tool for privileged users",
|
||||
{ action: z.string() },
|
||||
async ({ action }) => {
|
||||
// Implementation
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `Admin action '${action}' executed by ${props.login}`,
|
||||
},
|
||||
],
|
||||
};
|
||||
},
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
// Task 3 - Update Tool Registry (src/tools/register-tools.ts)
|
||||
import { registerYourFeatureTools } from "./your-feature-tools";
|
||||
|
||||
export function registerAllTools(server: McpServer, env: Env, props: Props) {
|
||||
// Existing registrations
|
||||
registerDatabaseTools(server, env, props);
|
||||
|
||||
// Add your new registration
|
||||
registerYourFeatureTools(server, env, props);
|
||||
}
|
||||
|
||||
// PATTERN: Export OAuth provider with MCP endpoints
|
||||
export default new OAuthProvider({
|
||||
apiHandlers: {
|
||||
"/sse": YourMCP.serveSSE("/sse") as any,
|
||||
"/mcp": YourMCP.serve("/mcp") as any,
|
||||
},
|
||||
authorizeEndpoint: "/authorize",
|
||||
clientRegistrationEndpoint: "/register",
|
||||
defaultHandler: GitHubHandler as any,
|
||||
tokenEndpoint: "/token",
|
||||
});
|
||||
```
|
||||
|
||||
### Integration Points
|
||||
|
||||
```yaml
|
||||
CLOUDFLARE_WORKERS:
|
||||
- wrangler.jsonc: Update name, environment variables, KV bindings
|
||||
- Environment secrets: GitHub OAuth credentials, database URL, encryption key
|
||||
- Durable Objects: Configure MCP agent binding for state persistence
|
||||
|
||||
GITHUB_OAUTH:
|
||||
- GitHub App: Create with callback URL matching your Workers domain
|
||||
- Client credentials: Store as Cloudflare Workers secrets
|
||||
- Callback URL: Must match exactly: https://your-worker.workers.dev/callback
|
||||
|
||||
DATABASE:
|
||||
- PostgreSQL connection: Use existing connection pooling patterns
|
||||
- Environment variable: DATABASE_URL with full connection string
|
||||
- Security: Use validateSqlQuery and isWriteOperation for all SQL
|
||||
|
||||
ENVIRONMENT_VARIABLES:
|
||||
- Development: .dev.vars file for local testing
|
||||
- Production: Cloudflare Workers secrets for deployment
|
||||
- Required: GITHUB_CLIENT_ID, GITHUB_CLIENT_SECRET, DATABASE_URL, COOKIE_ENCRYPTION_KEY
|
||||
|
||||
KV_STORAGE:
|
||||
- OAuth state: Used by OAuth provider for state management
|
||||
- Namespace: Create with `wrangler kv namespace create "OAUTH_KV"`
|
||||
- Configuration: Add namespace ID to wrangler.jsonc bindings
|
||||
```
|
||||
|
||||
## Validation Gate
|
||||
|
||||
### Level 1: TypeScript & Configuration
|
||||
|
||||
```bash
|
||||
# CRITICAL: Run these FIRST - fix any errors before proceeding
|
||||
npm run type-check # TypeScript compilation
|
||||
wrangler types # Generate Cloudflare Workers types
|
||||
|
||||
# Expected: No TypeScript errors
|
||||
# If errors: Fix type issues, missing interfaces, import problems
|
||||
```
|
||||
|
||||
### Level 2: Local Development Testing
|
||||
|
||||
```bash
|
||||
# Start local development server
|
||||
wrangler dev
|
||||
|
||||
# Test OAuth flow (should redirect to GitHub)
|
||||
curl -v http://localhost:8792/authorize
|
||||
|
||||
# Test MCP endpoint (should return server info)
|
||||
curl -v http://localhost:8792/mcp
|
||||
|
||||
# Expected: Server starts, OAuth redirects to GitHub, MCP responds with server info
|
||||
# If errors: Check console output, verify environment variables, fix configuration
|
||||
```
|
||||
|
||||
### Level 3: Unit Tests (test each feature, function, and file, following existing testing patterns where they exist)
|
||||
|
||||
```bash
|
||||
npm run test
|
||||
```
|
||||
|
||||
Run unit tests with the above command (Vitest) to make sure all functionality is working.
|
||||
|
||||
### Level 4: Database Integration Testing (if applicable)
|
||||
|
||||
```bash
|
||||
# Test database connection
|
||||
curl -X POST http://localhost:8792/mcp \
|
||||
-H "Content-Type: application/json" \
|
||||
-d '{"method": "tools/call", "params": {"name": "listTables", "arguments": {}}}'
|
||||
|
||||
# Test permission validation
|
||||
# Test SQL injection protection and other kinds of security if applicable
|
||||
# Test error handling for database failures
|
||||
|
||||
# Expected: Database operations work, permissions enforced, errors handled gracefully, etc.
|
||||
# If errors: Check DATABASE_URL, connection settings, permission logic
|
||||
```
|
||||
|
||||
## Final Validation Checklist
|
||||
|
||||
### Core Functionality
|
||||
|
||||
- [ ] TypeScript compilation: `npm run type-check` passes
|
||||
- [ ] Unit tests pass: `npm run test` passes
|
||||
- [ ] Local server starts: `wrangler dev` runs without errors
|
||||
- [ ] MCP endpoint responds: `curl http://localhost:8792/mcp` returns server info
|
||||
- [ ] OAuth flow works: Authentication redirects and completes successfully
|
||||
|
||||
---
|
||||
|
||||
## Anti-Patterns to Avoid
|
||||
|
||||
### MCP-Specific
|
||||
|
||||
- ❌ Don't skip input validation with Zod - always validate tool parameters
|
||||
- ❌ Don't forget to implement cleanup() method for Durable Objects
|
||||
- ❌ Don't hardcode user permissions - use configurable permission systems
|
||||
|
||||
### Development Process
|
||||
|
||||
- ❌ Don't skip the validation loops - each level catches different issues
|
||||
- ❌ Don't guess about OAuth configuration - test the full flow
|
||||
- ❌ Don't deploy without monitoring - implement logging and error tracking
|
||||
- ❌ Don't ignore TypeScript errors - fix all type issues before deployment
|
||||
265
use-cases/mcp-server/README.md
Normal file
@ -0,0 +1,265 @@
|
||||
# MCP Server Builder - Context Engineering Use Case
|
||||
|
||||
This use case demonstrates how to use **Context Engineering** and the **PRP (Product Requirements Prompt) process** to build production-ready Model Context Protocol (MCP) servers. It provides a proven template and workflow for creating MCP servers with GitHub OAuth authentication, database integration, and Cloudflare Workers deployment.
|
||||
|
||||
> A PRP is PRD + curated codebase intelligence + agent/runbook—the minimum viable packet an AI needs to plausibly ship production-ready code on the first pass.
|
||||
|
||||
## 🎯 What You'll Learn
|
||||
|
||||
This use case teaches you how to:
|
||||
|
||||
- **Use the PRP process** to systematically build complex MCP servers
|
||||
- **Leverage specialized context engineering** for MCP development
|
||||
- **Follow proven patterns** from a production-ready MCP server template
|
||||
- **Implement secure authentication** with GitHub OAuth and role-based access
|
||||
- **Deploy to Cloudflare Workers** with monitoring and error handling
|
||||
|
||||
## 📋 How It Works - The PRP Process for MCP Servers
|
||||
|
||||
### 1. Define Your MCP Server (INITIAL.md)
|
||||
|
||||
Start by describing the exact MCP server you want to build in `PRPs/INITIAL.md`:
|
||||
|
||||
```markdown
|
||||
## FEATURE:
|
||||
We want to create a weather MCP server that provides real-time weather data
|
||||
with caching and rate limiting.
|
||||
|
||||
## ADDITIONAL FEATURES:
|
||||
- Integration with OpenWeatherMap API
|
||||
- Redis caching for performance
|
||||
- Rate limiting per user
|
||||
- Historical weather data access
|
||||
- Location search and autocomplete
|
||||
|
||||
## OTHER CONSIDERATIONS:
|
||||
- API key management for external services
|
||||
- Proper error handling for API failures
|
||||
- Coordinate validation for location queries
|
||||
```
|
||||
|
||||
### 2. Generate Your PRP
|
||||
|
||||
Use the specialized MCP PRP command to create a comprehensive implementation plan:
|
||||
|
||||
```bash
|
||||
/prp-mcp-create INITIAL.md
|
||||
```
|
||||
|
||||
**What this does:**
|
||||
- Reads your feature request
|
||||
- Researches the existing MCP codebase patterns
|
||||
- Studies authentication and database integration patterns
|
||||
- Creates a comprehensive PRP in `PRPs/your-server-name.md`
|
||||
- Includes all context, validation loops, and step-by-step tasks
|
||||
|
||||
> It's important to validate everything after your PRP is generated! With the PRP framework, you are meant to be part of the process and ensure the quality of all context. An execution is only as good as your PRP, so use /prp-mcp-create as a solid starting point.
|
||||
|
||||
### 3. Execute Your PRP
|
||||
|
||||
Use the specialized MCP execution command to build your server:
|
||||
|
||||
```bash
|
||||
/prp-mcp-execute PRPs/your-server-name.md
|
||||
```
|
||||
|
||||
**What this does:**
|
||||
- Loads the complete PRP with all context
|
||||
- Creates a detailed implementation plan using TodoWrite
|
||||
- Implements each component following proven patterns
|
||||
- Runs comprehensive validation (TypeScript, tests, deployment)
|
||||
- Ensures your MCP server works end-to-end
|
||||
|
||||
## 🏗️ MCP-Specific Context Engineering
|
||||
|
||||
This use case includes specialized context engineering components designed specifically for MCP server development:
|
||||
|
||||
### Specialized Slash Commands
|
||||
|
||||
Located in `.claude/commands/`:
|
||||
|
||||
- **`/prp-mcp-create`** - Generates PRPs specifically for MCP servers
|
||||
- **`/prp-mcp-execute`** - Executes MCP PRPs with comprehensive validation
|
||||
|
||||
These are specialized versions of the generic commands in the root `.claude/commands/`, but tailored for MCP development patterns.
|
||||
|
||||
### Specialized PRP Template
|
||||
|
||||
The template `PRPs/templates/prp_mcp_base.md` includes:
|
||||
|
||||
- **MCP-specific patterns** for tool registration and authentication
|
||||
- **Cloudflare Workers configuration** for deployment
|
||||
- **GitHub OAuth integration** patterns
|
||||
- **Database security** and SQL injection protection
|
||||
- **Comprehensive validation loops** from TypeScript to production
|
||||
|
||||
### AI Documentation
|
||||
|
||||
The `PRPs/ai_docs/` folder contains:
|
||||
|
||||
- **`mcp_patterns.md`** - Core MCP development patterns and security practices
|
||||
- **`claude_api_usage.md`** - How to integrate with Anthropic's API for LLM-powered features
|
||||
|
||||
## 🔧 Template Architecture
|
||||
|
||||
This template provides a complete, production-ready MCP server with:
|
||||
|
||||
### Core Components
|
||||
|
||||
```
|
||||
src/
|
||||
├── index.ts # Main authenticated MCP server
|
||||
├── index_sentry.ts # Version with Sentry monitoring
|
||||
├── simple-math.ts # Basic MCP example (no auth)
|
||||
├── github-handler.ts # Complete GitHub OAuth implementation
|
||||
├── database.ts # PostgreSQL with security patterns
|
||||
├── utils.ts # OAuth helpers and utilities
|
||||
├── workers-oauth-utils.ts # HMAC-signed cookie system
|
||||
└── tools/ # Modular tool registration system
|
||||
└── register-tools.ts # Central tool registry
|
||||
```
|
||||
|
||||
### Example Tools
|
||||
|
||||
The `examples/` folder shows how to create MCP tools:
|
||||
|
||||
- **`database-tools.ts`** - Example database tools with proper patterns
|
||||
- **`database-tools-sentry.ts`** - Same tools with Sentry monitoring
|
||||
|
||||
### Key Features
|
||||
|
||||
- **🔐 GitHub OAuth** - Complete authentication flow with role-based access
|
||||
- **🗄️ Database Integration** - PostgreSQL with connection pooling and security
|
||||
- **🛠️ Modular Tools** - Clean separation of concerns with central registration
|
||||
- **☁️ Cloudflare Workers** - Global edge deployment with Durable Objects
|
||||
- **📊 Monitoring** - Optional Sentry integration for production
|
||||
- **🧪 Testing** - Comprehensive validation from TypeScript to deployment
|
||||
|
||||
## 🚀 Quick Start
|
||||
|
||||
### Prerequisites
|
||||
|
||||
- Node.js and npm installed
|
||||
- Cloudflare account (free tier works)
|
||||
- GitHub account for OAuth
|
||||
- PostgreSQL database (local or hosted)
|
||||
|
||||
### Step 1: Clone and Setup
|
||||
|
||||
```bash
|
||||
# Clone the context engineering template
|
||||
git clone https://github.com/coleam00/Context-Engineering-Intro.git
|
||||
cd Context-Engineering-Intro/use-cases/mcp-server
|
||||
|
||||
# Install dependencies
|
||||
npm install
|
||||
|
||||
# Install Wrangler CLI globally
|
||||
npm install -g wrangler
|
||||
|
||||
# Authenticate with Cloudflare
|
||||
wrangler login
|
||||
```
|
||||
|
||||
### Step 2: Configure Environment
|
||||
|
||||
```bash
|
||||
# Create environment file
|
||||
cp .dev.vars.example .dev.vars
|
||||
|
||||
# Edit .dev.vars with your credentials
|
||||
# - GitHub OAuth app credentials
|
||||
# - Database connection string
|
||||
# - Cookie encryption key
|
||||
```
|
||||
|
||||
### Step 3: Define Your MCP Server
|
||||
|
||||
Edit `PRPs/INITIAL.md` to describe your specific MCP server requirements:
|
||||
|
||||
```markdown
|
||||
## FEATURE:
|
||||
Describe exactly what your MCP server should do - be specific about
|
||||
functionality, data sources, and user interactions.
|
||||
|
||||
## ADDITIONAL FEATURES:
|
||||
- List specific features beyond basic CRUD operations
|
||||
- Include integrations with external APIs
|
||||
- Mention any special requirements
|
||||
|
||||
## OTHER CONSIDERATIONS:
|
||||
- Authentication requirements
|
||||
- Performance considerations
|
||||
- Security requirements
|
||||
- Rate limiting needs
|
||||
```
|
||||
|
||||
### Step 4: Generate and Execute PRP
|
||||
|
||||
```bash
|
||||
# Generate comprehensive PRP
|
||||
/prp-mcp-create INITIAL.md
|
||||
|
||||
# Execute the PRP to build your server
|
||||
/prp-mcp-execute PRPs/your-server-name.md
|
||||
```
|
||||
|
||||
### Step 5: Test and Deploy
|
||||
|
||||
```bash
|
||||
# Test locally
|
||||
wrangler dev
|
||||
|
||||
# Test with MCP Inspector
|
||||
npx @modelcontextprotocol/inspector@latest
|
||||
# Connect to: http://localhost:8792/mcp
|
||||
|
||||
# Deploy to production
|
||||
wrangler deploy
|
||||
```
|
||||
|
||||
## 🔍 Key Files to Understand
|
||||
|
||||
To fully understand this use case, examine these files:
|
||||
|
||||
### Context Engineering Components
|
||||
|
||||
- **`PRPs/templates/prp_mcp_base.md`** - Specialized MCP PRP template
|
||||
- **`.claude/commands/prp-mcp-create.md`** - MCP-specific PRP generation
|
||||
- **`.claude/commands/prp-mcp-execute.md`** - MCP-specific execution
|
||||
|
||||
### Implementation Patterns
|
||||
|
||||
- **`src/index.ts`** - Complete MCP server with authentication
|
||||
- **`examples/database-tools.ts`** - Tool creation and registration patterns
|
||||
- **`src/tools/register-tools.ts`** - Modular tool registration system
|
||||
|
||||
### Configuration & Deployment
|
||||
|
||||
- **`wrangler.jsonc`** - Cloudflare Workers configuration
|
||||
- **`.dev.vars.example`** - Environment variable template
|
||||
- **`CLAUDE.md`** - Implementation guidelines and patterns
|
||||
|
||||
## 📈 Success Metrics
|
||||
|
||||
When you successfully use this process, you'll achieve:
|
||||
|
||||
- **Fast Implementation** - Quickly have an MCP Server with minimal iterations
|
||||
- **Production Ready** - Secure authentication, monitoring, and error handling
|
||||
- **Scalable Architecture** - Clean separation of concerns and modular design
|
||||
- **Comprehensive Testing** - Validation from TypeScript to production deployment
|
||||
|
||||
## 🤝 Contributing
|
||||
|
||||
This use case demonstrates the power of Context Engineering for complex software development. To improve it:
|
||||
|
||||
1. **Add new MCP server examples** to show different patterns
|
||||
2. **Enhance the PRP templates** with more comprehensive context
|
||||
3. **Improve validation loops** for better error detection
|
||||
4. **Document edge cases** and common pitfalls
|
||||
|
||||
The goal is to make MCP server development predictable and successful through comprehensive context engineering.
|
||||
|
||||
---
|
||||
|
||||
**Ready to build your MCP server?** Start by editing `PRPs/INITIAL.md` and run `/prp-mcp-create INITIAL.md` to generate your comprehensive implementation plan.
|
||||
235
use-cases/mcp-server/examples/database-tools-sentry.ts
Normal file
@ -0,0 +1,235 @@
|
||||
import * as Sentry from "@sentry/cloudflare";
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import {
|
||||
Props,
|
||||
ListTablesSchema,
|
||||
QueryDatabaseSchema,
|
||||
ExecuteDatabaseSchema,
|
||||
createErrorResponse,
|
||||
createSuccessResponse
|
||||
} from "../types";
|
||||
import { validateSqlQuery, isWriteOperation, formatDatabaseError } from "../database/security";
|
||||
import { withDatabase } from "../database/utils";
|
||||
|
||||
const ALLOWED_USERNAMES = new Set<string>([
|
||||
// Add GitHub usernames of users who should have access to database write operations
|
||||
// For example: 'yourusername', 'coworkerusername'
|
||||
'coleam00'
|
||||
]);
|
||||
|
||||
// Error handling helper for MCP tools with Sentry
|
||||
function handleError(error: unknown): { content: Array<{ type: "text"; text: string; isError?: boolean }> } {
|
||||
const eventId = Sentry.captureException(error);
|
||||
|
||||
const errorMessage = [
|
||||
"**Error**",
|
||||
"There was a problem with your request.",
|
||||
"Please report the following to the support team:",
|
||||
`**Event ID**: ${eventId}`,
|
||||
process.env.NODE_ENV !== "production"
|
||||
? error instanceof Error
|
||||
? error.message
|
||||
: String(error)
|
||||
: "",
|
||||
].join("\n\n");
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: errorMessage,
|
||||
isError: true,
|
||||
},
|
||||
],
|
||||
};
|
||||
}
|
||||
|
||||
export function registerDatabaseToolsWithSentry(server: McpServer, env: Env, props: Props) {
|
||||
// Tool 1: List Tables - Available to all authenticated users
|
||||
server.tool(
|
||||
"listTables",
|
||||
"Get a list of all tables in the database along with their column information. Use this first to understand the database structure before querying.",
|
||||
ListTablesSchema,
|
||||
async () => {
|
||||
return await Sentry.startNewTrace(async () => {
|
||||
return await Sentry.startSpan({
|
||||
name: "mcp.tool/listTables",
|
||||
attributes: {
|
||||
'mcp.tool.name': 'listTables',
|
||||
'mcp.user.login': props.login,
|
||||
},
|
||||
}, async (span) => {
|
||||
// Set user context
|
||||
Sentry.setUser({
|
||||
username: props.login,
|
||||
email: props.email,
|
||||
});
|
||||
|
||||
try {
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
// Single query to get all table and column information (using your working query)
|
||||
const columns = await db.unsafe(`
|
||||
SELECT
|
||||
table_name,
|
||||
column_name,
|
||||
data_type,
|
||||
is_nullable,
|
||||
column_default
|
||||
FROM information_schema.columns
|
||||
WHERE table_schema = 'public'
|
||||
ORDER BY table_name, ordinal_position
|
||||
`);
|
||||
|
||||
// Group columns by table
|
||||
const tableMap = new Map();
|
||||
for (const col of columns) {
|
||||
// Use snake_case property names as returned by the SQL query
|
||||
if (!tableMap.has(col.table_name)) {
|
||||
tableMap.set(col.table_name, {
|
||||
name: col.table_name,
|
||||
schema: 'public',
|
||||
columns: []
|
||||
});
|
||||
}
|
||||
tableMap.get(col.table_name).columns.push({
|
||||
name: col.column_name,
|
||||
type: col.data_type,
|
||||
nullable: col.is_nullable === 'YES',
|
||||
default: col.column_default
|
||||
});
|
||||
}
|
||||
|
||||
const tableInfo = Array.from(tableMap.values());
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Database Tables and Schema**\n\n${JSON.stringify(tableInfo, null, 2)}\n\n**Total tables found:** ${tableInfo.length}\n\n**Note:** Use the \`queryDatabase\` tool to run SELECT queries, or \`executeDatabase\` tool for write operations (if you have write access).`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('listTables error:', error);
|
||||
span.setStatus({ code: 2 }); // error
|
||||
return handleError(error);
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 2: Query Database - Available to all authenticated users (read-only)
|
||||
server.tool(
|
||||
"queryDatabase",
|
||||
"Execute a read-only SQL query against the PostgreSQL database. This tool only allows SELECT statements and other read operations. All authenticated users can use this tool.",
|
||||
QueryDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
return await Sentry.startNewTrace(async () => {
|
||||
return await Sentry.startSpan({
|
||||
name: "mcp.tool/queryDatabase",
|
||||
attributes: {
|
||||
'mcp.tool.name': 'queryDatabase',
|
||||
'mcp.user.login': props.login,
|
||||
'mcp.sql.query': sql.substring(0, 100), // Truncate for security
|
||||
},
|
||||
}, async (span) => {
|
||||
// Set user context
|
||||
Sentry.setUser({
|
||||
username: props.login,
|
||||
email: props.email,
|
||||
});
|
||||
|
||||
try {
|
||||
// Validate the SQL query
|
||||
const validation = validateSqlQuery(sql);
|
||||
if (!validation.isValid) {
|
||||
return createErrorResponse(`Invalid SQL query: ${validation.error}`);
|
||||
}
|
||||
|
||||
// Check if it's a write operation
|
||||
if (isWriteOperation(sql)) {
|
||||
return createErrorResponse(
|
||||
"Write operations are not allowed with this tool. Use the `executeDatabase` tool if you have write permissions (requires special GitHub username access)."
|
||||
);
|
||||
}
|
||||
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
const results = await db.unsafe(sql);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Query Results**\n\`\`\`sql\n${sql}\n\`\`\`\n\n**Results:**\n\`\`\`json\n${JSON.stringify(results, null, 2)}\n\`\`\`\n\n**Rows returned:** ${Array.isArray(results) ? results.length : 1}`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('queryDatabase error:', error);
|
||||
span.setStatus({ code: 2 }); // error
|
||||
return handleError(error);
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 3: Execute Database - Only available to privileged users (write operations)
|
||||
if (ALLOWED_USERNAMES.has(props.login)) {
|
||||
server.tool(
|
||||
"executeDatabase",
|
||||
"Execute any SQL statement against the PostgreSQL database, including INSERT, UPDATE, DELETE, and DDL operations. This tool is restricted to specific GitHub users and can perform write transactions. **USE WITH CAUTION** - this can modify or delete data.",
|
||||
ExecuteDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
return await Sentry.startNewTrace(async () => {
|
||||
return await Sentry.startSpan({
|
||||
name: "mcp.tool/executeDatabase",
|
||||
attributes: {
|
||||
'mcp.tool.name': 'executeDatabase',
|
||||
'mcp.user.login': props.login,
|
||||
'mcp.sql.query': sql.substring(0, 100), // Truncate for security
|
||||
'mcp.sql.is_write': isWriteOperation(sql),
|
||||
},
|
||||
}, async (span) => {
|
||||
// Set user context
|
||||
Sentry.setUser({
|
||||
username: props.login,
|
||||
email: props.email,
|
||||
});
|
||||
|
||||
try {
|
||||
// Validate the SQL query
|
||||
const validation = validateSqlQuery(sql);
|
||||
if (!validation.isValid) {
|
||||
return createErrorResponse(`Invalid SQL statement: ${validation.error}`);
|
||||
}
|
||||
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
const results = await db.unsafe(sql);
|
||||
|
||||
const isWrite = isWriteOperation(sql);
|
||||
const operationType = isWrite ? "Write Operation" : "Read Operation";
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**${operationType} Executed Successfully**\n\`\`\`sql\n${sql}\n\`\`\`\n\n**Results:**\n\`\`\`json\n${JSON.stringify(results, null, 2)}\n\`\`\`\n\n${isWrite ? '**⚠️ Database was modified**' : `**Rows returned:** ${Array.isArray(results) ? results.length : 1}`}\n\n**Executed by:** ${props.login} (${props.name})`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('executeDatabase error:', error);
|
||||
span.setStatus({ code: 2 }); // error
|
||||
return handleError(error);
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
155
use-cases/mcp-server/examples/database-tools.ts
Normal file
@ -0,0 +1,155 @@
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import {
|
||||
Props,
|
||||
ListTablesSchema,
|
||||
QueryDatabaseSchema,
|
||||
ExecuteDatabaseSchema,
|
||||
createErrorResponse,
|
||||
createSuccessResponse
|
||||
} from "../types";
|
||||
import { validateSqlQuery, isWriteOperation, formatDatabaseError } from "../database/security";
|
||||
import { withDatabase } from "../database/utils";
|
||||
|
||||
const ALLOWED_USERNAMES = new Set<string>([
|
||||
// Add GitHub usernames of users who should have access to database write operations
|
||||
// For example: 'yourusername', 'coworkerusername'
|
||||
'coleam00'
|
||||
]);
|
||||
|
||||
export function registerDatabaseTools(server: McpServer, env: Env, props: Props) {
|
||||
// Tool 1: List Tables - Available to all authenticated users
|
||||
server.tool(
|
||||
"listTables",
|
||||
"Get a list of all tables in the database along with their column information. Use this first to understand the database structure before querying.",
|
||||
ListTablesSchema,
|
||||
async () => {
|
||||
try {
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
// Single query to get all table and column information (using your working query)
|
||||
const columns = await db.unsafe(`
|
||||
SELECT
|
||||
table_name,
|
||||
column_name,
|
||||
data_type,
|
||||
is_nullable,
|
||||
column_default
|
||||
FROM information_schema.columns
|
||||
WHERE table_schema = 'public'
|
||||
ORDER BY table_name, ordinal_position
|
||||
`);
|
||||
|
||||
// Group columns by table
|
||||
const tableMap = new Map();
|
||||
for (const col of columns) {
|
||||
// Use snake_case property names as returned by the SQL query
|
||||
if (!tableMap.has(col.table_name)) {
|
||||
tableMap.set(col.table_name, {
|
||||
name: col.table_name,
|
||||
schema: 'public',
|
||||
columns: []
|
||||
});
|
||||
}
|
||||
tableMap.get(col.table_name).columns.push({
|
||||
name: col.column_name,
|
||||
type: col.data_type,
|
||||
nullable: col.is_nullable === 'YES',
|
||||
default: col.column_default
|
||||
});
|
||||
}
|
||||
|
||||
const tableInfo = Array.from(tableMap.values());
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Database Tables and Schema**\n\n${JSON.stringify(tableInfo, null, 2)}\n\n**Total tables found:** ${tableInfo.length}\n\n**Note:** Use the \`queryDatabase\` tool to run SELECT queries, or \`executeDatabase\` tool for write operations (if you have write access).`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('listTables error:', error);
|
||||
return createErrorResponse(
|
||||
`Error retrieving database schema: ${formatDatabaseError(error)}`
|
||||
);
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 2: Query Database - Available to all authenticated users (read-only)
|
||||
server.tool(
|
||||
"queryDatabase",
|
||||
"Execute a read-only SQL query against the PostgreSQL database. This tool only allows SELECT statements and other read operations. All authenticated users can use this tool.",
|
||||
QueryDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
try {
|
||||
// Validate the SQL query
|
||||
const validation = validateSqlQuery(sql);
|
||||
if (!validation.isValid) {
|
||||
return createErrorResponse(`Invalid SQL query: ${validation.error}`);
|
||||
}
|
||||
|
||||
// Check if it's a write operation
|
||||
if (isWriteOperation(sql)) {
|
||||
return createErrorResponse(
|
||||
"Write operations are not allowed with this tool. Use the `executeDatabase` tool if you have write permissions (requires special GitHub username access)."
|
||||
);
|
||||
}
|
||||
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
const results = await db.unsafe(sql);
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**Query Results**\n\`\`\`sql\n${sql}\n\`\`\`\n\n**Results:**\n\`\`\`json\n${JSON.stringify(results, null, 2)}\n\`\`\`\n\n**Rows returned:** ${Array.isArray(results) ? results.length : 1}`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('queryDatabase error:', error);
|
||||
return createErrorResponse(`Database query error: ${formatDatabaseError(error)}`);
|
||||
}
|
||||
}
|
||||
);
|
||||
|
||||
// Tool 3: Execute Database - Only available to privileged users (write operations)
|
||||
if (ALLOWED_USERNAMES.has(props.login)) {
|
||||
server.tool(
|
||||
"executeDatabase",
|
||||
"Execute any SQL statement against the PostgreSQL database, including INSERT, UPDATE, DELETE, and DDL operations. This tool is restricted to specific GitHub users and can perform write transactions. **USE WITH CAUTION** - this can modify or delete data.",
|
||||
ExecuteDatabaseSchema,
|
||||
async ({ sql }) => {
|
||||
try {
|
||||
// Validate the SQL query
|
||||
const validation = validateSqlQuery(sql);
|
||||
if (!validation.isValid) {
|
||||
return createErrorResponse(`Invalid SQL statement: ${validation.error}`);
|
||||
}
|
||||
|
||||
return await withDatabase((env as any).DATABASE_URL, async (db) => {
|
||||
const results = await db.unsafe(sql);
|
||||
|
||||
const isWrite = isWriteOperation(sql);
|
||||
const operationType = isWrite ? "Write Operation" : "Read Operation";
|
||||
|
||||
return {
|
||||
content: [
|
||||
{
|
||||
type: "text",
|
||||
text: `**${operationType} Executed Successfully**\n\`\`\`sql\n${sql}\n\`\`\`\n\n**Results:**\n\`\`\`json\n${JSON.stringify(results, null, 2)}\n\`\`\`\n\n${isWrite ? '**⚠️ Database was modified**' : `**Rows returned:** ${Array.isArray(results) ? results.length : 1}`}\n\n**Executed by:** ${props.login} (${props.name})`
|
||||
}
|
||||
]
|
||||
};
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('executeDatabase error:', error);
|
||||
return createErrorResponse(`Database execution error: ${formatDatabaseError(error)}`);
|
||||
}
|
||||
}
|
||||
);
|
||||
}
|
||||
}
|
||||
5859
use-cases/mcp-server/package-lock.json
generated
Normal file
File diff suppressed because it is too large
38
use-cases/mcp-server/package.json
Normal file
@ -0,0 +1,38 @@
|
||||
{
|
||||
"name": "remote-mcp-github-oauth",
|
||||
"version": "0.0.1",
|
||||
"private": true,
|
||||
"type": "module",
|
||||
"scripts": {
|
||||
"deploy": "wrangler deploy",
|
||||
"dev": "wrangler dev",
|
||||
"start": "wrangler dev",
|
||||
"cf-typegen": "wrangler types",
|
||||
"type-check": "tsc --noEmit",
|
||||
"test": "vitest",
|
||||
"test:ui": "vitest --ui",
|
||||
"test:run": "vitest run"
|
||||
},
|
||||
"dependencies": {
|
||||
"@cloudflare/workers-oauth-provider": "^0.0.5",
|
||||
"@modelcontextprotocol/sdk": "1.13.1",
|
||||
"@sentry/cloudflare": "^9.16.0",
|
||||
"agents": "^0.0.100",
|
||||
"hono": "^4.8.3",
|
||||
"just-pick": "^4.2.0",
|
||||
"octokit": "^5.0.3",
|
||||
"postgres": "^3.4.5",
|
||||
"workers-mcp": "^0.0.13",
|
||||
"zod": "^3.25.67"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@cloudflare/vitest-pool-workers": "^0.8.53",
|
||||
"@types/node": "^24.0.10",
|
||||
"@vitest/ui": "^3.2.4",
|
||||
"prettier": "^3.6.2",
|
||||
"typescript": "^5.8.3",
|
||||
"vi-fetch": "^0.8.0",
|
||||
"vitest": "^3.2.4",
|
||||
"wrangler": "^4.23.0"
|
||||
}
|
||||
}
|
||||
119
use-cases/mcp-server/src/auth/github-handler.ts
Normal file
@ -0,0 +1,119 @@
|
||||
// import { env } from "cloudflare:workers";
|
||||
import type { AuthRequest } from "@cloudflare/workers-oauth-provider";
|
||||
import { Hono } from "hono";
|
||||
import { Octokit } from "octokit";
|
||||
import type { Props, ExtendedEnv } from "../types";
|
||||
import {
|
||||
clientIdAlreadyApproved,
|
||||
parseRedirectApproval,
|
||||
renderApprovalDialog,
|
||||
fetchUpstreamAuthToken,
|
||||
getUpstreamAuthorizeUrl,
|
||||
} from "./oauth-utils";
|
||||
const app = new Hono<{ Bindings: ExtendedEnv }>();
|
||||
|
||||
app.get("/authorize", async (c) => {
|
||||
const oauthReqInfo = await c.env.OAUTH_PROVIDER.parseAuthRequest(c.req.raw);
|
||||
const { clientId } = oauthReqInfo;
|
||||
if (!clientId) {
|
||||
return c.text("Invalid request", 400);
|
||||
}
|
||||
|
||||
if (
|
||||
await clientIdAlreadyApproved(c.req.raw, oauthReqInfo.clientId, (c.env as any).COOKIE_ENCRYPTION_KEY)
|
||||
) {
|
||||
return redirectToGithub(c.req.raw, oauthReqInfo, c.env, {});
|
||||
}
|
||||
|
||||
return renderApprovalDialog(c.req.raw, {
|
||||
client: await c.env.OAUTH_PROVIDER.lookupClient(clientId),
|
||||
server: {
|
||||
description: "This is a demo MCP Remote Server using GitHub for authentication.",
|
||||
logo: "https://avatars.githubusercontent.com/u/314135?s=200&v=4",
|
||||
name: "Cloudflare GitHub MCP Server", // optional
|
||||
},
|
||||
state: { oauthReqInfo }, // arbitrary data that flows through the form submission below
|
||||
});
|
||||
});
|
||||
|
||||
app.post("/authorize", async (c) => {
|
||||
// Validates form submission, extracts state, and generates Set-Cookie headers to skip approval dialog next time
|
||||
const { state, headers } = await parseRedirectApproval(c.req.raw, (c.env as any).COOKIE_ENCRYPTION_KEY);
|
||||
if (!state.oauthReqInfo) {
|
||||
return c.text("Invalid request", 400);
|
||||
}
|
||||
|
||||
return redirectToGithub(c.req.raw, state.oauthReqInfo, c.env, headers);
|
||||
});
|
||||
|
||||
async function redirectToGithub(
|
||||
request: Request,
|
||||
oauthReqInfo: AuthRequest,
|
||||
env: Env,
|
||||
headers: Record<string, string> = {},
|
||||
) {
|
||||
return new Response(null, {
|
||||
headers: {
|
||||
...headers,
|
||||
location: getUpstreamAuthorizeUrl({
|
||||
client_id: (env as any).GITHUB_CLIENT_ID,
|
||||
redirect_uri: new URL("/callback", request.url).href,
|
||||
scope: "read:user",
|
||||
state: btoa(JSON.stringify(oauthReqInfo)),
|
||||
upstream_url: "https://github.com/login/oauth/authorize",
|
||||
}),
|
||||
},
|
||||
status: 302,
|
||||
});
|
||||
}
|
||||
|
||||
/**
|
||||
* OAuth Callback Endpoint
|
||||
*
|
||||
* This route handles the callback from GitHub after user authentication.
|
||||
* It exchanges the temporary code for an access token, then stores some
|
||||
* user metadata & the auth token as part of the 'props' on the token passed
|
||||
* down to the client. It ends by redirecting the client back to _its_ callback URL
|
||||
*/
|
||||
app.get("/callback", async (c) => {
|
||||
// Recover the oauthReqInfo from the state query parameter
|
||||
const oauthReqInfo = JSON.parse(atob(c.req.query("state") as string)) as AuthRequest;
|
||||
if (!oauthReqInfo.clientId) {
|
||||
return c.text("Invalid state", 400);
|
||||
}
|
||||
|
||||
// Exchange the code for an access token
|
||||
const [accessToken, errResponse] = await fetchUpstreamAuthToken({
|
||||
client_id: (c.env as any).GITHUB_CLIENT_ID,
|
||||
client_secret: (c.env as any).GITHUB_CLIENT_SECRET,
|
||||
code: c.req.query("code"),
|
||||
redirect_uri: new URL("/callback", c.req.url).href,
|
||||
upstream_url: "https://github.com/login/oauth/access_token",
|
||||
});
|
||||
if (errResponse) return errResponse;
|
||||
|
||||
// Fetch the user info from GitHub
|
||||
const user = await new Octokit({ auth: accessToken }).rest.users.getAuthenticated();
|
||||
const { login, name, email } = user.data;
|
||||
|
||||
// Return back to the MCP client a new token
|
||||
const { redirectTo } = await c.env.OAUTH_PROVIDER.completeAuthorization({
|
||||
metadata: {
|
||||
label: name,
|
||||
},
|
||||
// This will be available on this.props inside MyMCP
|
||||
props: {
|
||||
accessToken,
|
||||
email,
|
||||
login,
|
||||
name,
|
||||
} as Props,
|
||||
request: oauthReqInfo,
|
||||
scope: oauthReqInfo.scope,
|
||||
userId: login,
|
||||
});
|
||||
|
||||
return Response.redirect(redirectTo);
|
||||
});
|
||||
|
||||
export { app as GitHubHandler };
|
||||
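The only thing that carries the original MCP authorization request across the round trip to GitHub is the `state` query parameter. A minimal sketch of that round trip (standalone and illustrative; in the handler above the encode happens inside `redirectToGithub` and the decode inside `/callback`):

import type { AuthRequest } from "@cloudflare/workers-oauth-provider";

// The original MCP authorization request survives the GitHub redirect only via `state`.
function packState(oauthReqInfo: AuthRequest): string {
  // what redirectToGithub puts into the upstream authorize URL
  return btoa(JSON.stringify(oauthReqInfo));
}

function unpackState(state: string): AuthRequest {
  // what /callback does with c.req.query("state")
  return JSON.parse(atob(state)) as AuthRequest;
}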
662  use-cases/mcp-server/src/auth/oauth-utils.ts  Normal file
@@ -0,0 +1,662 @@
|
||||
// OAuth utilities for cookie-based approval and upstream OAuth flows
|
||||
|
||||
import type {
|
||||
AuthRequest,
|
||||
ClientInfo,
|
||||
ApprovalDialogOptions,
|
||||
ParsedApprovalResult,
|
||||
UpstreamAuthorizeParams,
|
||||
UpstreamTokenParams
|
||||
} from "../types";
|
||||
|
||||
const COOKIE_NAME = "mcp-approved-clients";
|
||||
const ONE_YEAR_IN_SECONDS = 31536000;
|
||||
|
||||
// --- Helper Functions ---
|
||||
|
||||
/**
|
||||
* Encodes arbitrary data to a URL-safe base64 string.
|
||||
* @param data - The data to encode (will be stringified).
|
||||
* @returns A URL-safe base64 encoded string.
|
||||
*/
|
||||
function _encodeState(data: any): string {
|
||||
try {
|
||||
const jsonString = JSON.stringify(data);
|
||||
// Use btoa for simplicity, assuming Worker environment supports it well enough
|
||||
// For complex binary data, a Buffer/Uint8Array approach might be better
|
||||
return btoa(jsonString);
|
||||
} catch (e) {
|
||||
console.error("Error encoding state:", e);
|
||||
throw new Error("Could not encode state");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Decodes a URL-safe base64 string back to its original data.
|
||||
* @param encoded - The URL-safe base64 encoded string.
|
||||
* @returns The original data.
|
||||
*/
|
||||
function decodeState<T = any>(encoded: string): T {
|
||||
try {
|
||||
const jsonString = atob(encoded);
|
||||
return JSON.parse(jsonString);
|
||||
} catch (e) {
|
||||
console.error("Error decoding state:", e);
|
||||
throw new Error("Could not decode state");
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Imports a secret key string for HMAC-SHA256 signing.
|
||||
* @param secret - The raw secret key string.
|
||||
* @returns A promise resolving to the CryptoKey object.
|
||||
*/
|
||||
async function importKey(secret: string): Promise<CryptoKey> {
|
||||
if (!secret) {
|
||||
throw new Error(
|
||||
"COOKIE_SECRET is not defined. A secret key is required for signing cookies.",
|
||||
);
|
||||
}
|
||||
const enc = new TextEncoder();
|
||||
return crypto.subtle.importKey(
|
||||
"raw",
|
||||
enc.encode(secret),
|
||||
{ hash: "SHA-256", name: "HMAC" },
|
||||
false, // not extractable
|
||||
["sign", "verify"], // key usages
|
||||
);
|
||||
}
|
||||
|
||||
/**
|
||||
* Signs data using HMAC-SHA256.
|
||||
* @param key - The CryptoKey for signing.
|
||||
* @param data - The string data to sign.
|
||||
* @returns A promise resolving to the signature as a hex string.
|
||||
*/
|
||||
async function signData(key: CryptoKey, data: string): Promise<string> {
|
||||
const enc = new TextEncoder();
|
||||
const signatureBuffer = await crypto.subtle.sign("HMAC", key, enc.encode(data));
|
||||
// Convert ArrayBuffer to hex string
|
||||
return Array.from(new Uint8Array(signatureBuffer))
|
||||
.map((b) => b.toString(16).padStart(2, "0"))
|
||||
.join("");
|
||||
}
|
||||
|
||||
/**
|
||||
* Verifies an HMAC-SHA256 signature.
|
||||
* @param key - The CryptoKey for verification.
|
||||
* @param signatureHex - The signature to verify (hex string).
|
||||
* @param data - The original data that was signed.
|
||||
* @returns A promise resolving to true if the signature is valid, false otherwise.
|
||||
*/
|
||||
async function verifySignature(
|
||||
key: CryptoKey,
|
||||
signatureHex: string,
|
||||
data: string,
|
||||
): Promise<boolean> {
|
||||
const enc = new TextEncoder();
|
||||
try {
|
||||
// Convert hex signature back to ArrayBuffer
|
||||
const signatureBytes = new Uint8Array(
|
||||
signatureHex.match(/.{1,2}/g)!.map((byte) => Number.parseInt(byte, 16)),
|
||||
);
|
||||
return await crypto.subtle.verify("HMAC", key, signatureBytes.buffer, enc.encode(data));
|
||||
} catch (e) {
|
||||
// Handle errors during hex parsing or verification
|
||||
console.error("Error verifying signature:", e);
|
||||
return false;
|
||||
}
|
||||
}
|
||||
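Taken together, `importKey`, `signData`, and `verifySignature` implement the `hex(HMAC-SHA256(payload)) + "." + base64(payload)` cookie format used by the approval-cookie helpers below. A sketch of the round trip, assuming these module-private helpers are exported or copied next to the snippet; the secret is a placeholder:

// Sketch: build and verify the approval-cookie value (placeholder secret).
async function demoCookieRoundTrip(): Promise<boolean> {
  const secret = "example-cookie-secret"; // placeholder, not a real key
  const payload = JSON.stringify(["mcp-client-123"]); // list of approved client IDs

  const key = await importKey(secret);
  const signatureHex = await signData(key, payload);

  // This is the value stored in the `mcp-approved-clients` cookie
  const cookieValue = `${signatureHex}.${btoa(payload)}`;

  // Verification splits the value and checks the HMAC before trusting the payload
  const [sig, encoded] = cookieValue.split(".");
  return verifySignature(key, sig, atob(encoded)); // resolves to true
}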
|
||||
/**
|
||||
* Parses the signed cookie and verifies its integrity.
|
||||
* @param cookieHeader - The value of the Cookie header from the request.
|
||||
* @param secret - The secret key used for signing.
|
||||
* @returns A promise resolving to the list of approved client IDs if the cookie is valid, otherwise null.
|
||||
*/
|
||||
async function getApprovedClientsFromCookie(
|
||||
cookieHeader: string | null,
|
||||
secret: string,
|
||||
): Promise<string[] | null> {
|
||||
if (!cookieHeader) return null;
|
||||
|
||||
const cookies = cookieHeader.split(";").map((c) => c.trim());
|
||||
const targetCookie = cookies.find((c) => c.startsWith(`${COOKIE_NAME}=`));
|
||||
|
||||
if (!targetCookie) return null;
|
||||
|
||||
const cookieValue = targetCookie.substring(COOKIE_NAME.length + 1);
|
||||
const parts = cookieValue.split(".");
|
||||
|
||||
if (parts.length !== 2) {
|
||||
console.warn("Invalid cookie format received.");
|
||||
return null; // Invalid format
|
||||
}
|
||||
|
||||
const [signatureHex, base64Payload] = parts;
|
||||
const payload = atob(base64Payload); // Assuming payload is base64 encoded JSON string
|
||||
|
||||
const key = await importKey(secret);
|
||||
const isValid = await verifySignature(key, signatureHex, payload);
|
||||
|
||||
if (!isValid) {
|
||||
console.warn("Cookie signature verification failed.");
|
||||
return null; // Signature invalid
|
||||
}
|
||||
|
||||
try {
|
||||
const approvedClients = JSON.parse(payload);
|
||||
if (!Array.isArray(approvedClients)) {
|
||||
console.warn("Cookie payload is not an array.");
|
||||
return null; // Payload isn't an array
|
||||
}
|
||||
// Ensure all elements are strings
|
||||
if (!approvedClients.every((item) => typeof item === "string")) {
|
||||
console.warn("Cookie payload contains non-string elements.");
|
||||
return null;
|
||||
}
|
||||
return approvedClients as string[];
|
||||
} catch (e) {
|
||||
console.error("Error parsing cookie payload:", e);
|
||||
return null; // JSON parsing failed
|
||||
}
|
||||
}
|
||||
|
||||
// --- Exported Functions ---
|
||||
|
||||
/**
|
||||
* Checks if a given client ID has already been approved by the user,
|
||||
* based on a signed cookie.
|
||||
*
|
||||
* @param request - The incoming Request object to read cookies from.
|
||||
* @param clientId - The OAuth client ID to check approval for.
|
||||
* @param cookieSecret - The secret key used to sign/verify the approval cookie.
|
||||
* @returns A promise resolving to true if the client ID is in the list of approved clients in a valid cookie, false otherwise.
|
||||
*/
|
||||
export async function clientIdAlreadyApproved(
|
||||
request: Request,
|
||||
clientId: string,
|
||||
cookieSecret: string,
|
||||
): Promise<boolean> {
|
||||
if (!clientId) return false;
|
||||
const cookieHeader = request.headers.get("Cookie");
|
||||
const approvedClients = await getApprovedClientsFromCookie(cookieHeader, cookieSecret);
|
||||
|
||||
return approvedClients?.includes(clientId) ?? false;
|
||||
}
|
||||
|
||||
|
||||
/**
|
||||
* Renders an approval dialog for OAuth authorization
|
||||
* The dialog displays information about the client and server
|
||||
* and includes a form to submit approval
|
||||
*
|
||||
* @param request - The HTTP request
|
||||
* @param options - Configuration for the approval dialog
|
||||
* @returns A Response containing the HTML approval dialog
|
||||
*/
|
||||
export function renderApprovalDialog(request: Request, options: ApprovalDialogOptions): Response {
|
||||
const { client, server, state } = options;
|
||||
|
||||
// Encode state for form submission
|
||||
const encodedState = btoa(JSON.stringify(state));
|
||||
|
||||
// Sanitize any untrusted content
|
||||
const serverName = sanitizeHtml(server.name);
|
||||
const clientName = client?.clientName ? sanitizeHtml(client.clientName) : "Unknown MCP Client";
|
||||
const serverDescription = server.description ? sanitizeHtml(server.description) : "";
|
||||
|
||||
// Safe URLs
|
||||
const logoUrl = server.logo ? sanitizeHtml(server.logo) : "";
|
||||
const clientUri = client?.clientUri ? sanitizeHtml(client.clientUri) : "";
|
||||
const policyUri = client?.policyUri ? sanitizeHtml(client.policyUri) : "";
|
||||
const tosUri = client?.tosUri ? sanitizeHtml(client.tosUri) : "";
|
||||
|
||||
// Client contacts
|
||||
const contacts =
|
||||
client?.contacts && client.contacts.length > 0
|
||||
? sanitizeHtml(client.contacts.join(", "))
|
||||
: "";
|
||||
|
||||
// Get redirect URIs
|
||||
const redirectUris =
|
||||
client?.redirectUris && client.redirectUris.length > 0
|
||||
? client.redirectUris.map((uri) => sanitizeHtml(uri))
|
||||
: [];
|
||||
|
||||
// Generate HTML for the approval dialog
|
||||
const htmlContent = `
|
||||
<!DOCTYPE html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="UTF-8">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1.0">
|
||||
<title>${clientName} | Authorization Request</title>
|
||||
<style>
|
||||
/* Modern, responsive styling with system fonts */
|
||||
:root {
|
||||
--primary-color: #0070f3;
|
||||
--error-color: #f44336;
|
||||
--border-color: #e5e7eb;
|
||||
--text-color: #333;
|
||||
--background-color: #fff;
|
||||
--card-shadow: 0 8px 36px 8px rgba(0, 0, 0, 0.1);
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: -apple-system, BlinkMacSystemFont, "Segoe UI", Roboto,
|
||||
Helvetica, Arial, sans-serif, "Apple Color Emoji",
|
||||
"Segoe UI Emoji", "Segoe UI Symbol";
|
||||
line-height: 1.6;
|
||||
color: var(--text-color);
|
||||
background-color: #f9fafb;
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
.container {
|
||||
max-width: 600px;
|
||||
margin: 2rem auto;
|
||||
padding: 1rem;
|
||||
}
|
||||
|
||||
.precard {
|
||||
padding: 2rem;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.card {
|
||||
background-color: var(--background-color);
|
||||
border-radius: 8px;
|
||||
box-shadow: var(--card-shadow);
|
||||
padding: 2rem;
|
||||
}
|
||||
|
||||
.header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
margin-bottom: 1.5rem;
|
||||
}
|
||||
|
||||
.logo {
|
||||
width: 48px;
|
||||
height: 48px;
|
||||
margin-right: 1rem;
|
||||
border-radius: 8px;
|
||||
object-fit: contain;
|
||||
}
|
||||
|
||||
.title {
|
||||
margin: 0;
|
||||
font-size: 1.3rem;
|
||||
font-weight: 400;
|
||||
}
|
||||
|
||||
.alert {
|
||||
margin: 0;
|
||||
font-size: 1.5rem;
|
||||
font-weight: 400;
|
||||
margin: 1rem 0;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.description {
|
||||
color: #555;
|
||||
}
|
||||
|
||||
.client-info {
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: 6px;
|
||||
padding: 1rem 1rem 0.5rem;
|
||||
margin-bottom: 1.5rem;
|
||||
}
|
||||
|
||||
.client-name {
|
||||
font-weight: 600;
|
||||
font-size: 1.2rem;
|
||||
margin: 0 0 0.5rem 0;
|
||||
}
|
||||
|
||||
.client-detail {
|
||||
display: flex;
|
||||
margin-bottom: 0.5rem;
|
||||
align-items: baseline;
|
||||
}
|
||||
|
||||
.detail-label {
|
||||
font-weight: 500;
|
||||
min-width: 120px;
|
||||
}
|
||||
|
||||
.detail-value {
|
||||
font-family: SFMono-Regular, Menlo, Monaco, Consolas, "Liberation Mono", "Courier New", monospace;
|
||||
word-break: break-all;
|
||||
}
|
||||
|
||||
.detail-value a {
|
||||
color: inherit;
|
||||
text-decoration: underline;
|
||||
}
|
||||
|
||||
.detail-value.small {
|
||||
font-size: 0.8em;
|
||||
}
|
||||
|
||||
.external-link-icon {
|
||||
font-size: 0.75em;
|
||||
margin-left: 0.25rem;
|
||||
vertical-align: super;
|
||||
}
|
||||
|
||||
.actions {
|
||||
display: flex;
|
||||
justify-content: flex-end;
|
||||
gap: 1rem;
|
||||
margin-top: 2rem;
|
||||
}
|
||||
|
||||
.button {
|
||||
padding: 0.75rem 1.5rem;
|
||||
border-radius: 6px;
|
||||
font-weight: 500;
|
||||
cursor: pointer;
|
||||
border: none;
|
||||
font-size: 1rem;
|
||||
}
|
||||
|
||||
.button-primary {
|
||||
background-color: var(--primary-color);
|
||||
color: white;
|
||||
}
|
||||
|
||||
.button-secondary {
|
||||
background-color: transparent;
|
||||
border: 1px solid var(--border-color);
|
||||
color: var(--text-color);
|
||||
}
|
||||
|
||||
/* Responsive adjustments */
|
||||
@media (max-width: 640px) {
|
||||
.container {
|
||||
margin: 1rem auto;
|
||||
padding: 0.5rem;
|
||||
}
|
||||
|
||||
.card {
|
||||
padding: 1.5rem;
|
||||
}
|
||||
|
||||
.client-detail {
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.detail-label {
|
||||
min-width: unset;
|
||||
margin-bottom: 0.25rem;
|
||||
}
|
||||
|
||||
.actions {
|
||||
flex-direction: column;
|
||||
}
|
||||
|
||||
.button {
|
||||
width: 100%;
|
||||
}
|
||||
}
|
||||
</style>
|
||||
</head>
|
||||
<body>
|
||||
<div class="container">
|
||||
<div class="precard">
|
||||
<div class="header">
|
||||
${logoUrl ? `<img src="${logoUrl}" alt="${serverName} Logo" class="logo">` : ""}
|
||||
<h1 class="title"><strong>${serverName}</strong></h1>
|
||||
</div>
|
||||
|
||||
${serverDescription ? `<p class="description">${serverDescription}</p>` : ""}
|
||||
</div>
|
||||
|
||||
<div class="card">
|
||||
|
||||
<h2 class="alert"><strong>${clientName || "A new MCP Client"}</strong> is requesting access</h2>
|
||||
|
||||
<div class="client-info">
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Name:</div>
|
||||
<div class="detail-value">
|
||||
${clientName}
|
||||
</div>
|
||||
</div>
|
||||
|
||||
${
|
||||
clientUri
|
||||
? `
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Website:</div>
|
||||
<div class="detail-value small">
|
||||
<a href="${clientUri}" target="_blank" rel="noopener noreferrer">
|
||||
${clientUri}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
`
|
||||
: ""
|
||||
}
|
||||
|
||||
${
|
||||
policyUri
|
||||
? `
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Privacy Policy:</div>
|
||||
<div class="detail-value">
|
||||
<a href="${policyUri}" target="_blank" rel="noopener noreferrer">
|
||||
${policyUri}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
`
|
||||
: ""
|
||||
}
|
||||
|
||||
${
|
||||
tosUri
|
||||
? `
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Terms of Service:</div>
|
||||
<div class="detail-value">
|
||||
<a href="${tosUri}" target="_blank" rel="noopener noreferrer">
|
||||
${tosUri}
|
||||
</a>
|
||||
</div>
|
||||
</div>
|
||||
`
|
||||
: ""
|
||||
}
|
||||
|
||||
${
|
||||
redirectUris.length > 0
|
||||
? `
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Redirect URIs:</div>
|
||||
<div class="detail-value small">
|
||||
${redirectUris.map((uri) => `<div>${uri}</div>`).join("")}
|
||||
</div>
|
||||
</div>
|
||||
`
|
||||
: ""
|
||||
}
|
||||
|
||||
${
|
||||
contacts
|
||||
? `
|
||||
<div class="client-detail">
|
||||
<div class="detail-label">Contact:</div>
|
||||
<div class="detail-value">${contacts}</div>
|
||||
</div>
|
||||
`
|
||||
: ""
|
||||
}
|
||||
</div>
|
||||
|
||||
<p>This MCP Client is requesting to be authorized on ${serverName}. If you approve, you will be redirected to complete authentication.</p>
|
||||
|
||||
<form method="post" action="${new URL(request.url).pathname}">
|
||||
<input type="hidden" name="state" value="${encodedState}">
|
||||
|
||||
<div class="actions">
|
||||
<button type="button" class="button button-secondary" onclick="window.history.back()">Cancel</button>
|
||||
<button type="submit" class="button button-primary">Approve</button>
|
||||
</div>
|
||||
</form>
|
||||
</div>
|
||||
</div>
|
||||
</body>
|
||||
</html>
|
||||
`;
|
||||
|
||||
return new Response(htmlContent, {
|
||||
headers: {
|
||||
"Content-Type": "text/html; charset=utf-8",
|
||||
},
|
||||
});
|
||||
}
|
||||
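A minimal call site for `renderApprovalDialog` outside the GitHub handler might look like the sketch below; the server metadata is placeholder text and the caller is assumed to sit next to this file and import from `./oauth-utils`:

import type { AuthRequest } from "@cloudflare/workers-oauth-provider";
import { renderApprovalDialog } from "./oauth-utils";

// Sketch: showing the approval screen from any fetch handler (metadata is illustrative)
function showApproval(request: Request, oauthReqInfo: AuthRequest): Response {
  return renderApprovalDialog(request, {
    client: null, // or the ClientInfo returned by OAUTH_PROVIDER.lookupClient(clientId)
    server: {
      name: "Example MCP Server",
      description: "Demo approval screen",
    },
    state: { oauthReqInfo }, // handed back to parseRedirectApproval when the form is POSTed
  });
}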
|
||||
|
||||
/**
|
||||
* Parses the form submission from the approval dialog, extracts the state,
|
||||
* and generates Set-Cookie headers to mark the client as approved.
|
||||
*
|
||||
* @param request - The incoming POST Request object containing the form data.
|
||||
* @param cookieSecret - The secret key used to sign the approval cookie.
|
||||
* @returns A promise resolving to an object containing the parsed state and necessary headers.
|
||||
* @throws If the request method is not POST, form data is invalid, or state is missing.
|
||||
*/
|
||||
export async function parseRedirectApproval(
|
||||
request: Request,
|
||||
cookieSecret: string,
|
||||
): Promise<ParsedApprovalResult> {
|
||||
if (request.method !== "POST") {
|
||||
throw new Error("Invalid request method. Expected POST.");
|
||||
}
|
||||
|
||||
let state: any;
|
||||
let clientId: string | undefined;
|
||||
|
||||
try {
|
||||
const formData = await request.formData();
|
||||
const encodedState = formData.get("state");
|
||||
|
||||
if (typeof encodedState !== "string" || !encodedState) {
|
||||
throw new Error("Missing or invalid 'state' in form data.");
|
||||
}
|
||||
|
||||
state = decodeState<{ oauthReqInfo?: AuthRequest }>(encodedState); // Decode the state
|
||||
clientId = state?.oauthReqInfo?.clientId; // Extract clientId from within the state
|
||||
|
||||
if (!clientId) {
|
||||
throw new Error("Could not extract clientId from state object.");
|
||||
}
|
||||
} catch (e) {
|
||||
console.error("Error processing form submission:", e);
|
||||
// Rethrow or handle as appropriate, maybe return a specific error response
|
||||
throw new Error(
|
||||
`Failed to parse approval form: ${e instanceof Error ? e.message : String(e)}`,
|
||||
);
|
||||
}
|
||||
|
||||
// Get existing approved clients
|
||||
const cookieHeader = request.headers.get("Cookie");
|
||||
const existingApprovedClients =
|
||||
(await getApprovedClientsFromCookie(cookieHeader, cookieSecret)) || [];
|
||||
|
||||
// Add the newly approved client ID (avoid duplicates)
|
||||
const updatedApprovedClients = Array.from(new Set([...existingApprovedClients, clientId]));
|
||||
|
||||
// Sign the updated list
|
||||
const payload = JSON.stringify(updatedApprovedClients);
|
||||
const key = await importKey(cookieSecret);
|
||||
const signature = await signData(key, payload);
|
||||
const newCookieValue = `${signature}.${btoa(payload)}`; // signature.base64(payload)
|
||||
|
||||
// Generate Set-Cookie header
|
||||
const headers: Record<string, string> = {
|
||||
"Set-Cookie": `${COOKIE_NAME}=${newCookieValue}; HttpOnly; Secure; Path=/; SameSite=Lax; Max-Age=${ONE_YEAR_IN_SECONDS}`,
|
||||
};
|
||||
|
||||
return { headers, state };
|
||||
}
|
||||
|
||||
/**
|
||||
* Sanitizes HTML content to prevent XSS attacks
|
||||
* @param unsafe - The unsafe string that might contain HTML
|
||||
* @returns A safe string with HTML special characters escaped
|
||||
*/
|
||||
function sanitizeHtml(unsafe: string): string {
  return unsafe
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#039;");
}
|
||||
|
||||
// --- OAuth Helper Functions ---
|
||||
|
||||
/**
|
||||
* Constructs an authorization URL for an upstream service.
|
||||
*
|
||||
* @param {UpstreamAuthorizeParams} options - The parameters for constructing the URL
|
||||
* @returns {string} The authorization URL.
|
||||
*/
|
||||
export function getUpstreamAuthorizeUrl({
|
||||
upstream_url,
|
||||
client_id,
|
||||
scope,
|
||||
redirect_uri,
|
||||
state,
|
||||
}: UpstreamAuthorizeParams): string {
|
||||
const upstream = new URL(upstream_url);
|
||||
upstream.searchParams.set("client_id", client_id);
|
||||
upstream.searchParams.set("redirect_uri", redirect_uri);
|
||||
upstream.searchParams.set("scope", scope);
|
||||
if (state) upstream.searchParams.set("state", state);
|
||||
upstream.searchParams.set("response_type", "code");
|
||||
return upstream.href;
|
||||
}
|
||||
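A quick, hedged example of the URL this builds, assuming it runs alongside the function above; the client ID, redirect URI, and state are placeholders:

// Sketch: building a GitHub authorize URL (placeholder values)
const exampleAuthorizeUrl = getUpstreamAuthorizeUrl({
  upstream_url: "https://github.com/login/oauth/authorize",
  client_id: "Iv1.example",
  scope: "read:user",
  redirect_uri: "https://my-worker.example.workers.dev/callback",
  state: btoa(JSON.stringify({ clientId: "mcp-client-123" })),
});
// => https://github.com/login/oauth/authorize?client_id=Iv1.example&redirect_uri=...&scope=read%3Auser&state=...&response_type=code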
|
||||
/**
|
||||
* Fetches an authorization token from an upstream service.
|
||||
*
|
||||
* @param {UpstreamTokenParams} options - The parameters for the token exchange
|
||||
* @returns {Promise<[string, null] | [null, Response]>} A promise that resolves to an array containing the access token or an error response.
|
||||
*/
|
||||
export async function fetchUpstreamAuthToken({
|
||||
client_id,
|
||||
client_secret,
|
||||
code,
|
||||
redirect_uri,
|
||||
upstream_url,
|
||||
}: UpstreamTokenParams): Promise<[string, null] | [null, Response]> {
|
||||
if (!code) {
|
||||
return [null, new Response("Missing code", { status: 400 })];
|
||||
}
|
||||
|
||||
const resp = await fetch(upstream_url, {
|
||||
body: new URLSearchParams({ client_id, client_secret, code, redirect_uri }).toString(),
|
||||
headers: {
|
||||
"Content-Type": "application/x-www-form-urlencoded",
|
||||
},
|
||||
method: "POST",
|
||||
});
|
||||
if (!resp.ok) {
|
||||
console.log(await resp.text());
|
||||
return [null, new Response("Failed to fetch access token", { status: 500 })];
|
||||
}
|
||||
const body = await resp.formData();
|
||||
const accessToken = body.get("access_token") as string;
|
||||
if (!accessToken) {
|
||||
return [null, new Response("Missing access token", { status: 400 })];
|
||||
}
|
||||
return [accessToken, null];
|
||||
}
|
||||
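The `[value, error]` tuple keeps the happy path branch-free at the call site. A usage sketch, assuming it runs alongside `fetchUpstreamAuthToken` above; the client credentials are placeholders:

// Sketch: exchanging the temporary OAuth code for an access token
async function exchangeCode(code: string | undefined): Promise<string | Response> {
  const [accessToken, errResponse] = await fetchUpstreamAuthToken({
    upstream_url: "https://github.com/login/oauth/access_token",
    client_id: "Iv1.example",        // placeholder
    client_secret: "example-secret", // placeholder
    code,
    redirect_uri: "https://my-worker.example.workers.dev/callback",
  });
  if (errResponse) return errResponse; // a 400/500 Response describing the failure
  return accessToken;                  // the plain token string on success
}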
37  use-cases/mcp-server/src/database/connection.ts  Normal file
@@ -0,0 +1,37 @@
import postgres from "postgres";

let dbInstance: postgres.Sql | null = null;

/**
 * Get database connection singleton
 * Following the pattern from BASIC-DB-MCP.md but adapted for PostgreSQL with connection pooling
 */
export function getDb(databaseUrl: string): postgres.Sql {
  if (!dbInstance) {
    dbInstance = postgres(databaseUrl, {
      // Connection pool settings for Cloudflare Workers
      max: 5, // Maximum 5 connections to fit within Workers' limit of 6 concurrent connections
      idle_timeout: 20,
      connect_timeout: 10,
      // Enable prepared statements for better performance
      prepare: true,
    });
  }
  return dbInstance;
}

/**
 * Close database connection pool
 * Call this when the Durable Object is shutting down
 */
export async function closeDb(): Promise<void> {
  if (dbInstance) {
    try {
      await dbInstance.end();
    } catch (error) {
      console.error('Error closing database connection:', error);
    } finally {
      dbInstance = null;
    }
  }
}
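Because `postgres()` returns a tagged-template client, typical usage of the pooled instance looks like the sketch below; it assumes it sits next to `getDb` above, and the table and column names are illustrative. The driver sends template-literal values as bind parameters rather than interpolated text.

// Sketch: running a parameterized query through the shared pool
async function findUserByEmail(databaseUrl: string, email: string) {
  const db = getDb(databaseUrl);
  // `email` is passed as a bind parameter by postgres.js
  return db`SELECT id, name, email FROM users WHERE email = ${email}`;
}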
72  use-cases/mcp-server/src/database/security.ts  Normal file
@@ -0,0 +1,72 @@
import type { SqlValidationResult } from "../types";

/**
 * SQL injection protection: Basic SQL keyword validation
 * This is a simple check - in production you should use parameterized queries
 */
export function validateSqlQuery(sql: string): SqlValidationResult {
  const trimmedSql = sql.trim().toLowerCase();

  // Check for empty queries
  if (!trimmedSql) {
    return { isValid: false, error: "SQL query cannot be empty" };
  }

  // Check for obviously dangerous patterns
  const dangerousPatterns = [
    /;\s*drop\s+/i,
    /^drop\s+/i, // DROP at start of query
    /;\s*delete\s+.*\s+where\s+1\s*=\s*1/i,
    /;\s*update\s+.*\s+set\s+.*\s+where\s+1\s*=\s*1/i,
    /;\s*truncate\s+/i,
    /^truncate\s+/i, // TRUNCATE at start of query
    /;\s*alter\s+/i,
    /^alter\s+/i, // ALTER at start of query
    /;\s*create\s+/i,
    /;\s*grant\s+/i,
    /;\s*revoke\s+/i,
    /xp_cmdshell/i,
    /sp_executesql/i,
  ];

  for (const pattern of dangerousPatterns) {
    if (pattern.test(sql)) {
      return { isValid: false, error: "Query contains potentially dangerous SQL patterns" };
    }
  }

  return { isValid: true };
}

/**
 * Check if a SQL query is a write operation
 */
export function isWriteOperation(sql: string): boolean {
  const trimmedSql = sql.trim().toLowerCase();
  const writeKeywords = [
    'insert', 'update', 'delete', 'create', 'drop', 'alter',
    'truncate', 'grant', 'revoke', 'commit', 'rollback'
  ];

  return writeKeywords.some(keyword => trimmedSql.startsWith(keyword));
}

/**
 * Format database error for user-friendly display
 */
export function formatDatabaseError(error: unknown): string {
  if (error instanceof Error) {
    // Hide sensitive connection details
    if (error.message.includes('password')) {
      return "Database authentication failed. Please check your credentials.";
    }
    if (error.message.includes('timeout')) {
      return "Database connection timed out. Please try again.";
    }
    if (error.message.includes('connection') || error.message.includes('connect')) {
      return "Unable to connect to database. Please check your connection string.";
    }
    return `Database error: ${error.message}`;
  }
  return "An unknown database error occurred.";
}
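These helpers are meant to be composed in front of any tool that executes raw SQL. A sketch of that gate, assuming it lives next to `security.ts`; the `canWrite` flag stands in for whatever permission check the tool applies:

import { validateSqlQuery, isWriteOperation, formatDatabaseError } from "./security";

// Sketch: gate a raw SQL string before handing it to the database layer
async function guardedExecute(
  sql: string,
  canWrite: boolean,
  run: (sql: string) => Promise<unknown>,
): Promise<{ ok: boolean; result?: unknown; error?: string }> {
  const validation = validateSqlQuery(sql);
  if (!validation.isValid) {
    return { ok: false, error: validation.error };
  }
  if (isWriteOperation(sql) && !canWrite) {
    return { ok: false, error: "Write operations are not permitted for this user" };
  }
  try {
    return { ok: true, result: await run(sql) };
  } catch (error) {
    return { ok: false, error: formatDatabaseError(error) };
  }
}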
27  use-cases/mcp-server/src/database/utils.ts  Normal file
@@ -0,0 +1,27 @@
import postgres from "postgres";
import { getDb } from "./connection";

/**
 * Execute a database operation with proper connection management
 * Following the pattern from BASIC-DB-MCP.md but adapted for PostgreSQL
 */
export async function withDatabase<T>(
  databaseUrl: string,
  operation: (db: postgres.Sql) => Promise<T>
): Promise<T> {
  const db = getDb(databaseUrl);
  const startTime = Date.now();
  try {
    const result = await operation(db);
    const duration = Date.now() - startTime;
    console.log(`Database operation completed successfully in ${duration}ms`);
    return result;
  } catch (error) {
    const duration = Date.now() - startTime;
    console.error(`Database operation failed after ${duration}ms:`, error);
    // Re-throw the error so it can be caught by Sentry in the calling code
    throw error;
  }
  // Note: With PostgreSQL connection pooling, we don't close individual connections.
  // They're returned to the pool automatically. The pool is closed when the Durable Object shuts down.
}
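A typical call site for `withDatabase`, e.g. inside a tool handler; the import path assumes a module in the same directory and the query is illustrative:

import { withDatabase } from "./utils";

// Sketch: listing public table names through the connection helper
async function listTableNames(databaseUrl: string): Promise<string[]> {
  return withDatabase(databaseUrl, async (db) => {
    const rows = await db`
      SELECT table_name
      FROM information_schema.tables
      WHERE table_schema = 'public'
    `;
    return rows.map((row) => row.table_name as string);
  });
}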
49  use-cases/mcp-server/src/index.ts  Normal file
@@ -0,0 +1,49 @@
import OAuthProvider from "@cloudflare/workers-oauth-provider";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { McpAgent } from "agents/mcp";
import { Props } from "./types";
import { GitHubHandler } from "./auth/github-handler";
import { closeDb } from "./database/connection";
import { registerAllTools } from "./tools/register-tools";

export class MyMCP extends McpAgent<Env, Record<string, never>, Props> {
  server = new McpServer({
    name: "PostgreSQL Database MCP Server",
    version: "1.0.0",
  });

  /**
   * Cleanup database connections when Durable Object is shutting down
   */
  async cleanup(): Promise<void> {
    try {
      await closeDb();
      console.log('Database connections closed successfully');
    } catch (error) {
      console.error('Error during database cleanup:', error);
    }
  }

  /**
   * Durable Objects alarm handler - used for cleanup
   */
  async alarm(): Promise<void> {
    await this.cleanup();
  }

  async init() {
    // Register all tools based on user permissions
    registerAllTools(this.server, this.env, this.props);
  }
}

export default new OAuthProvider({
  apiHandlers: {
    '/sse': MyMCP.serveSSE('/sse') as any,
    '/mcp': MyMCP.serve('/mcp') as any,
  },
  authorizeEndpoint: "/authorize",
  clientRegistrationEndpoint: "/register",
  defaultHandler: GitHubHandler as any,
  tokenEndpoint: "/token",
});
68  use-cases/mcp-server/src/index_sentry.ts  Normal file
@@ -0,0 +1,68 @@
|
||||
import * as Sentry from "@sentry/cloudflare";
|
||||
import OAuthProvider from "@cloudflare/workers-oauth-provider";
|
||||
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
|
||||
import { McpAgent } from "agents/mcp";
|
||||
import { Props } from "./types";
|
||||
import { GitHubHandler } from "./auth/github-handler";
|
||||
import { closeDb } from "./database/connection";
|
||||
//@ts-ignore
|
||||
import { registerDatabaseToolsWithSentry } from "./tools/database-tools-sentry";
|
||||
|
||||
// Sentry configuration helper
|
||||
function getSentryConfig(env: Env) {
|
||||
return {
|
||||
// You can disable Sentry by setting SENTRY_DSN to a falsey-value
|
||||
dsn: (env as any).SENTRY_DSN,
|
||||
// A sample rate of 1.0 means "capture all traces"
|
||||
tracesSampleRate: 1,
|
||||
};
|
||||
}
|
||||
|
||||
export class MyMCP extends McpAgent<Env, Record<string, never>, Props> {
|
||||
server = new McpServer({
|
||||
name: "PostgreSQL Database MCP Server",
|
||||
version: "1.0.0",
|
||||
});
|
||||
|
||||
/**
|
||||
* Cleanup database connections when Durable Object is shutting down
|
||||
*/
|
||||
async cleanup(): Promise<void> {
|
||||
try {
|
||||
await closeDb();
|
||||
console.log('Database connections closed successfully');
|
||||
} catch (error) {
|
||||
console.error('Error during database cleanup:', error);
|
||||
}
|
||||
}
|
||||
|
||||
/**
|
||||
* Durable Objects alarm handler - used for cleanup
|
||||
*/
|
||||
async alarm(): Promise<void> {
|
||||
await this.cleanup();
|
||||
}
|
||||
|
||||
async init() {
|
||||
// Initialize Sentry
|
||||
const sentryConfig = getSentryConfig(this.env);
|
||||
if (sentryConfig.dsn) {
|
||||
// @ts-ignore - Sentry.init exists but types may not be complete
|
||||
Sentry.init(sentryConfig);
|
||||
}
|
||||
|
||||
// Register all tools with Sentry instrumentation
|
||||
registerDatabaseToolsWithSentry(this.server, this.env, this.props);
|
||||
}
|
||||
}
|
||||
|
||||
export default new OAuthProvider({
|
||||
apiHandlers: {
|
||||
'/sse': MyMCP.serveSSE('/sse') as any,
|
||||
'/mcp': MyMCP.serve('/mcp') as any,
|
||||
},
|
||||
authorizeEndpoint: "/authorize",
|
||||
clientRegistrationEndpoint: "/register",
|
||||
defaultHandler: GitHubHandler as any,
|
||||
tokenEndpoint: "/token",
|
||||
});
|
||||
14  use-cases/mcp-server/src/tools/register-tools.ts  Normal file
@@ -0,0 +1,14 @@
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { Props } from "../types";
import { registerDatabaseTools } from "../../examples/database-tools";

/**
 * Register all MCP tools based on user permissions
 */
export function registerAllTools(server: McpServer, env: Env, props: Props) {
  // Register database tools
  registerDatabaseTools(server, env, props);

  // Future tools can be registered here
  // registerOtherTools(server, env, props);
}
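Extending the server follows the same shape: each feature module exports a `registerXyzTools(server, env, props)` function and gets one call from `registerAllTools`. A hypothetical example (no weather module exists in this commit):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";
import { Props } from "../types";

// Hypothetical module: src/tools/weather-tools.ts (not part of this commit)
// `env` is unused here but kept to match the registerXyzTools signature.
export function registerWeatherTools(server: McpServer, env: Env, props: Props) {
  server.tool(
    "getForecast",
    "Get a weather forecast for a city (illustrative tool)",
    { city: z.string().min(1).describe("City name") },
    async ({ city }) => ({
      content: [{ type: "text", text: `Forecast for ${city}, requested by ${props.login}` }],
    }),
  );
}

// ...and in registerAllTools above:
// registerWeatherTools(server, env, props);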
126  use-cases/mcp-server/src/types.ts  Normal file
@@ -0,0 +1,126 @@
|
||||
import { z } from "zod";
|
||||
import type { AuthRequest, OAuthHelpers, ClientInfo } from "@cloudflare/workers-oauth-provider";
|
||||
|
||||
// User context passed through OAuth
|
||||
export type Props = {
|
||||
login: string;
|
||||
name: string;
|
||||
email: string;
|
||||
accessToken: string;
|
||||
};
|
||||
|
||||
// Extended environment with OAuth provider
|
||||
export type ExtendedEnv = Env & { OAUTH_PROVIDER: OAuthHelpers };
|
||||
|
||||
// OAuth URL construction parameters
|
||||
export interface UpstreamAuthorizeParams {
|
||||
upstream_url: string;
|
||||
client_id: string;
|
||||
scope: string;
|
||||
redirect_uri: string;
|
||||
state?: string;
|
||||
}
|
||||
|
||||
// OAuth token exchange parameters
|
||||
export interface UpstreamTokenParams {
|
||||
code: string | undefined;
|
||||
upstream_url: string;
|
||||
client_secret: string;
|
||||
redirect_uri: string;
|
||||
client_id: string;
|
||||
}
|
||||
|
||||
// Approval dialog configuration
|
||||
export interface ApprovalDialogOptions {
|
||||
client: ClientInfo | null;
|
||||
server: {
|
||||
name: string;
|
||||
logo?: string;
|
||||
description?: string;
|
||||
};
|
||||
state: Record<string, any>;
|
||||
cookieName?: string;
|
||||
cookieSecret?: string | Uint8Array;
|
||||
cookieDomain?: string;
|
||||
cookiePath?: string;
|
||||
cookieMaxAge?: number;
|
||||
}
|
||||
|
||||
// Result of parsing approval form
|
||||
export interface ParsedApprovalResult {
|
||||
state: any;
|
||||
headers: Record<string, string>;
|
||||
}
|
||||
|
||||
// MCP tool schemas using Zod
|
||||
export const ListTablesSchema = {};
|
||||
|
||||
export const QueryDatabaseSchema = {
|
||||
sql: z
|
||||
.string()
|
||||
.min(1, "SQL query cannot be empty")
|
||||
.describe("SQL query to execute (SELECT queries only)"),
|
||||
};
|
||||
|
||||
export const ExecuteDatabaseSchema = {
|
||||
sql: z
|
||||
.string()
|
||||
.min(1, "SQL command cannot be empty")
|
||||
.describe("SQL command to execute (INSERT, UPDATE, DELETE, CREATE, etc.)"),
|
||||
};
|
||||
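These schema objects are plain Zod shape records, which is the form `server.tool` accepts for parameter definitions. A sketch of how `QueryDatabaseSchema` could be wired up from a hypothetical tool module under src/tools/ (the `runQuery` callback and handler body are illustrative):

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { QueryDatabaseSchema, createSuccessResponse, createErrorResponse } from "../types";
import { validateSqlQuery } from "../database/security";

// Sketch: wiring QueryDatabaseSchema into a tool registration
export function registerQueryTool(server: McpServer, runQuery: (sql: string) => Promise<unknown>) {
  server.tool(
    "queryDatabase",
    "Execute a read-only SQL query",
    QueryDatabaseSchema,
    async ({ sql }) => {
      const validation = validateSqlQuery(sql);
      if (!validation.isValid) {
        return createErrorResponse(`Invalid SQL query: ${validation.error}`);
      }
      return createSuccessResponse("Query executed", await runQuery(sql));
    },
  );
}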
|
||||
// MCP response types
|
||||
export interface McpTextContent {
|
||||
type: "text";
|
||||
text: string;
|
||||
isError?: boolean;
|
||||
}
|
||||
|
||||
export interface McpResponse {
|
||||
content: McpTextContent[];
|
||||
}
|
||||
|
||||
// Standard response creators
|
||||
export function createSuccessResponse(message: string, data?: any): McpResponse {
|
||||
let text = `**Success**\n\n${message}`;
|
||||
if (data !== undefined) {
|
||||
text += `\n\n**Result:**\n\`\`\`json\n${JSON.stringify(data, null, 2)}\n\`\`\``;
|
||||
}
|
||||
return {
|
||||
content: [{
|
||||
type: "text",
|
||||
text,
|
||||
}],
|
||||
};
|
||||
}
|
||||
|
||||
export function createErrorResponse(message: string, details?: any): McpResponse {
|
||||
let text = `**Error**\n\n${message}`;
|
||||
if (details !== undefined) {
|
||||
text += `\n\n**Details:**\n\`\`\`json\n${JSON.stringify(details, null, 2)}\n\`\`\``;
|
||||
}
|
||||
return {
|
||||
content: [{
|
||||
type: "text",
|
||||
text,
|
||||
isError: true,
|
||||
}],
|
||||
};
|
||||
}
|
||||
|
||||
// Database operation result type
|
||||
export interface DatabaseOperationResult<T = any> {
|
||||
success: boolean;
|
||||
data?: T;
|
||||
error?: string;
|
||||
duration?: number;
|
||||
}
|
||||
|
||||
// SQL validation result
|
||||
export interface SqlValidationResult {
|
||||
isValid: boolean;
|
||||
error?: string;
|
||||
}
|
||||
|
||||
// Re-export external types that are used throughout
|
||||
export type { AuthRequest, OAuthHelpers, ClientInfo };
|
||||
45  use-cases/mcp-server/tests/fixtures/auth.fixtures.ts  vendored  Normal file
@@ -0,0 +1,45 @@
|
||||
import type { Props } from '../../src/types'
|
||||
|
||||
export const mockProps: Props = {
|
||||
login: 'testuser',
|
||||
name: 'Test User',
|
||||
email: 'test@example.com',
|
||||
accessToken: 'test-access-token',
|
||||
}
|
||||
|
||||
export const mockPrivilegedProps: Props = {
|
||||
login: 'coleam00',
|
||||
name: 'Cole Medin',
|
||||
email: 'cole@example.com',
|
||||
accessToken: 'privileged-access-token',
|
||||
}
|
||||
|
||||
export const mockGitHubUser = {
|
||||
data: {
|
||||
login: 'testuser',
|
||||
name: 'Test User',
|
||||
email: 'test@example.com',
|
||||
id: 12345,
|
||||
avatar_url: 'https://github.com/images/avatar.png',
|
||||
},
|
||||
}
|
||||
|
||||
export const mockAuthRequest = {
|
||||
clientId: 'test-client-id',
|
||||
redirectUri: 'http://localhost:3000/callback',
|
||||
scope: 'read:user',
|
||||
state: 'test-state',
|
||||
codeChallenge: 'test-challenge',
|
||||
codeChallengeMethod: 'S256',
|
||||
}
|
||||
|
||||
export const mockClientInfo = {
|
||||
id: 'test-client-id',
|
||||
name: 'Test Client',
|
||||
description: 'A test OAuth client',
|
||||
logoUrl: 'https://example.com/logo.png',
|
||||
}
|
||||
|
||||
export const mockAccessToken = 'github-access-token-123'
|
||||
export const mockAuthorizationCode = 'auth-code-456'
|
||||
export const mockState = 'oauth-state-789'
|
||||
64  use-cases/mcp-server/tests/fixtures/database.fixtures.ts  vendored  Normal file
@@ -0,0 +1,64 @@
|
||||
export const mockTableColumns = [
|
||||
{
|
||||
table_name: 'users',
|
||||
column_name: 'id',
|
||||
data_type: 'integer',
|
||||
is_nullable: 'NO',
|
||||
column_default: 'nextval(\'users_id_seq\'::regclass)',
|
||||
},
|
||||
{
|
||||
table_name: 'users',
|
||||
column_name: 'name',
|
||||
data_type: 'character varying',
|
||||
is_nullable: 'YES',
|
||||
column_default: null,
|
||||
},
|
||||
{
|
||||
table_name: 'users',
|
||||
column_name: 'email',
|
||||
data_type: 'character varying',
|
||||
is_nullable: 'NO',
|
||||
column_default: null,
|
||||
},
|
||||
{
|
||||
table_name: 'posts',
|
||||
column_name: 'id',
|
||||
data_type: 'integer',
|
||||
is_nullable: 'NO',
|
||||
column_default: 'nextval(\'posts_id_seq\'::regclass)',
|
||||
},
|
||||
{
|
||||
table_name: 'posts',
|
||||
column_name: 'title',
|
||||
data_type: 'text',
|
||||
is_nullable: 'NO',
|
||||
column_default: null,
|
||||
},
|
||||
{
|
||||
table_name: 'posts',
|
||||
column_name: 'user_id',
|
||||
data_type: 'integer',
|
||||
is_nullable: 'NO',
|
||||
column_default: null,
|
||||
},
|
||||
]
|
||||
|
||||
export const mockQueryResult = [
|
||||
{ id: 1, name: 'John Doe', email: 'john@example.com' },
|
||||
{ id: 2, name: 'Jane Smith', email: 'jane@example.com' },
|
||||
]
|
||||
|
||||
export const mockInsertResult = [
|
||||
{ id: 3, name: 'New User', email: 'new@example.com' },
|
||||
]
|
||||
|
||||
export const validSelectQuery = 'SELECT * FROM users WHERE id = 1'
|
||||
export const validInsertQuery = 'INSERT INTO users (name, email) VALUES (\'Test\', \'test@example.com\')'
|
||||
export const validUpdateQuery = 'UPDATE users SET name = \'Updated\' WHERE id = 1'
|
||||
export const validDeleteQuery = 'DELETE FROM users WHERE id = 1'
|
||||
|
||||
export const dangerousDropQuery = 'DROP TABLE users'
|
||||
export const dangerousDeleteAllQuery = 'SELECT * FROM users; DELETE FROM users WHERE 1=1'
|
||||
export const maliciousInjectionQuery = 'SELECT * FROM users; DROP TABLE users; --'
|
||||
export const emptyQuery = ''
|
||||
export const whitespaceQuery = ' '
|
||||
38  use-cases/mcp-server/tests/fixtures/mcp.fixtures.ts  vendored  Normal file
@@ -0,0 +1,38 @@
|
||||
import type { McpResponse } from '../../src/types'
|
||||
|
||||
export const mockSuccessResponse: McpResponse = {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: '**Success**\n\nOperation completed successfully',
|
||||
},
|
||||
],
|
||||
}
|
||||
|
||||
export const mockErrorResponse: McpResponse = {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: '**Error**\n\nSomething went wrong',
|
||||
isError: true,
|
||||
},
|
||||
],
|
||||
}
|
||||
|
||||
export const mockQueryResponse: McpResponse = {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: '**Query Results**\n```sql\nSELECT * FROM users\n```\n\n**Results:**\n```json\n[\n {\n "id": 1,\n "name": "John Doe"\n }\n]\n```\n\n**Rows returned:** 1',
|
||||
},
|
||||
],
|
||||
}
|
||||
|
||||
export const mockTableListResponse: McpResponse = {
|
||||
content: [
|
||||
{
|
||||
type: 'text',
|
||||
text: '**Database Tables and Schema**\n\n[\n {\n "name": "users",\n "schema": "public",\n "columns": [\n {\n "name": "id",\n "type": "integer",\n "nullable": false,\n "default": "nextval(\'users_id_seq\'::regclass)"\n }\n ]\n }\n]\n\n**Total tables found:** 1\n\n**Note:** Use the `queryDatabase` tool to run SELECT queries, or `executeDatabase` tool for write operations (if you have write access).',
|
||||
},
|
||||
],
|
||||
}
|
||||
54  use-cases/mcp-server/tests/mocks/crypto.mock.ts  Normal file
@@ -0,0 +1,54 @@
|
||||
import { vi } from 'vitest'
|
||||
|
||||
// Mock crypto.subtle for cookie signing
|
||||
export const mockCryptoSubtle = {
|
||||
sign: vi.fn(),
|
||||
verify: vi.fn(),
|
||||
importKey: vi.fn(),
|
||||
}
|
||||
|
||||
// Mock crypto.getRandomValues
|
||||
export const mockGetRandomValues = vi.fn()
|
||||
|
||||
export function setupCryptoMocks() {
|
||||
// Mock HMAC signing
|
||||
mockCryptoSubtle.sign.mockResolvedValue(new ArrayBuffer(32))
|
||||
|
||||
// Mock signature verification
|
||||
mockCryptoSubtle.verify.mockResolvedValue(true)
|
||||
|
||||
// Mock key import
|
||||
mockCryptoSubtle.importKey.mockResolvedValue({} as CryptoKey)
|
||||
|
||||
// Mock random values
|
||||
mockGetRandomValues.mockImplementation((array: Uint8Array) => {
|
||||
for (let i = 0; i < array.length; i++) {
|
||||
array[i] = Math.floor(Math.random() * 256)
|
||||
}
|
||||
return array
|
||||
})
|
||||
}
|
||||
|
||||
export function setupCryptoError() {
|
||||
mockCryptoSubtle.sign.mockRejectedValue(new Error('Crypto signing failed'))
|
||||
mockCryptoSubtle.verify.mockRejectedValue(new Error('Crypto verification failed'))
|
||||
}
|
||||
|
||||
export function resetCryptoMocks() {
|
||||
vi.clearAllMocks()
|
||||
setupCryptoMocks()
|
||||
}
|
||||
|
||||
// Apply mocks to global crypto object
|
||||
if (!global.crypto) {
|
||||
Object.defineProperty(global, 'crypto', {
|
||||
value: {
|
||||
subtle: mockCryptoSubtle,
|
||||
getRandomValues: mockGetRandomValues,
|
||||
},
|
||||
writable: true,
|
||||
})
|
||||
} else {
|
||||
global.crypto.subtle = mockCryptoSubtle
|
||||
global.crypto.getRandomValues = mockGetRandomValues
|
||||
}
|
||||
57  use-cases/mcp-server/tests/mocks/database.mock.ts  Normal file
@@ -0,0 +1,57 @@
|
||||
import { vi } from 'vitest'
|
||||
import { mockTableColumns, mockQueryResult } from '../fixtures/database.fixtures'
|
||||
|
||||
// Mock postgres function
|
||||
export const mockPostgresInstance = {
|
||||
unsafe: vi.fn(),
|
||||
end: vi.fn(),
|
||||
// Template literal query method
|
||||
'`SELECT * FROM users`': vi.fn(),
|
||||
}
|
||||
|
||||
// Mock the postgres module
|
||||
vi.mock('postgres', () => ({
|
||||
default: vi.fn(() => mockPostgresInstance),
|
||||
}))
|
||||
|
||||
// Mock database connection functions
|
||||
vi.mock('../../src/database/connection', () => ({
|
||||
getDb: vi.fn(() => mockPostgresInstance),
|
||||
closeDb: vi.fn(),
|
||||
}))
|
||||
|
||||
// Mock database utils
|
||||
vi.mock('../../src/database/utils', () => ({
|
||||
withDatabase: vi.fn(async (url: string, operation: any) => {
|
||||
return await operation(mockPostgresInstance)
|
||||
}),
|
||||
}))
|
||||
|
||||
// Mock setup functions
|
||||
export function setupDatabaseMocks() {
|
||||
mockPostgresInstance.unsafe.mockImplementation((query: string) => {
|
||||
if (query.includes('information_schema.columns')) {
|
||||
return Promise.resolve(mockTableColumns)
|
||||
}
|
||||
if (query.includes('SELECT')) {
|
||||
return Promise.resolve(mockQueryResult)
|
||||
}
|
||||
if (query.includes('INSERT') || query.includes('UPDATE') || query.includes('DELETE')) {
|
||||
return Promise.resolve([{ affectedRows: 1 }])
|
||||
}
|
||||
return Promise.resolve([])
|
||||
})
|
||||
}
|
||||
|
||||
export function setupDatabaseError() {
|
||||
mockPostgresInstance.unsafe.mockRejectedValue(new Error('Database connection failed'))
|
||||
}
|
||||
|
||||
export function setupDatabaseTimeout() {
|
||||
mockPostgresInstance.unsafe.mockRejectedValue(new Error('Connection timeout'))
|
||||
}
|
||||
|
||||
export function resetDatabaseMocks() {
|
||||
vi.clearAllMocks()
|
||||
setupDatabaseMocks()
|
||||
}
|
||||
59  use-cases/mcp-server/tests/mocks/github.mock.ts  Normal file
@@ -0,0 +1,59 @@
|
||||
import { vi } from 'vitest'
|
||||
import { mockGitHubUser, mockAccessToken } from '../fixtures/auth.fixtures'
|
||||
|
||||
// Mock Octokit
|
||||
export const mockOctokit = {
|
||||
rest: {
|
||||
users: {
|
||||
getAuthenticated: vi.fn(),
|
||||
},
|
||||
},
|
||||
}
|
||||
|
||||
vi.mock('octokit', () => ({
|
||||
Octokit: vi.fn(() => mockOctokit),
|
||||
}))
|
||||
|
||||
// Mock GitHub API responses
|
||||
export function setupGitHubMocks() {
|
||||
mockOctokit.rest.users.getAuthenticated.mockResolvedValue(mockGitHubUser)
|
||||
}
|
||||
|
||||
export function setupGitHubError() {
|
||||
mockOctokit.rest.users.getAuthenticated.mockRejectedValue(new Error('GitHub API error'))
|
||||
}
|
||||
|
||||
export function setupGitHubUnauthorized() {
|
||||
mockOctokit.rest.users.getAuthenticated.mockRejectedValue(new Error('Bad credentials'))
|
||||
}
|
||||
|
||||
export function resetGitHubMocks() {
|
||||
vi.clearAllMocks()
|
||||
setupGitHubMocks()
|
||||
}
|
||||
|
||||
// Mock fetch for GitHub OAuth token exchange
|
||||
export function setupGitHubTokenExchange() {
|
||||
global.fetch = vi.fn((url: string) => {
|
||||
if (url.includes('github.com/login/oauth/access_token')) {
|
||||
return Promise.resolve({
|
||||
ok: true,
|
||||
text: () => Promise.resolve(`access_token=${mockAccessToken}&token_type=bearer&scope=read:user`),
|
||||
} as Response)
|
||||
}
|
||||
return Promise.reject(new Error('Unexpected fetch call'))
|
||||
})
|
||||
}
|
||||
|
||||
export function setupGitHubTokenExchangeError() {
|
||||
global.fetch = vi.fn((url: string) => {
|
||||
if (url.includes('github.com/login/oauth/access_token')) {
|
||||
return Promise.resolve({
|
||||
ok: false,
|
||||
status: 400,
|
||||
text: () => Promise.resolve('error=invalid_grant&error_description=Bad verification code.'),
|
||||
} as Response)
|
||||
}
|
||||
return Promise.reject(new Error('Unexpected fetch call'))
|
||||
})
|
||||
}
|
||||
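A sketch of how these mocks would be combined in a vitest case; the test file location (under tests/unit/, hence the import paths) and the assertions are illustrative:

import { describe, it, expect, beforeEach } from 'vitest'
import { mockOctokit, setupGitHubTokenExchange, resetGitHubMocks } from '../../mocks/github.mock'
import { mockGitHubUser, mockAccessToken } from '../../fixtures/auth.fixtures'

describe('GitHub mocks (illustrative)', () => {
  beforeEach(() => {
    resetGitHubMocks()
    setupGitHubTokenExchange()
  })

  it('returns the mocked token and user', async () => {
    const tokenResponse = await fetch('https://github.com/login/oauth/access_token')
    expect(await tokenResponse.text()).toContain(mockAccessToken)

    const user = await mockOctokit.rest.users.getAuthenticated()
    expect(user).toEqual(mockGitHubUser)
  })
})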
47  use-cases/mcp-server/tests/mocks/oauth.mock.ts  Normal file
@@ -0,0 +1,47 @@
|
||||
import { vi } from 'vitest'
|
||||
import { mockAuthRequest, mockClientInfo } from '../fixtures/auth.fixtures'
|
||||
|
||||
// Mock OAuth provider
|
||||
export const mockOAuthProvider = {
|
||||
parseAuthRequest: vi.fn(),
|
||||
lookupClient: vi.fn(),
|
||||
completeAuthorization: vi.fn(),
|
||||
}
|
||||
|
||||
// Mock OAuth helpers
|
||||
export const mockOAuthHelpers = {
|
||||
...mockOAuthProvider,
|
||||
}
|
||||
|
||||
// Mock Cloudflare Workers OAuth Provider
|
||||
vi.mock('@cloudflare/workers-oauth-provider', () => ({
|
||||
default: vi.fn(() => ({
|
||||
fetch: vi.fn(),
|
||||
})),
|
||||
}))
|
||||
|
||||
export function setupOAuthMocks() {
|
||||
mockOAuthProvider.parseAuthRequest.mockResolvedValue(mockAuthRequest)
|
||||
mockOAuthProvider.lookupClient.mockResolvedValue(mockClientInfo)
|
||||
mockOAuthProvider.completeAuthorization.mockResolvedValue({
|
||||
redirectTo: 'http://localhost:3000/callback?code=success',
|
||||
})
|
||||
}
|
||||
|
||||
export function setupOAuthError() {
|
||||
mockOAuthProvider.parseAuthRequest.mockRejectedValue(new Error('Invalid OAuth request'))
|
||||
}
|
||||
|
||||
export function resetOAuthMocks() {
|
||||
vi.clearAllMocks()
|
||||
setupOAuthMocks()
|
||||
}
|
||||
|
||||
// Mock environment with OAuth provider
|
||||
export const mockEnv = {
|
||||
GITHUB_CLIENT_ID: 'test-client-id',
|
||||
GITHUB_CLIENT_SECRET: 'test-client-secret',
|
||||
COOKIE_ENCRYPTION_KEY: 'test-encryption-key',
|
||||
DATABASE_URL: 'postgresql://test:test@localhost:5432/test',
|
||||
OAUTH_PROVIDER: mockOAuthProvider,
|
||||
}
|
||||
20  use-cases/mcp-server/tests/setup.ts  Normal file
@@ -0,0 +1,20 @@
import { beforeEach, vi } from 'vitest'

// Mock crypto API for Node.js environment
Object.defineProperty(global, 'crypto', {
  value: {
    subtle: {
      sign: vi.fn(),
      verify: vi.fn(),
      importKey: vi.fn(),
    },
    getRandomValues: vi.fn(),
  },
})

// Mock fetch globally
global.fetch = vi.fn()

beforeEach(() => {
  vi.clearAllMocks()
})
135  use-cases/mcp-server/tests/unit/database/security.test.ts  Normal file
@@ -0,0 +1,135 @@
|
||||
import { describe, it, expect } from 'vitest'
|
||||
import { validateSqlQuery, isWriteOperation, formatDatabaseError } from '../../../src/database/security'
|
||||
import {
|
||||
validSelectQuery,
|
||||
validInsertQuery,
|
||||
validUpdateQuery,
|
||||
validDeleteQuery,
|
||||
dangerousDropQuery,
|
||||
dangerousDeleteAllQuery,
|
||||
maliciousInjectionQuery,
|
||||
emptyQuery,
|
||||
whitespaceQuery,
|
||||
} from '../../fixtures/database.fixtures'
|
||||
|
||||
describe('Database Security', () => {
|
||||
describe('validateSqlQuery', () => {
|
||||
it('should validate safe SELECT queries', () => {
|
||||
const result = validateSqlQuery(validSelectQuery)
|
||||
expect(result.isValid).toBe(true)
|
||||
expect(result.error).toBeUndefined()
|
||||
})
|
||||
|
||||
it('should validate safe INSERT queries', () => {
|
||||
const result = validateSqlQuery(validInsertQuery)
|
||||
expect(result.isValid).toBe(true)
|
||||
expect(result.error).toBeUndefined()
|
||||
})
|
||||
|
||||
it('should reject empty queries', () => {
|
||||
const result = validateSqlQuery(emptyQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('SQL query cannot be empty')
|
||||
})
|
||||
|
||||
it('should reject whitespace-only queries', () => {
|
||||
const result = validateSqlQuery(whitespaceQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('SQL query cannot be empty')
|
||||
})
|
||||
|
||||
it('should reject dangerous DROP queries', () => {
|
||||
const result = validateSqlQuery(dangerousDropQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('Query contains potentially dangerous SQL patterns')
|
||||
})
|
||||
|
||||
it('should reject dangerous DELETE ALL queries', () => {
|
||||
const result = validateSqlQuery(dangerousDeleteAllQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('Query contains potentially dangerous SQL patterns')
|
||||
})
|
||||
|
||||
it('should reject SQL injection attempts', () => {
|
||||
const result = validateSqlQuery(maliciousInjectionQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('Query contains potentially dangerous SQL patterns')
|
||||
})
|
||||
|
||||
it('should handle case-insensitive dangerous patterns', () => {
|
||||
const upperCaseQuery = 'SELECT * FROM users; DROP TABLE users;'
|
||||
const result = validateSqlQuery(upperCaseQuery)
|
||||
expect(result.isValid).toBe(false)
|
||||
expect(result.error).toBe('Query contains potentially dangerous SQL patterns')
|
||||
})
|
||||
})
|
||||
|
||||
describe('isWriteOperation', () => {
|
||||
it('should identify SELECT as read operation', () => {
|
||||
expect(isWriteOperation(validSelectQuery)).toBe(false)
|
||||
})
|
||||
|
||||
it('should identify INSERT as write operation', () => {
|
||||
expect(isWriteOperation(validInsertQuery)).toBe(true)
|
||||
})
|
||||
|
||||
it('should identify UPDATE as write operation', () => {
|
||||
expect(isWriteOperation(validUpdateQuery)).toBe(true)
|
||||
})
|
||||
|
||||
it('should identify DELETE as write operation', () => {
|
||||
expect(isWriteOperation(validDeleteQuery)).toBe(true)
|
||||
})
|
||||
|
||||
it('should identify DROP as write operation', () => {
|
||||
expect(isWriteOperation(dangerousDropQuery)).toBe(true)
|
||||
})
|
||||
|
||||
it('should handle case-insensitive operations', () => {
|
||||
expect(isWriteOperation('insert into users values (1, \'test\')')).toBe(true)
|
||||
expect(isWriteOperation('UPDATE users SET name = \'test\'')).toBe(true)
|
||||
expect(isWriteOperation('Delete from users where id = 1')).toBe(true)
|
||||
})
|
||||
|
||||
it('should handle queries with leading whitespace', () => {
|
||||
expect(isWriteOperation(' INSERT INTO users VALUES (1, \'test\')')).toBe(true)
|
||||
expect(isWriteOperation('\t\nSELECT * FROM users')).toBe(false)
|
||||
})
|
||||
})
|
||||
|
||||
describe('formatDatabaseError', () => {
|
||||
it('should format generic database errors', () => {
|
||||
const error = new Error('Connection failed')
|
||||
const result = formatDatabaseError(error)
|
||||
expect(result).toBe('Database error: Connection failed')
|
||||
})
|
||||
|
||||
it('should sanitize password errors', () => {
|
||||
const error = new Error('authentication failed for user "test" with password "secret123"')
|
||||
const result = formatDatabaseError(error)
|
||||
expect(result).toBe('Database authentication failed. Please check your credentials.')
|
||||
})
|
||||
|
||||
it('should handle timeout errors', () => {
|
||||
const error = new Error('Connection timeout after 30 seconds')
|
||||
const result = formatDatabaseError(error)
|
||||
expect(result).toBe('Database connection timed out. Please try again.')
|
||||
})
|
||||
|
||||
it('should handle connection errors', () => {
|
||||
const error = new Error('Could not connect to database server')
|
||||
const result = formatDatabaseError(error)
|
||||
expect(result).toBe('Unable to connect to database. Please check your connection string.')
|
||||
})
|
||||
|
||||
it('should handle non-Error objects', () => {
|
||||
const result = formatDatabaseError('string error')
|
||||
expect(result).toBe('An unknown database error occurred.')
|
||||
})
|
||||
|
||||
it('should handle null/undefined errors', () => {
|
||||
expect(formatDatabaseError(null)).toBe('An unknown database error occurred.')
|
||||
expect(formatDatabaseError(undefined)).toBe('An unknown database error occurred.')
|
||||
})
|
||||
})
|
||||
})
|
||||
78  use-cases/mcp-server/tests/unit/database/utils.test.ts  Normal file
@@ -0,0 +1,78 @@
import { describe, it, expect, vi, beforeEach } from 'vitest'

// Mock the database connection module
const mockDbInstance = {
  unsafe: vi.fn(),
  end: vi.fn(),
}

vi.mock('../../../src/database/connection', () => ({
  getDb: vi.fn(() => mockDbInstance),
}))

// Now import the modules
import { withDatabase } from '../../../src/database/utils'

describe('Database Utils', () => {
  beforeEach(() => {
    vi.clearAllMocks()
  })

  describe('withDatabase', () => {
    it('should execute database operation successfully', async () => {
      const mockOperation = vi.fn().mockResolvedValue('success')
      const result = await withDatabase('test-url', mockOperation)

      expect(result).toBe('success')
      expect(mockOperation).toHaveBeenCalledWith(mockDbInstance)
    })

    it('should handle database operation errors', async () => {
      const mockOperation = vi.fn().mockRejectedValue(new Error('Operation failed'))

      await expect(withDatabase('test-url', mockOperation)).rejects.toThrow('Operation failed')
      expect(mockOperation).toHaveBeenCalledWith(mockDbInstance)
    })

    it('should log successful operations', async () => {
      const consoleSpy = vi.spyOn(console, 'log').mockImplementation(() => {})
      const mockOperation = vi.fn().mockResolvedValue('success')

      await withDatabase('test-url', mockOperation)

      expect(consoleSpy).toHaveBeenCalledWith(
        expect.stringMatching(/Database operation completed successfully in \d+ms/)
      )
      consoleSpy.mockRestore()
    })

    it('should log failed operations', async () => {
      const consoleSpy = vi.spyOn(console, 'error').mockImplementation(() => {})
      const mockOperation = vi.fn().mockRejectedValue(new Error('Operation failed'))

      await expect(withDatabase('test-url', mockOperation)).rejects.toThrow('Operation failed')

      expect(consoleSpy).toHaveBeenCalledWith(
        expect.stringMatching(/Database operation failed after \d+ms:/),
        expect.any(Error)
      )
      consoleSpy.mockRestore()
    })

    it('should measure execution time', async () => {
      const consoleSpy = vi.spyOn(console, 'log').mockImplementation(() => {})
      const mockOperation = vi.fn().mockImplementation(async () => {
        // Simulate some delay
        await new Promise(resolve => setTimeout(resolve, 10))
        return 'success'
      })

      await withDatabase('test-url', mockOperation)

      expect(consoleSpy).toHaveBeenCalledWith(
        expect.stringMatching(/Database operation completed successfully in \d+ms/)
      )
      consoleSpy.mockRestore()
    })
  })
})
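The `withDatabase` helper under test lives in `src/database/utils.ts` and is not shown in this diff. A minimal sketch that would satisfy these assertions — timing log on success, error log plus rethrow on failure, and the operation invoked with the connection returned by `getDb` — could look like the following; the actual implementation may handle connection reuse and cleanup differently:

```typescript
// Hypothetical sketch consistent with the tests above, not the repo's actual code.
import { getDb } from './connection'

export async function withDatabase<T>(
  databaseUrl: string,
  operation: (db: ReturnType<typeof getDb>) => Promise<T>
): Promise<T> {
  const db = getDb(databaseUrl)
  const start = Date.now()
  try {
    const result = await operation(db)
    console.log(`Database operation completed successfully in ${Date.now() - start}ms`)
    return result
  } catch (error) {
    console.error(`Database operation failed after ${Date.now() - start}ms:`, error)
    throw error
  }
}
```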
257  use-cases/mcp-server/tests/unit/tools/database-tools.test.ts  Normal file
@@ -0,0 +1,257 @@
import { describe, it, expect, vi, beforeEach } from 'vitest'

// Mock the database modules
const mockDbInstance = {
  unsafe: vi.fn(),
  end: vi.fn(),
}

vi.mock('../../../src/database/connection', () => ({
  getDb: vi.fn(() => mockDbInstance),
}))

vi.mock('../../../src/database/utils', () => ({
  withDatabase: vi.fn(async (url: string, operation: any) => {
    return await operation(mockDbInstance)
  }),
}))

// Now import the modules
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js'
import { registerDatabaseTools } from '../../../src/tools/database-tools'
import { mockProps, mockPrivilegedProps } from '../../fixtures/auth.fixtures'
import { mockEnv } from '../../mocks/oauth.mock'
import { mockTableColumns, mockQueryResult } from '../../fixtures/database.fixtures'

describe('Database Tools', () => {
  let mockServer: McpServer

  beforeEach(() => {
    vi.clearAllMocks()
    mockServer = new McpServer({ name: 'test', version: '1.0.0' })

    // Setup database mocks
    mockDbInstance.unsafe.mockImplementation((query: string) => {
      if (query.includes('information_schema.columns')) {
        return Promise.resolve(mockTableColumns)
      }
      if (query.includes('SELECT')) {
        return Promise.resolve(mockQueryResult)
      }
      if (query.includes('INSERT') || query.includes('UPDATE') || query.includes('DELETE')) {
        return Promise.resolve([{ affectedRows: 1 }])
      }
      return Promise.resolve([])
    })
  })

  describe('registerDatabaseTools', () => {
    it('should register listTables and queryDatabase for regular users', () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')

      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      expect(toolSpy).toHaveBeenCalledWith(
        'listTables',
        expect.any(String),
        expect.any(Object),
        expect.any(Function)
      )
      expect(toolSpy).toHaveBeenCalledWith(
        'queryDatabase',
        expect.any(String),
        expect.any(Object),
        expect.any(Function)
      )
      expect(toolSpy).toHaveBeenCalledTimes(2)
    })

    it('should register all tools for privileged users', () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')

      registerDatabaseTools(mockServer, mockEnv as any, mockPrivilegedProps)

      expect(toolSpy).toHaveBeenCalledWith(
        'listTables',
        expect.any(String),
        expect.any(Object),
        expect.any(Function)
      )
      expect(toolSpy).toHaveBeenCalledWith(
        'queryDatabase',
        expect.any(String),
        expect.any(Object),
        expect.any(Function)
      )
      expect(toolSpy).toHaveBeenCalledWith(
        'executeDatabase',
        expect.any(String),
        expect.any(Object),
        expect.any(Function)
      )
      expect(toolSpy).toHaveBeenCalledTimes(3)
    })
  })

  describe('listTables tool', () => {
    it('should return table schema successfully', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      // Get the registered tool handler
      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'listTables')
      const handler = toolCall![3] as Function

      const result = await handler({})

      expect(result.content).toBeDefined()
      expect(result.content[0].type).toBe('text')
      expect(result.content[0].text).toContain('Database Tables and Schema')
      expect(result.content[0].text).toContain('users')
      expect(result.content[0].text).toContain('posts')
    })

    it('should handle database errors', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      mockDbInstance.unsafe.mockRejectedValue(new Error('Database connection failed'))
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'listTables')
      const handler = toolCall![3] as Function

      const result = await handler({})

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Error')
    })
  })

  describe('queryDatabase tool', () => {
    it('should execute SELECT queries successfully', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'queryDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'SELECT * FROM users' })

      expect(result.content[0].type).toBe('text')
      expect(result.content[0].text).toContain('Query Results')
      expect(result.content[0].text).toContain('SELECT * FROM users')
    })

    it('should reject write operations', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'queryDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'INSERT INTO users VALUES (1, \'test\')' })

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Write operations are not allowed')
    })

    it('should reject invalid SQL', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'queryDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'SELECT * FROM users; DROP TABLE users' })

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Invalid SQL query')
    })

    it('should handle database errors', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      mockDbInstance.unsafe.mockRejectedValue(new Error('Database connection failed'))
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'queryDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'SELECT * FROM users' })

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Database query error')
    })
  })

  describe('executeDatabase tool', () => {
    it('should only be available to privileged users', async () => {
      // Regular user should not get executeDatabase
      const toolSpy1 = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockProps)

      const executeToolCall = toolSpy1.mock.calls.find(call => call[0] === 'executeDatabase')
      expect(executeToolCall).toBeUndefined()

      // Privileged user should get executeDatabase
      const mockServer2 = new McpServer({ name: 'test2', version: '1.0.0' })
      const toolSpy2 = vi.spyOn(mockServer2, 'tool')
      registerDatabaseTools(mockServer2, mockEnv as any, mockPrivilegedProps)

      const privilegedExecuteToolCall = toolSpy2.mock.calls.find(call => call[0] === 'executeDatabase')
      expect(privilegedExecuteToolCall).toBeDefined()
    })

    it('should execute write operations for privileged users', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockPrivilegedProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'executeDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'INSERT INTO users VALUES (1, \'test\')' })

      expect(result.content[0].type).toBe('text')
      expect(result.content[0].text).toContain('Write Operation Executed Successfully')
      expect(result.content[0].text).toContain('coleam00')
    })

    it('should execute read operations for privileged users', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockPrivilegedProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'executeDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'SELECT * FROM users' })

      expect(result.content[0].type).toBe('text')
      expect(result.content[0].text).toContain('Read Operation Executed Successfully')
    })

    it('should reject invalid SQL', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      registerDatabaseTools(mockServer, mockEnv as any, mockPrivilegedProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'executeDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'SELECT * FROM users; DROP TABLE users' })

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Invalid SQL statement')
    })

    it('should handle database errors', async () => {
      const toolSpy = vi.spyOn(mockServer, 'tool')
      mockDbInstance.unsafe.mockRejectedValue(new Error('Database connection failed'))
      registerDatabaseTools(mockServer, mockEnv as any, mockPrivilegedProps)

      const toolCall = toolSpy.mock.calls.find(call => call[0] === 'executeDatabase')
      const handler = toolCall![3] as Function

      const result = await handler({ sql: 'INSERT INTO users VALUES (1, \'test\')' })

      expect(result.content[0].isError).toBe(true)
      expect(result.content[0].text).toContain('Database execution error')
    })
  })
})
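The expectations above imply a SQL guard in the `src/database` module that distinguishes read from write statements and rejects statement stacking. A rough sketch of that kind of check is below; the names, messages, and exact rules are illustrative, not the repo's actual helpers:

```typescript
// Illustrative only: the real validation may use different names, messages, and rules.
const WRITE_KEYWORDS = /^\s*(INSERT|UPDATE|DELETE|DROP|ALTER|CREATE|TRUNCATE)\b/i

export function isWriteOperation(sql: string): boolean {
  return WRITE_KEYWORDS.test(sql)
}

export function validateSqlQuery(sql: string): { isValid: boolean; error?: string } {
  const trimmed = sql.trim()
  if (!trimmed) {
    return { isValid: false, error: 'SQL query cannot be empty' }
  }
  // Reject statement stacking such as "SELECT ...; DROP TABLE ..."
  if (trimmed.includes(';') && trimmed.indexOf(';') !== trimmed.length - 1) {
    return { isValid: false, error: 'Multiple SQL statements are not allowed' }
  }
  return { isValid: true }
}
```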
155  use-cases/mcp-server/tests/unit/utils/response-helpers.test.ts  Normal file
@@ -0,0 +1,155 @@
import { describe, it, expect } from 'vitest'
import { createSuccessResponse, createErrorResponse } from '../../../src/types'

describe('Response Helpers', () => {
  describe('createSuccessResponse', () => {
    it('should create success response with message only', () => {
      const response = createSuccessResponse('Operation completed')

      expect(response.content).toHaveLength(1)
      expect(response.content[0].type).toBe('text')
      expect(response.content[0].text).toBe('**Success**\n\nOperation completed')
      expect(response.content[0].isError).toBeUndefined()
    })

    it('should create success response with message and data', () => {
      const data = { id: 1, name: 'Test' }
      const response = createSuccessResponse('User created', data)

      expect(response.content).toHaveLength(1)
      expect(response.content[0].type).toBe('text')
      expect(response.content[0].text).toContain('**Success**')
      expect(response.content[0].text).toContain('User created')
      expect(response.content[0].text).toContain('**Result:**')
      expect(response.content[0].text).toContain(JSON.stringify(data, null, 2))
    })

    it('should handle null data', () => {
      const response = createSuccessResponse('Operation completed', null)

      expect(response.content[0].text).toContain('**Success**')
      expect(response.content[0].text).toContain('Operation completed')
      expect(response.content[0].text).toContain('**Result:**')
      expect(response.content[0].text).toContain('null')
    })

    it('should handle undefined data', () => {
      const response = createSuccessResponse('Operation completed', undefined)

      expect(response.content[0].text).toBe('**Success**\n\nOperation completed')
      expect(response.content[0].text).not.toContain('**Result:**')
    })

    it('should handle complex data objects', () => {
      const data = {
        users: [
          { id: 1, name: 'Alice' },
          { id: 2, name: 'Bob' }
        ],
        meta: {
          total: 2,
          page: 1
        }
      }

      const response = createSuccessResponse('Users retrieved', data)

      expect(response.content[0].text).toContain('**Success**')
      expect(response.content[0].text).toContain('Users retrieved')
      expect(response.content[0].text).toContain('Alice')
      expect(response.content[0].text).toContain('Bob')
      expect(response.content[0].text).toContain('total')
    })
  })

  describe('createErrorResponse', () => {
    it('should create error response with message only', () => {
      const response = createErrorResponse('Something went wrong')

      expect(response.content).toHaveLength(1)
      expect(response.content[0].type).toBe('text')
      expect(response.content[0].text).toBe('**Error**\n\nSomething went wrong')
      expect(response.content[0].isError).toBe(true)
    })

    it('should create error response with message and details', () => {
      const details = { code: 'VALIDATION_ERROR', field: 'email' }
      const response = createErrorResponse('Validation failed', details)

      expect(response.content).toHaveLength(1)
      expect(response.content[0].type).toBe('text')
      expect(response.content[0].text).toContain('**Error**')
      expect(response.content[0].text).toContain('Validation failed')
      expect(response.content[0].text).toContain('**Details:**')
      expect(response.content[0].text).toContain(JSON.stringify(details, null, 2))
      expect(response.content[0].isError).toBe(true)
    })

    it('should handle null details', () => {
      const response = createErrorResponse('Operation failed', null)

      expect(response.content[0].text).toContain('**Error**')
      expect(response.content[0].text).toContain('Operation failed')
      expect(response.content[0].text).toContain('**Details:**')
      expect(response.content[0].text).toContain('null')
    })

    it('should handle undefined details', () => {
      const response = createErrorResponse('Operation failed', undefined)

      expect(response.content[0].text).toBe('**Error**\n\nOperation failed')
      expect(response.content[0].text).not.toContain('**Details:**')
    })

    it('should handle error objects as details', () => {
      const error = new Error('Database connection failed')
      const response = createErrorResponse('Database error', error)

      expect(response.content[0].text).toContain('**Error**')
      expect(response.content[0].text).toContain('Database error')
      expect(response.content[0].text).toContain('**Details:**')
      expect(response.content[0].isError).toBe(true)
    })

    it('should handle complex error details', () => {
      const details = {
        error: 'AUTHENTICATION_FAILED',
        message: 'Invalid credentials',
        attempts: 3,
        nextRetryAt: new Date().toISOString()
      }

      const response = createErrorResponse('Authentication failed', details)

      expect(response.content[0].text).toContain('AUTHENTICATION_FAILED')
      expect(response.content[0].text).toContain('Invalid credentials')
      expect(response.content[0].text).toContain('attempts')
      expect(response.content[0].isError).toBe(true)
    })
  })

  describe('response format consistency', () => {
    it('should maintain consistent structure across response types', () => {
      const successResponse = createSuccessResponse('Success message')
      const errorResponse = createErrorResponse('Error message')

      // Both should have the same structure
      expect(successResponse.content).toHaveLength(1)
      expect(errorResponse.content).toHaveLength(1)

      expect(successResponse.content[0].type).toBe('text')
      expect(errorResponse.content[0].type).toBe('text')

      expect(typeof successResponse.content[0].text).toBe('string')
      expect(typeof errorResponse.content[0].text).toBe('string')
    })

    it('should distinguish between success and error responses', () => {
      const successResponse = createSuccessResponse('Success message')
      const errorResponse = createErrorResponse('Error message')

      expect(successResponse.content[0].isError).toBeUndefined()
      expect(errorResponse.content[0].isError).toBe(true)
    })
  })
})
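Read together, these assertions pin down the helpers' formatting fairly precisely. A sketch of `createSuccessResponse` / `createErrorResponse` consistent with them is below; the actual definitions sit in `src/types` and may differ in typing and in how the result block is rendered:

```typescript
// Sketch inferred from the assertions above; the real helpers in src/types may differ.
export function createSuccessResponse(message: string, data?: unknown) {
  let text = `**Success**\n\n${message}`
  if (data !== undefined) {
    text += `\n\n**Result:**\n${JSON.stringify(data, null, 2)}`
  }
  return { content: [{ type: 'text', text }] }
}

export function createErrorResponse(message: string, details?: unknown) {
  let text = `**Error**\n\n${message}`
  if (details !== undefined) {
    text += `\n\n**Details:**\n${JSON.stringify(details, null, 2)}`
  }
  return { content: [{ type: 'text', text, isError: true }] }
}
```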
46  use-cases/mcp-server/tsconfig.json  Normal file
@@ -0,0 +1,46 @@
{
  "compilerOptions": {
    /* Visit https://aka.ms/tsconfig.json to read more about this file */

    /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */
    "target": "es2021",
    /* Specify a set of bundled library declaration files that describe the target runtime environment. */
    "lib": ["es2021"],
    /* Specify what JSX code is generated. */
    "jsx": "react-jsx",

    /* Specify what module code is generated. */
    "module": "es2022",
    /* Specify how TypeScript looks up a file from a given module specifier. */
    "moduleResolution": "bundler",
    /* Specify type package names to be included without being referenced in a source file. */
    "types": [
      "./worker-configuration.d.ts",
      "node"
    ],
    /* Enable importing .json files */
    "resolveJsonModule": true,

    /* Allow JavaScript files to be a part of your program. Use the `checkJS` option to get errors from these files. */
    "allowJs": true,
    /* Enable error reporting in type-checked JavaScript files. */
    "checkJs": false,

    /* Disable emitting files from a compilation. */
    "noEmit": true,

    /* Ensure that each file can be safely transpiled without relying on other imports. */
    "isolatedModules": true,
    /* Allow 'import x from y' when a module doesn't have a default export. */
    "allowSyntheticDefaultImports": true,
    /* Ensure that casing is correct in imports. */
    "forceConsistentCasingInFileNames": true,

    /* Enable all strict type-checking options. */
    "strict": true,

    /* Skip type checking all .d.ts files. */
    "skipLibCheck": true
  },
  "exclude": ["example/**/*"]
}
9  use-cases/mcp-server/vitest.config.js  Normal file
@@ -0,0 +1,9 @@
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    environment: 'node',
    globals: true,
    setupFiles: ['./tests/setup.ts'],
  },
})
7365  use-cases/mcp-server/worker-configuration.d.ts  vendored  Normal file
File diff suppressed because it is too large
78  use-cases/mcp-server/wrangler.jsonc  Normal file
@@ -0,0 +1,78 @@
/**
 * For more details on how to configure Wrangler, refer to:
 * https://developers.cloudflare.com/workers/wrangler/configuration/
 */
{
  "$schema": "node_modules/wrangler/config-schema.json",
  "name": "my-mcp-server",
  "main": "src/index.ts",
  "compatibility_date": "2025-03-10",
  "compatibility_flags": [
    "nodejs_compat"
  ],
  "migrations": [
    {
      "new_sqlite_classes": [
        "MyMCP"
      ],
      "tag": "v1"
    }
  ],
  "durable_objects": {
    "bindings": [
      {
        "class_name": "MyMCP",
        "name": "MCP_OBJECT"
      }
    ]
  },
  "kv_namespaces": [
    {
      "binding": "OAUTH_KV",
      "id": "06998ca39ffb4273a10747065041347b"
    }
  ],
  "ai": {
    "binding": "AI"
  },
  "observability": {
    "enabled": true
  },
  "dev": {
    "port": 8792
  }
  /**
   * Smart Placement
   * Docs: https://developers.cloudflare.com/workers/configuration/smart-placement/#smart-placement
   */
  // "placement": { "mode": "smart" },

  /**
   * Bindings
   * Bindings allow your Worker to interact with resources on the Cloudflare Developer Platform, including
   * databases, object storage, AI inference, real-time communication and more.
   * https://developers.cloudflare.com/workers/runtime-apis/bindings/
   */

  /**
   * Environment Variables
   * https://developers.cloudflare.com/workers/wrangler/configuration/#environment-variables
   */
  // "vars": { "MY_VARIABLE": "production_value" },
  /**
   * Note: Use secrets to store sensitive data.
   * https://developers.cloudflare.com/workers/configuration/secrets/
   */

  /**
   * Static Assets
   * https://developers.cloudflare.com/workers/static-assets/binding/
   */
  // "assets": { "directory": "./public/", "binding": "ASSETS" },

  /**
   * Service Bindings (communicate between multiple Workers)
   * https://developers.cloudflare.com/workers/wrangler/configuration/#service-bindings
   */
  // "services": [{ "binding": "MY_SERVICE", "service": "my-service" }]
}
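The binding names configured above (MCP_OBJECT, OAUTH_KV, AI) are what Wrangler's type generation writes into worker-configuration.d.ts, so inside the Worker they surface roughly as sketched below. This is a simplified illustration assuming workers-types-style ambient declarations; the vendored worker-configuration.d.ts is authoritative, and the repo's actual src/index.ts routes requests through the GitHub OAuth provider rather than hitting the Durable Object directly. DATABASE_URL is an assumed secret, not shown in this diff.

```typescript
// Sketch of how the wrangler.jsonc bindings typically appear to Worker code.
interface Env {
  MCP_OBJECT: DurableObjectNamespace // "durable_objects" binding for the MyMCP class
  OAUTH_KV: KVNamespace              // "kv_namespaces" binding used by the OAuth flow
  AI: Ai                             // "ai" binding
  DATABASE_URL: string               // assumed secret, e.g. set via `wrangler secret put`
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Simplified: forward MCP traffic to the Durable Object bound as MCP_OBJECT.
    const id = env.MCP_OBJECT.idFromName('singleton')
    return env.MCP_OBJECT.get(id).fetch(request)
  },
}
```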