# Deployment Guide

This guide covers deploying `mcp-scan` with all advanced detectors enabled.
## Quick Start

### 1. Verify Dependencies
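A minimal sketch of a verification pass, assuming the binary names match the install commands in step 2 (`pyright`, `gopls`, `typescript-language-server`, `codeql`, `ollama`):

```bash
# Report which external tools are on PATH; the tool names are
# assumptions based on the install commands in step 2.
for tool in pyright gopls typescript-language-server codeql ollama; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok:      $tool"
  else
    echo "missing: $tool"
  fi
done
```

Anything reported as `missing` can be installed in the next step.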
### 2. Install Missing Dependencies
```bash
# Language servers
npm install -g pyright typescript typescript-language-server
go install golang.org/x/tools/gopls@latest

# CodeQL (macOS)
brew install codeql

# ML training dependencies
pip install datasets scikit-learn numpy
```
### 3. Configure LLM Provider

Choose one:
```bash
# Option A: Ollama (local)
ollama serve &
ollama pull llama3.2:3b

# Option B: Claude
export ANTHROPIC_API_KEY=sk-ant-...

# Option C: OpenAI
export OPENAI_API_KEY=sk-...
```
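To sanity-check which provider will be picked up, a hedged sketch (the claude → openai → ollama precedence shown here is an assumption, not documented behavior):

```bash
# Assumed precedence: claude, then openai, then a local Ollama endpoint.
if [ -n "${ANTHROPIC_API_KEY:-}" ]; then
  echo "provider: claude"
elif [ -n "${OPENAI_API_KEY:-}" ]; then
  echo "provider: openai"
elif curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "provider: ollama"
else
  echo "no LLM provider configured" >&2
fi
```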
### 4. Train ML Model (Optional)
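The training entrypoint is project-specific; a hedged sketch, assuming a hypothetical `scripts/train.py` that writes the weights file referenced by `ml.model_path` in the production config below:

```bash
# Hypothetical training invocation -- the script name and --output flag
# are illustrative assumptions, not documented mcp-scan interfaces.
if [ -f scripts/train.py ]; then
  python3 scripts/train.py --output ./ml_weights.json
else
  echo "scripts/train.py not present; skipping optional ML training" >&2
fi
```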
### 5. Run Scan
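A full scan with every detector from this guide enabled; the flags mirror the CI examples later on, and the guard keeps the snippet safe on machines where `mcp-scan` is not yet installed:

```bash
# Deep scan with LLM + ML detectors, emitting SARIF for CI upload.
if command -v mcp-scan >/dev/null 2>&1; then
  mcp-scan scan . --llm --ml --output sarif --fail-on high > results.sarif
else
  echo "mcp-scan not on PATH; install with:" >&2
  echo "  go install github.com/mcphub/mcp-scan/cmd/mcp-scan@latest" >&2
fi
```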
## Docker Deployment

### Using Docker Compose

Create `docker-compose.yml`:
```yaml
version: '3.8'

services:
  mcp-scan:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - ./:/workspace
      - mcp-scan-cache:/root/.cache/mcp-scan
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - MCP_SCAN_LLM_URL=http://ollama:11434
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]

  lsp-servers:
    build:
      context: ./docker
      dockerfile: lsp.Dockerfile
    volumes:
      - ./:/workspace

volumes:
  ollama-data:
  mcp-scan-cache:
```
### Build and Run
```bash
# Build images
docker-compose build

# Start services
docker-compose up -d

# Run scan
docker-compose exec mcp-scan mcp-scan scan /workspace --llm
```
## CI/CD Integration

### GitHub Actions
```yaml
name: MCP Security Scan

on:
  push:
    branches: [main]
  pull_request:

jobs:
  scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Go
        uses: actions/setup-go@v5
        with:
          go-version: '1.22'

      - name: Install mcp-scan
        run: go install github.com/mcphub/mcp-scan/cmd/mcp-scan@latest

      - name: Run scan
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
        run: |
          mcp-scan scan . \
            --llm --llm-provider claude \
            --ml \
            --output sarif \
            --fail-on high \
            > results.sarif

      - name: Upload SARIF
        uses: github/codeql-action/upload-sarif@v3
        with:
          sarif_file: results.sarif
```
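Before wiring up the upload step, it can help to spot-check the SARIF locally. The snippet below builds a minimal synthetic file (the result contents are illustrative) and counts results the way the upload action will see them, under SARIF's `runs[].results[]` layout:

```bash
# Minimal synthetic SARIF: one run with two results.
cat > /tmp/results.sarif <<'EOF'
{"version":"2.1.0","runs":[{"results":[{"ruleId":"MCP001"},{"ruleId":"MCP002"}]}]}
EOF

# Count findings in the first run.
jq '.runs[0].results | length' /tmp/results.sarif
# -> 2
```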
### GitLab CI
```yaml
mcp-scan:
  image: golang:1.22
  stage: test
  script:
    - go install github.com/mcphub/mcp-scan/cmd/mcp-scan@latest
    - mcp-scan scan . --llm --ml --fail-on high
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
```
## Production Configuration

### Recommended Settings

Create `.mcp-scan.yaml`:
```yaml
version: "1"
mode: deep
timeout: 30m
fail_on: high
workers: 0  # Auto-detect CPU count

include:
  - "**/*.py"
  - "**/*.ts"
  - "**/*.js"
  - "**/*.go"

exclude:
  - "node_modules/**"
  - "venv/**"
  - ".venv/**"
  - "vendor/**"
  - "dist/**"
  - "build/**"
  - "**/*.test.*"
  - "**/*_test.*"

llm:
  enabled: true
  provider: "claude"  # Or use auto-detect
  model: "claude-3-haiku-20240307"
  threshold: 0.7
  timeout: "60s"
  max_length: 5000

codeql:
  enabled: true
  timeout: "30m"
  queries_dir: "./codeql-queries/mcp"
  min_severity: 5.0
  languages: ["python", "javascript", "go"]

lsp:
  enabled: true
  languages: ["python", "typescript", "go"]

ml:
  enabled: true
  threshold: 0.3
  model_path: "./ml_weights.json"

output:
  format: sarif
  include_trace: true
  redact_snippets: false
```
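A cheap pre-commit sanity check is to confirm the top-level keys from the example above are present. This is pure `grep`, so it checks presence only, not YAML validity:

```bash
# Verify expected top-level keys exist in the config file, if it exists.
cfg=.mcp-scan.yaml
if [ -f "$cfg" ]; then
  for key in version mode timeout fail_on llm codeql lsp ml output; do
    grep -q "^${key}:" "$cfg" || echo "missing top-level key: $key"
  done
fi
```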
### Environment Variables
```bash
# Required for Claude
export ANTHROPIC_API_KEY=sk-ant-...

# Required for OpenAI
export OPENAI_API_KEY=sk-...

# Optional overrides
export MCP_SCAN_LLM_PROVIDER=claude
export MCP_SCAN_LLM_THRESHOLD=0.8
export MCP_SCAN_CODEQL_TIMEOUT=1h
```
## Monitoring and Observability

### Logging

`mcp-scan` outputs structured logs:
```bash
# JSON logs
mcp-scan scan . --llm 2>&1 | jq .

# Filter by level
mcp-scan scan . --llm 2>&1 | jq 'select(.level == "error")'
```
### Metrics

Track these scan metrics:
| Metric | Description |
|---|---|
| `scan_duration_seconds` | Total scan time |
| `files_scanned` | Number of files analyzed |
| `findings_count` | Total vulnerabilities found |
| `findings_by_severity` | Breakdown by severity |
| `llm_requests` | LLM API calls made |
| `llm_latency_ms` | LLM response time |
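A breakdown like `findings_by_severity` can be derived from JSON output with `jq`. The field names used below (`severity` on each finding) are assumptions about the output schema, so the sketch runs against inline sample data:

```bash
# Group sample findings by severity and count each bucket.
echo '[{"severity":"high"},{"severity":"low"},{"severity":"high"}]' \
  | jq -c 'group_by(.severity) | map({(.[0].severity): length}) | add'
# -> {"high":2,"low":1}
```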
### Healthcheck
```bash
# Check dependencies
./scripts/check-lsp.sh

# Verify LLM connectivity
curl -s http://localhost:11434/api/tags | jq .

# Test scan
mcp-scan scan ./testdata/fixtures --llm --output json | jq '.summary'
```
## Troubleshooting

### LLM Not Available

Error: `LLM detector not enabled`

Solution:

1. Check that the provider is running: `curl http://localhost:11434/api/tags`
2. Verify the API key is set: `echo $ANTHROPIC_API_KEY`
3. Check network connectivity
### CodeQL Timeout

Error: `CodeQL analysis timed out`

Solution:

1. Increase the timeout: `--codeql-timeout 1h`
2. Limit languages: `--codeql-languages python`
3. Scan a smaller subset of the codebase
### LSP Server Not Found

Error: `pyright-langserver not found`

Solution:

1. Install it: `npm install -g pyright`
2. Check your PATH: `which pyright-langserver`
3. Use a Docker image with the servers pre-installed