AI Engineering Wiki

MCP Server

Tools · 5 min

MCP (Model Context Protocol) connects AI assistants like Claude directly to your infrastructure.

What is MCP?

  • Open standard for AI-tool integration
  • Gives Claude access to your own servers
  • Read metrics, trigger workflows
  • Self-hosted, no cloud required
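Under the hood, MCP messages are JSON-RPC 2.0 exchanged between client and server (typically over stdio). A client asking a server which tools it exposes looks roughly like this (a sketch; exact fields vary by protocol version):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

The server answers with a `result` object describing each tool's name, description, and input schema, which is how Claude learns what it can call.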

Popular MCP Servers

| Server | Purpose |
| --- | --- |
| Docker MCP | Container management |
| Prometheus MCP | Read metrics |
| Grafana MCP | Dashboards, alerts |
| Proxmox MCP | VM management |
| n8n MCP | Trigger workflows |
| Ollama MCP | Local LLMs |

Setup Example

# claude_desktop_config.json

{
  "mcpServers": {
    "docker": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-docker"]
    },
    "prometheus": {
      "command": "npx", 
      "args": ["-y", "@modelcontextprotocol/server-prometheus"]
    },
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed"]
    }
  }
}
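Each server in the config above is just a process that speaks newline-delimited JSON-RPC on stdin/stdout. A minimal sketch of what a client sends during the handshake, in plain Python; in practice you would use the official MCP SDK rather than hand-rolling this, and `demo-client` plus the version strings are placeholders:

```python
import json

def jsonrpc_request(method, params=None, req_id=1):
    """Serialize a JSON-RPC 2.0 request as MCP's stdio transport
    expects it: one JSON object per line, written to the server's stdin."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg) + "\n"

# The handshake every MCP client performs before listing or calling tools.
# Protocol version and client info here are illustrative values.
init = jsonrpc_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "demo-client", "version": "0.1.0"},
})
list_tools = jsonrpc_request("tools/list", req_id=2)
```

Claude Desktop does exactly this launching and handshaking for you; the config file only tells it which commands to spawn.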

Use Cases

  • "Show me CPU usage of my Docker containers"
  • "Restart the n8n container"
  • "Trigger the backup workflow"
  • "What alerts do we have in Grafana?"
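Prompts like these ultimately become `tools/call` requests to the matching server. A sketch of what "Restart the n8n container" might look like on the wire; the tool name `restart_container` and its arguments are hypothetical and depend on what the Docker MCP server actually exposes:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "restart_container",
    "arguments": { "container": "n8n" }
  }
}
```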

Next step: ship workflows that stay operable

Use proven n8n patterns, templates and integrations for workflows that stay local, documented, and auditable.

Why AI Engineering
  • Local and self-hosted by default
  • Documented and auditable
  • Built from our own runtime
  • Made in Austria