System Architecture
Akmatori uses a secure 4-container architecture with network isolation to provide safe, scalable AI-powered incident automation.
Component Overview
Supported LLM Providers
Akmatori uses pi-mono as its unified LLM runtime, giving you the flexibility to choose any provider or run models on your own infrastructure.
OpenAI
GPT-5.4, GPT-5.3 Codex, GPT-5.2 Codex
Anthropic
Claude Opus 4.6, Claude Sonnet 4.6, Claude Haiku 4.5
Google
Gemini 2.5 Pro, Gemini 2.5 Flash
OpenRouter
Access to 200+ models from all providers
Custom/On-Prem
Any OpenAI-compatible endpoint
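Because any OpenAI-compatible endpoint works, pointing at a self-hosted model is only a base-URL change. A minimal Go sketch of building such a request; the endpoint URL, key, and model name are placeholders, not Akmatori defaults:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// chatRequest mirrors the OpenAI chat-completions payload that any
// compatible server (vLLM, Ollama, llama.cpp server, ...) accepts.
type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// buildRequest assembles a chat-completions request against baseURL.
// Against api.openai.com or an on-prem endpoint, only baseURL differs.
func buildRequest(baseURL, apiKey, model, prompt string) (*http.Request, error) {
	body, err := json.Marshal(chatRequest{
		Model:    model,
		Messages: []message{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return nil, err
	}
	req, err := http.NewRequest("POST", baseURL+"/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+apiKey)
	req.Header.Set("Content-Type", "application/json")
	return req, nil
}

func main() {
	// Hypothetical on-prem endpoint; swap in your own host and key.
	req, err := buildRequest("http://llm.internal:8000", "sk-local", "my-model", "ping")
	if err != nil {
		panic(err)
	}
	fmt.Println(req.URL.String())
}
```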
Security Design
Key Security Features
Agent Worker never sees database credentials
LLM API keys passed via WebSocket for each task
Three isolated Docker networks
API (UID 1000) and Agent (UID 1001) for file permission control
Docker Services
API: main Go backend handling incident management, skill orchestration, and the WebSocket server for the Agent Worker
Database: PostgreSQL instance storing incidents, skills, tools, and encrypted credentials
MCP Gateway: Model Context Protocol gateway that fetches credentials from the database and executes SSH/Zabbix operations
Agent Worker: runs pi-mono for multi-provider LLM inference (OpenAI, Anthropic, Google, OpenRouter, custom). Isolated, no DB access.
Network Isolation
Akmatori uses three separate Docker networks to enforce isolation between components:
frontend
External access for the UI and API proxy
api-internal
API → Database and MCP Gateway → Database connections
codex-network
Isolated network for Agent Worker → MCP Gateway traffic
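The network layout above maps onto a docker-compose structure roughly like this. Service names and options are an illustrative sketch based on this section, not Akmatori's shipped compose file:

```yaml
networks:
  frontend:       # external access for the UI and API proxy
  api-internal:   # API <-> Database, MCP Gateway <-> Database
  codex-network:  # Agent Worker <-> MCP Gateway only

services:
  api:
    user: "1000:1000"                      # UID 1000 per the security notes above
    networks: [frontend, api-internal]
  db:
    image: postgres:16
    networks: [api-internal]
  mcp-gateway:
    networks: [api-internal, codex-network]
  agent-worker:
    user: "1001:1001"                      # UID 1001; no route to the database
    networks: [codex-network]              # still needs egress to LLM providers
```

Because agent-worker is attached only to codex-network, it has no route to the database; credentials reach it solely per-task over the WebSocket.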
How It Works
Akmatori runs pi-mono as a multi-provider LLM runtime in an isolated container to execute AI-powered automation tasks. When an alert is received or a skill is triggered:
Alert Normalization
API container extracts key fields using source-specific adapters
Incident Creation
Records context, creates workspace with skill files and symlinks
Task Dispatch
API sends task + LLM credentials to Agent Worker via WebSocket
AI Execution
Agent Worker runs pi-mono in the incident workspace
Tool Calls
When the agent needs SSH/Zabbix access, MCP tools call the MCP Gateway
Credential Fetch
MCP Gateway retrieves credentials from database and executes the operation
Result Streaming
Output streams back through WebSocket to API for real-time updates
Completion
Results posted to Slack (if configured) and incident status updated