
Architecture

HELIX Mission Control is a containerized application with six services communicating over an internal Docker network.

System Overview

                  ┌──────────────────────────────────────────┐
                  │              Docker Network              │
                  │                                          │
  User ──HTTPS──► │  Caddy ──► Frontend (Next.js :3000)     │
                  │    │                                     │
                  │    └────► Backend (FastAPI :8000)        │
                  │              │           │               │
                  │              ▼           ▼               │
                  │         PostgreSQL    Redis              │
                  │           (:5432)    (:6379)             │
                  │              │                           │
                  │              │    WebSocket              │
                  │    Backend ◄─┼────────────► Gateway      │
                  │              │            (:18789)       │
                  │              │               │           │
                  │              │          AI Provider      │
                  │              │         (external)        │
                  └──────────────────────────────────────────┘

Services

Caddy (Reverse Proxy)

  • Ports: 80 (HTTP, redirected to 443) and 443 (HTTPS)
  • Role: TLS termination, reverse proxy, automatic Let's Encrypt certificates
  • Routes / to the Frontend, /api/ to the Backend
  • Optional — can be skipped if using an external proxy (SKIP_PROXY=true)

Frontend (Next.js 14)

  • Port: 3000 (internal)
  • Stack: Next.js 14 with App Router, TypeScript, Tailwind CSS
  • Role: Dashboard UI — login, Kanban boards, agent management, settings
  • Server-side rendering for initial page load, client-side navigation thereafter
  • Calls the Backend API via NEXT_PUBLIC_API_URL

Backend (FastAPI)

  • Port: 8000 (internal)
  • Stack: Python 3.12, FastAPI, SQLAlchemy (async), Alembic
  • Role: REST API, business logic, authentication, database operations
  • Manages WebSocket connection to the Gateway for agent execution
  • JWT authentication with bcrypt password hashing
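The Backend's auth primitives can be sketched as follows. The real service uses bcrypt and a JWT library; this dependency-free stand-in substitutes hashlib.pbkdf2_hmac for bcrypt and builds the HS256 JWT by hand with stdlib hmac/base64. All names (SECRET_KEY, issue_token) are illustrative, not taken from the HELIX source.

```python
# Stand-ins for the Backend's auth primitives: salted password hashing
# and HS256 JWT issuance. pbkdf2_hmac replaces bcrypt here only so the
# sketch runs without third-party packages.
import base64, hashlib, hmac, json, os, time

SECRET_KEY = b"change-me"  # loaded from .env in a real deployment


def hash_password(plain: str) -> bytes:
    """Hash a password with a per-user random salt (salt is prepended)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", plain.encode(), salt, 100_000)
    return salt + digest


def verify_password(plain: str, stored: bytes) -> bool:
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.pbkdf2_hmac("sha256", plain.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)


def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def issue_token(user_id: str, ttl_seconds: int = 1800) -> str:
    """Create a signed header.payload.signature JWT with an expiry claim."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = _b64url(json.dumps(
        {"sub": user_id, "exp": int(time.time()) + ttl_seconds}).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = _b64url(hmac.new(SECRET_KEY, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"
```

In production the secret comes from configuration, never from source code, and a vetted library handles claim validation.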

Gateway (OpenClaw)

  • Port: 18789 (internal, WebSocket)
  • Role: AI agent orchestration engine
  • Manages agent workspaces, executes tasks against AI models
  • WebSocket protocol with JSON-RPC-style messages
  • Handles agent identity (SOUL.md, MEMORY.md, skills)
  • Optional Telegram bot integration
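The JSON-RPC-style frames on the Gateway WebSocket might look like the sketch below. The method name ("task.execute") and field layout are assumptions for illustration; the actual protocol is defined by OpenClaw.

```python
# Illustrative shape of JSON-RPC-style frames exchanged between the
# Backend and the Gateway over the WebSocket.
import itertools
import json

_ids = itertools.count(1)  # monotonically increasing request ids


def make_request(method: str, params: dict) -> str:
    """Serialize one request frame for the WebSocket."""
    return json.dumps({"jsonrpc": "2.0", "id": next(_ids),
                       "method": method, "params": params})


def parse_frame(raw: str) -> dict:
    """Classify an incoming frame as an event/request or a response."""
    frame = json.loads(raw)
    if "method" in frame:  # request or event pushed by the peer
        return {"kind": "event", "method": frame["method"],
                "params": frame.get("params", {})}
    return {"kind": "response", "id": frame["id"],
            "result": frame.get("result")}


frame = make_request("task.execute", {"agent_id": "a-1", "task_id": "t-9"})
```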

PostgreSQL 16

  • Port: 5432 (internal)
  • Role: Primary data store
  • Stores users, organizations, departments, boards, agents, tasks, comments, skills, notifications, activity logs
  • Persistent volume: pgdata
  • Health check: pg_isready

Redis 7

  • Port: 6379 (internal)
  • Role: Caching and session management
  • Used by the gateway for agent session state
  • Health check: redis-cli ping
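The session-state pattern the gateway uses with Redis (SETEX/GET with a TTL) can be mimicked in memory. A real deployment would call redis-py against the Redis service; this stand-in only reproduces the expiry semantics so the pattern is runnable here. Key names are hypothetical.

```python
# In-memory stand-in for Redis session storage: SETEX writes a value
# with a time-to-live, GET returns None once the TTL has elapsed.
import time


class SessionStore:
    def __init__(self):
        self._data = {}  # key -> (value, expires_at)

    def setex(self, key: str, ttl_seconds: int, value: str) -> None:
        self._data[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key: str):
        item = self._data.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:  # expired, as Redis would report
            del self._data[key]
            return None
        return value


store = SessionStore()
store.setex("agent:a-1:session", 3600, '{"status": "running"}')
```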

Data Flow

Task Execution

1. User creates task ──► Backend API
2. Backend saves task ──► PostgreSQL
3. User clicks Execute (or auto-dispatch)
4. Backend sends task ──► Gateway (WebSocket)
5. Gateway builds prompt:
   - SOUL.md (system prompt)
   - MEMORY.md (learned rules)
   - Active skills
   - Task description + comments
6. Gateway calls AI provider ──► External API
7. AI model returns result
8. Gateway sends result ──► Backend (WebSocket event)
9. Backend updates task ──► PostgreSQL
10. Frontend polls/receives update ──► User sees result
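Step 5 above (prompt assembly) can be sketched as a simple concatenation. The file names match the docs; the section headers and ordering are assumptions about how the Gateway actually formats the prompt.

```python
# Sketch of the Gateway's prompt assembly: agent identity (SOUL.md),
# learned rules (MEMORY.md), active skills, then the task thread.
def build_prompt(soul: str, memory: str, skills: list[str],
                 task_description: str, comments: list[str]) -> str:
    sections = [soul]  # SOUL.md acts as the system prompt
    if memory.strip():
        sections.append("## Learned rules\n" + memory)
    if skills:
        sections.append("## Skills\n" + "\n\n".join(skills))
    sections.append("## Task\n" + task_description)
    if comments:
        sections.append("## Comments\n" + "\n".join(comments))
    return "\n\n".join(sections)


prompt = build_prompt("You are Ada.", "Prefer short answers.",
                      ["Skill: summarize"], "Summarize the report.", [])
```

Empty sections are dropped so a new agent with no memory or comments still gets a clean prompt.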

Comment with @Mention

1. User posts comment with @AgentName
2. Backend resolves mentions ──► identifies agent
3. Backend saves comment ──► PostgreSQL
4. Backend triggers agent ──► Gateway
5. Agent reads comment context + task history
6. Agent responds ──► new comment created
7. Notification sent to user
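Step 2 above (mention resolution) amounts to scanning the comment body for @AgentName tokens and matching them against known agents. The regex and the case-insensitive lookup are assumptions about the real resolver.

```python
# Sketch of @mention resolution: extract @Name tokens from a comment
# body and map them to agent ids, deduplicating repeats.
import re

MENTION_RE = re.compile(r"@([A-Za-z0-9_-]+)")


def resolve_mentions(body: str, agents: dict[str, str]) -> list[str]:
    """Return ids of mentioned agents.

    agents maps lowercase agent name -> agent id; unknown names are ignored.
    """
    ids = []
    for name in MENTION_RE.findall(body):
        agent_id = agents.get(name.lower())
        if agent_id and agent_id not in ids:
            ids.append(agent_id)
    return ids


mentioned = resolve_mentions("@Ada please review, cc @ada @Bob",
                             {"ada": "agent-1"})
```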

Database Schema

Core Tables

Table            Purpose
organizations    Multi-tenant organization data
users            User accounts with bcrypt passwords
departments      Organizational departments
boards           Kanban boards within departments
agents           AI agent configurations
tasks            Task data with status tracking
comments         Task comment threads
activity_logs    Audit trail of all actions
notifications    User notification queue

Agent Intelligence Tables

Table              Purpose
ai_models          BYOK model configurations
skills             Custom skill documents
agent_skills       Agent-to-skill assignments
skill_attachments  Files attached to skills

System Tables

Table                  Purpose
gateways               Gateway connection configurations
organization_settings  Per-org settings
board_permissions      Board-level access control
token_usage            AI token consumption tracking
onboarding_state       New org setup progress
task_attachments       Files attached to tasks

File Storage

/home/helix/
├── helix-mission-control/
│   ├── .env                        # Configuration
│   ├── docker-compose.yml          # Service definitions
│   ├── backups/                    # Database backups
│   └── data/
│       └── uploads/                # User file uploads
└── .openclaw/
    ├── workspaces/                 # Agent workspaces
    │   └── {agent-id}/
    │       ├── SOUL.md             # Agent identity
    │       ├── MEMORY.md           # Learned rules
    │       └── memory/             # Session notes
    ├── identity/                   # Gateway identity
    └── configs/                    # Gateway configs
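The agent-workspace layout above can be expressed as path helpers. The root constant and helper name are illustrative, not part of the HELIX codebase.

```python
# Path helpers mirroring the documented agent workspace layout under
# /home/helix/.openclaw/workspaces/{agent-id}/.
from pathlib import Path

OPENCLAW_ROOT = Path("/home/helix/.openclaw")


def workspace_paths(agent_id: str) -> dict[str, Path]:
    ws = OPENCLAW_ROOT / "workspaces" / agent_id
    return {
        "soul": ws / "SOUL.md",      # agent identity / system prompt
        "memory": ws / "MEMORY.md",  # learned rules
        "sessions": ws / "memory",   # per-session notes
    }


paths = workspace_paths("agent-1")
```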

Network

All services communicate over the helix-network Docker bridge network. Only Caddy (ports 80/443) is exposed to the host. All other services are internal.

networks:
  helix-network:
    driver: bridge

Volumes

Volume        Service     Purpose
pgdata        PostgreSQL  Database files
uploads-data  Backend     User uploads
caddy-data    Caddy       SSL certificates
caddy-config  Caddy       Caddy configuration

Built by HelixNode