How It Works

This document describes how the agent-hub repository is structured and how its pieces interact at runtime.

Overview

Agent Hub is a configuration monorepo. It does not run AI models itself. It manages what those models know, what tools they can call, and how secrets reach those tools.

Four layers make up the system:

Layer            What it does                             Key files
---------------  ---------------------------------------  ---------------------------------------------------------
Dev environment  Hermetic Nix shell, tooling, linting     flake.nix, justfile
Agent config     Prompts, instructions, agent registry    .github/prompts/, .github/instructions/, .github/agents/
MCP gateway      Routes tool calls to Docker containers   arion-compose.nix, docker/mcp-gateway/, .mcp.json
Secrets          Bitwarden vault → .env → containers      .env, scripts/bw-env-sync.sh

Dev environment

The Nix flake at flake.nix defines a dev shell with all required tooling: formatters, linters, Arion, Zensical, and the just task runner. direnv activates the shell automatically on directory entry. No global tooling is required or installed.
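
The only per-checkout step direnv needs is a one-time trust grant, and the .envrc itself is typically just the flake hook. A sketch assuming the standard nix-direnv integration (the repository's actual .envrc may differ):

```shell
# .envrc (sketch; assumes nix-direnv is installed)
use flake
```

After cloning, run `direnv allow` once; from then on the dev shell loads automatically whenever you enter the directory.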

The justfile is the primary entry point. Every operation — formatting, linting, starting the MCP stack, syncing AI configs — runs through just.
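
The recipes are thin wrappers over the scripts and tools described below. The bodies here are illustrative, not the repository's actual justfile contents:

```just
# Sketch of the recipe shape only; see the real justfile for the actual definitions.
sync-ai:
    ./scripts/sync-ai-contexts.sh

mcp-up:
    arion up -d    # assumption: the stack is started roughly like this via Arion
```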

Agent configuration

Agent prompts and instructions live under .github/:

.github/
  prompts/
    ezra.prompt.md               # primary agent identity
    subagent-*.prompt.md         # sub-agent overlays
  instructions/
    identity.instructions.md     # enforces agent identity globally
    subagent-*.instructions.md   # sub-agent scoped constraints
  agents/
    *.agent.md                   # OpenCode agent definitions
  skills/
    */SKILL.md                   # domain knowledge skill files

just sync-ai reads these files and regenerates:

  • opencode.json – OpenCode config projection
  • docker/openwebui/agents.json – OpenWebUI agent definitions
  • docker/dify/agents.json – Dify agent definitions

VS Code GitHub Copilot reads prompts and instructions directly from .github/ via its built-in agent customization support. No sync step needed for VS Code.

just sync-ai runs automatically inside the Nix shell before every just command (via scripts/sync-ai-contexts.sh). just ai-parity checks that all clients are in sync and runs as part of just lint.
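
Conceptually, a parity check like this reduces to comparing each committed projection against a fresh render of the sources. A minimal Python sketch of that idea (not the actual just ai-parity logic; file names are stand-ins):

```python
import hashlib
import tempfile
from pathlib import Path

def digest(path: Path) -> str:
    """Content hash of one artifact."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def in_sync(committed: Path, regenerated: Path) -> bool:
    """True when the committed projection matches a fresh render of the sources."""
    return digest(committed) == digest(regenerated)

# Toy demo with temp files standing in for a committed and a freshly
# rendered opencode.json:
with tempfile.TemporaryDirectory() as d:
    committed = Path(d, "committed.json")
    fresh = Path(d, "fresh.json")
    committed.write_text('{"agent": "ezra"}')
    fresh.write_text('{"agent": "ezra"}')
    print(in_sync(committed, fresh))  # → True
```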

MCP gateway

The MCP gateway is a local Docker stack managed by Arion — a Nix-native Docker Compose wrapper. It runs on the host and exposes tool endpoints that AI clients call via HTTP. Traefik handles routing.

AI client (VS Code / OpenCode / OpenWebUI)
  │  HTTP  localhost:8811/mcp
  ▼
Traefik (ingress, port 8811)
  ├──► /mcp           mcp-gateway           (unified MCP catalog)
  └──► /mcp/nixos     mcp-nixos-http        (NixOS package and option search)

The unified gateway loads its server list from docker/mcp-gateway/registry/all.yaml. Server definitions — images, secrets, and startup commands — live in docker/mcp-gateway/catalogs/agent-hub.yaml.
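
The two files split responsibilities: the registry names which servers are enabled, the catalog defines how each one runs. Roughly, with entirely illustrative field names and image — the real schema is whatever docker/mcp-gateway expects:

```yaml
# catalogs/agent-hub.yaml (sketch): one entry per tool server
registry:
  nixos-search:
    image: ghcr.io/example/mcp-nixos:latest   # hypothetical image reference
    secrets: []                               # names of env vars injected at start
    command: ["serve", "--http"]              # hypothetical startup command
```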

.mcp.json is the client-facing config pointing to the unified gateway route. VS Code reads it via .vscode/mcp.json, which is a symlink. OpenCode reads its projection in opencode.json.
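
The client side is deliberately small: it only names the gateway endpoint. .mcp.json is generated by the sync flow, but its shape is on the order of the following (illustrative, using the conventional mcpServers key):

```json
{
  "mcpServers": {
    "agent-hub": {
      "type": "http",
      "url": "http://localhost:8811/mcp"
    }
  }
}
```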

just mcp-up starts the baseline stack. Optional bundles extend it (OpenWebUI, llama.cpp, Dify):

Bundle command     Adds
-----------------  --------------------------------------------
just mcp-up-ai     OpenWebUI, llama.cpp, whisper.cpp, LLM proxy
just mcp-up-dify   Dify, Postgres, Redis, MinIO
just stack-up-all  Baseline plus all optional bundles

Secrets

All secrets live in Bitwarden. The local .env file is the working copy and is gitignored.

On first setup, or after vault changes:

just bw-env-hydrate

This pulls all integration secrets from Bitwarden into .env. The Nix dev shell sources .env on startup, and Arion passes the relevant variables into containers at runtime.
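
The hydrated file is a flat KEY=VALUE list, so everything downstream (the dev shell, Arion) effectively performs a parse like the one below. A rough sketch of that contract, not the actual scripts/bw-env-sync.sh logic:

```python
def parse_env(text: str) -> dict[str, str]:
    """Parse a minimal .env: KEY=VALUE lines; blanks and '#' comments ignored."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

sample = """
# synced from Bitwarden by `just bw-env-hydrate`
TRAEFIK_HTTP_PORT=8811
GITHUB_TOKEN="ghp_example"
"""
print(parse_env(sample))  # → {'TRAEFIK_HTTP_PORT': '8811', 'GITHUB_TOKEN': 'ghp_example'}
```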

See Secrets and Bitwarden for per-service credential workflows.

Sync flow

All configuration derives from committed files. The sync runs automatically but can also be triggered manually:

just sync-ai

What it does:

  1. Renders .mcp.json from the Nix flake app mcp-config
  2. Creates symlinks: GEMINI.md, .github/copilot-instructions.md, .opencode/instructions.md, .opencode/agents, .vscode/mcp.json
  3. Updates opencode.json with current MCP config and instruction paths
  4. Runs scripts/export-ai-platform-artifacts.sh to generate docker/openwebui/agents.json and docker/dify/agents.json
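
Because the sync fires before every just command, each step has to be safe to re-run; the symlink step in particular must replace stale links rather than fail on them. A small Python sketch of that idempotent operation (illustrative; the real sync script is shell):

```python
import tempfile
from pathlib import Path

def ensure_symlink(link: Path, target: Path) -> None:
    """Idempotently point `link` at `target`, replacing any stale link or file."""
    link.parent.mkdir(parents=True, exist_ok=True)
    if link.is_symlink() or link.exists():
        link.unlink()
    link.symlink_to(target)

# Toy demo: a stand-in for .vscode/mcp.json → .mcp.json
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / ".mcp.json").write_text("{}")
    ensure_symlink(root / ".vscode" / "mcp.json", root / ".mcp.json")
    ensure_symlink(root / ".vscode" / "mcp.json", root / ".mcp.json")  # safe to repeat
    print((root / ".vscode" / "mcp.json").is_symlink())  # → True
```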

HTTP routes

All HTTP services are fronted by Traefik on port 8811 (override with TRAEFIK_HTTP_PORT in .env).
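
For example, to move the stack off 8811, set the variable in the gitignored .env before starting the stack (9000 here is an arbitrary example value):

```shell
# .env
TRAEFIK_HTTP_PORT=9000
```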

Path            Service
--------------  ----------------------------------
/mcp/*          MCP namespace gateway
/openwebui      OpenWebUI
/llm            Shared OpenAI-compatible LLM proxy
/whisper        whisper.cpp speech-to-text
/dify           Dify web UI
/dify-api       Dify API
/minio          MinIO object storage
/minio-console  MinIO console
/traefik        Traefik dashboard