Documentation Index
Fetch the complete documentation index at: https://docs.gdilabs.io/llms.txt
Use this file to discover all available pages before exploring further.
What it is
- A platform for AI workforce + human governance, not a chat product.
- Every prompt becomes a Redis-backed job. Every step emits a typed event. Every escalation is logged. Every paid call is cost-tracked.
- The design centers on throughput, auditability, and multi-agent governance.
Components
- Mother AI — stateless ingest service (Rust/Axum). Authenticates requests, persists projects/workflows, enqueues jobs, streams events.
- Worker — orchestrator (Python/LangGraph). Pulls jobs, runs the context engine, dispatches through an L1–L4 agent hierarchy, emits typed events.
- Frontend — dashboard (Next.js / React). Renders live job streams, projects, workflows, and a 3D knowledge atlas.
- Ingest — knowledge-hub pipeline. Reads markdown, chunks, embeds, upserts into Qdrant.
- Knowledge Hub MCP — read-only stdio MCP server. Exposes the knowledge hub to any MCP client (Claude Desktop, Claude Code, agents, partner tools).
Core flow
- Client submits a prompt to Mother AI.
- Mother AI authenticates, persists project metadata, enqueues a job in Redis.
- Worker pulls the job, runs the context engine, classifies, dispatches through L1–L4.
- Worker emits typed `AgentEvent` JSON to a Redis stream.
- Frontend (or any client) subscribes via Mother AI's `GET /v1/jobs/:id/stream` and renders backlog + live events.
- Worker persists job state, audit trail, and escalation history.
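The flow above hinges on typed `AgentEvent` JSON on a Redis stream. The docs do not specify the event schema, so the sketch below invents illustrative field names purely to show the shape of a serialize/parse round trip between Worker and consumer:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class AgentEvent:
    # Field names here are assumptions, not the platform's actual schema.
    job_id: str
    event_type: str            # e.g. "classified", "delegated", "escalated"
    agent_level: str           # "L1".."L4"
    payload: dict = field(default_factory=dict)

    def to_json(self) -> str:
        # Serialized form is what would be XADD-ed to the Redis stream.
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, raw: str) -> "AgentEvent":
        # Consumer side: parse a stream entry back into a typed object.
        return cls(**json.loads(raw))

evt = AgentEvent("job-123", "delegated", "L1", {"target": "L4"})
restored = AgentEvent.from_json(evt.to_json())
```

Typed events like this are what make the audit trail replayable: every step is a self-describing record rather than free-form log text.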
Hierarchy
- L1 — Team Lead (free model): orchestration only — classify, plan, delegate.
- L2 — Managerial roles: Architect, Tech Lead, Release Manager, QA / Security / Adversarial leadership.
- L3 — Acceptance: free-model verifier; pass/fail with deltas, not rewrites.
- L4 — Executor: paid-model file-write specialist; emits net-new files and surgical edits.
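The docs name the four levels but not the dispatch API, so the routing table below is a hypothetical sketch of how a task might be mapped onto the hierarchy:

```python
def dispatch_level(task: str) -> str:
    """Pick the hierarchy level a task lands on (illustrative only).

    Task names and the routing rules are assumptions; the real
    orchestrator's classification logic is not documented here.
    """
    if task in {"classify", "plan", "delegate"}:
        return "L1"  # Team Lead: orchestration only, free model
    if task in {"architecture", "release", "qa", "security"}:
        return "L2"  # Managerial roles
    if task == "verify":
        return "L3"  # Acceptance: pass/fail with deltas, not rewrites
    return "L4"      # Executor: paid-model file writes
```

The split keeps paid-model spend confined to L4, while free models handle orchestration and verification.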
Models
- Free: Ollama-served models (local or remote), woken on demand.
- Paid: Claude and OpenAI-compatible endpoints. Health-ranked provider fallback automatically deprioritizes failing providers.
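Health-ranked fallback can be sketched as a sort over provider health. The ranking signal is not documented, so recent failure counts below are a stand-in assumption:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    # "recent_failures" is an assumed health signal, not the real metric.
    name: str
    recent_failures: int = 0

def rank_providers(providers: list[Provider]) -> list[Provider]:
    """Order providers so healthier ones come first; failing
    providers sink to the back and are tried last."""
    return sorted(providers, key=lambda p: p.recent_failures)

pool = [Provider("claude", 2), Provider("openai-compat", 0)]
best = rank_providers(pool)[0]
```

Because the ranking is recomputed from live health data, a provider that recovers naturally floats back to the front without manual intervention.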
Integrate
- Submit a job: `POST /v1/chat` returns a `job_id`. Subscribe via `GET /v1/jobs/:id/stream`.
- Resume an interrupted job: `POST /v1/jobs/:id/resume` for jobs paused on an `ask_user` interrupt.
- Drive retrieval externally: install the Knowledge Hub MCP server and add it to Claude Desktop or Claude Code.
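The endpoint paths above can be assembled into client URLs as follows. The paths come from this document; the base URL is a placeholder for your own deployment:

```python
# Assumption: BASE_URL is wherever your Mother AI instance is hosted.
BASE_URL = "https://mother-ai.example.com"

def chat_url() -> str:
    return f"{BASE_URL}/v1/chat"                  # POST, returns a job_id

def stream_url(job_id: str) -> str:
    return f"{BASE_URL}/v1/jobs/{job_id}/stream"  # GET, backlog + live events

def resume_url(job_id: str) -> str:
    return f"{BASE_URL}/v1/jobs/{job_id}/resume"  # POST, after ask_user pause
```

A typical client loop POSTs to `chat_url()`, reads the returned `job_id`, then holds `stream_url(job_id)` open until the job completes or pauses on `ask_user`.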