Frequently Asked Questions
Common questions about Omnipotent, answered.
Is Omnipotent free?
Yes. The core runtime is free and open-source. Bring your own LLM API keys (or use local model fallback). Pro and Scale tiers are planned, adding managed memory, priority support, and advanced federation features. The Mana budget system tracks token costs so you always know what you are spending.
Do I need API keys?
API keys are optional. If you provide them (Claude, Gemini, Codex, Grok), the agents will use those models for generation and analysis. Without keys, the system falls back to local LLM support where available. Set your keys as environment variables before booting.
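As a sketch, keys can be exported before boot. The variable names below follow each provider's common convention and are assumptions; confirm the exact names Omnipotent reads in the setup docs.

```shell
# Assumed variable names, following each provider's usual convention;
# check the setup docs for the names Omnipotent actually reads.
export ANTHROPIC_API_KEY="sk-ant-..."   # Claude
export GEMINI_API_KEY="..."             # Gemini
export OPENAI_API_KEY="sk-..."          # Codex
export XAI_API_KEY="..."                # Grok

# Sanity-check that the keys are exported to child processes:
env | grep -E '^(ANTHROPIC|GEMINI|OPENAI|XAI)_API_KEY=' | wc -l
```

Only the keys you export are used; any provider without a key falls back as described above.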
What LLM models are supported?
Currently supported models:
- Claude (Anthropic) — Opus, Sonnet, Haiku
- Gemini (Google) — Pro, Flash
- Codex (OpenAI) — GPT-4, o-series
- Grok (xAI) — via API
The omnipotent switch-model command (planned) will allow runtime model switching per agent. Each agent can run on a different model provider.
How much RAM do I need?
Minimum: 5 GB RAM. This runs the Minimal mode with a reduced container set (Kernel + Bus + Soul Journal).
Recommended: 12 GB+ RAM. This unlocks the full stack including graph services (Neo4j, Graphiti), all 12 agents, and generous pgvector indexing.
Disk: ~2 GB for Docker images. Cold Memory storage grows over time based on session history.
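To decide between Minimal and the full stack, a quick check of total RAM (Linux shown; on macOS, sysctl -n hw.memsize prints bytes):

```shell
# Print total RAM in GB on Linux (compare against the 5 GB / 12 GB thresholds).
# On macOS, use: sysctl -n hw.memsize   (prints bytes)
if [ -r /proc/meminfo ]; then
  awk '/^MemTotal/ { printf "%.1f GB total RAM\n", $2 / 1048576 }' /proc/meminfo
fi
```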
Does my data ever leave my machine?
Only if you explicitly push it through the Airlock. The Diplomat agent syncs Alliance (team) data via git, but it has no access to your Sovereign (private) plans. All services (Redis, pgvector, Neo4j) run on the Docker internal network with no published ports by default. The only host-exposed port is the Kernel Dashboard.
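One way to verify the isolation claim from the host is to list which containers publish a port. This is a sketch: it assumes the stack is running under Docker and simply inspects whatever containers exist.

```shell
# Print containers that publish a port to the host. With the default
# config, only the Kernel Dashboard container should appear; Redis,
# pgvector, and Neo4j should not.
published_only() {
  # Reads "name<TAB>ports" lines and keeps rows with a host binding.
  awk -F'\t' '$2 ~ /0\.0\.0\.0|\[::\]/ { print $1 }'
}

if command -v docker >/dev/null 2>&1; then
  docker ps --format '{{.Names}}\t{{.Ports}}' | published_only
fi
```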
Is Omnipotent an operating system?
No. Omnipotent is an Agentic Runtime — a living workflow modeled after OS architecture. It runs on top of your existing OS using Docker containers. Think of it as a team of AI specialists that live on your hardware, with their own nervous system (Signal Bus), memory (Mnemosyne), and coordination protocols (Swarm).
How is this different from Copilot, Cursor, or Windsurf?
Those are tools — they live in your editor with no persistent memory. Omnipotent is a runtime with 12 specialized agent roles, tiered memory that compounds over time, and self-correcting protocols (including the Void Mirror retrospective). It does not just assist you — it works alongside you.

Can I use it with my team?
Yes. The Federation Protocol lets you share plans and collaborate through a git-based Commons repo, while keeping private work strictly local. Branch locking and worktree isolation prevent conflicts. See the Federation page for setup instructions.
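The worktree-isolation idea can be sketched with plain git (the repo path and branch name below are made up for the demo, not Omnipotent's own): each collaborator or agent checks the shared repo out into its own directory on its own branch, so edits never collide in one working copy.

```shell
# Plain-git sketch of worktree isolation; paths and branch names are
# illustrative only.
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "init commons"

# A second working directory on its own branch, sharing the same history:
git worktree add "$repo-forge" -b agent/forge
git worktree list
```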
What platforms are supported?
macOS (ARM64 and Intel), Linux (x86_64), and Windows via WSL2. Native Windows is not supported — use WSL2 with Docker Desktop. See the Releases page for the full platform matrix.
How do I report bugs or request features?
Open an issue on the GitHub Issues page. Include your OS, Docker version, and the output of omnipotent status for bug reports.
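A quick way to collect those details in one go, guarded so each command runs only if the tool is installed:

```shell
# Gather the environment details maintainers ask for in bug reports.
uname -a                              # OS and architecture
if command -v docker >/dev/null 2>&1; then
  docker --version
fi
if command -v omnipotent >/dev/null 2>&1; then
  omnipotent status
fi
```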

