Pricing
Open core. No surprises.
Anchor is free and open source under Apache 2.0. Enterprise adds a dedicated quota, SLA, and direct support from the team.
Anchor core is fully open source — every feature in the Free tier is on GitHub today, with no usage cap on self-hosted deployments.
Apache 2.0
Open Source
Free forever
The full Anchor runtime. Self-host in 5 minutes with Docker Compose. No credit card, no usage limits, no lock-in.
- OpenAI-compatible proxy (FastAPI + LiteLLM)
- Persistent sessions via Valkey Streams
- Exact replay engine — swap models on any past run
- Zero-cost simulate mode
- NVIDIA NIM routing with PII detection
- Anomaly detection: cycle, runaway, cost spike, tool loop
- Prometheus metrics + Grafana dashboards
- OpenTelemetry spans (GenAI semantic conventions)
- Local Postgres API key management
- Admin dashboard
- Community support (GitHub Issues)
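The "self-host in 5 minutes with Docker Compose" claim above can be sketched as a minimal compose file. Everything here is an illustrative assumption — the image name, ports, and environment variables are placeholders, not Anchor's actual manifest; check the GitHub repository for the real one.

```yaml
# Hypothetical compose sketch for a self-hosted deployment.
# Image names, ports, and variables are assumptions for illustration.
services:
  anchor:
    image: ghcr.io/example/anchor:latest   # placeholder image name
    ports:
      - "8000:8000"                        # OpenAI-compatible proxy
    environment:
      VALKEY_URL: redis://valkey:6379      # persistent sessions (Valkey Streams)
      DATABASE_URL: postgres://anchor:anchor@postgres:5432/anchor
    depends_on: [valkey, postgres]
  valkey:
    image: valkey/valkey:8
  postgres:
    image: postgres:16                     # local API key management
    environment:
      POSTGRES_USER: anchor
      POSTGRES_PASSWORD: anchor
      POSTGRES_DB: anchor
```

The wiring mirrors the Free-tier feature list: one proxy container, Valkey for session streams, Postgres for API keys.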
Talk to us
Enterprise
Custom (discuss quota)
For teams that need a dedicated quota, SLA guarantees, and a support line directly to the Anchor team.
- Everything in Open Source
- Custom step quota tailored to your workload
- Dedicated deployment support & onboarding
- SLA with guaranteed response times
- Smart bypass — circuit breaker + provider failover
- LLM anomaly reflection (GPT-4 root-cause diagnosis)
- Semantic context retrieval over session history
- Per-account routing rules (DB-backed REST API)
- SSO / Supabase auth
- Multi-tenant billing integration
- Direct access to the Anchor engineering team
No SDK changes required. Works with LangChain, CrewAI, AutoGen, LangGraph, and any custom agent.
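Because the proxy speaks the OpenAI wire format, "no SDK changes" in practice means pointing an existing client at a different base URL. A minimal stdlib-only sketch of what such a request looks like — the host and port are assumptions about a local deployment, not a documented endpoint:

```python
import json

# Hypothetical local endpoint for a self-hosted Anchor proxy;
# the actual host/port depend on your deployment.
ANCHOR_BASE_URL = "http://localhost:8000/v1"

def chat_request(model, messages, base_url=ANCHOR_BASE_URL):
    """Build an OpenAI-compatible chat-completions request.

    The path and payload follow the standard /chat/completions shape,
    so the only per-deployment change is the base URL.
    """
    url = f"{base_url}/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode()
    headers = {"Content-Type": "application/json"}
    return url, body, headers

url, body, headers = chat_request(
    "gpt-4o-mini",
    [{"role": "user", "content": "hello"}],
)
```

Any framework that lets you override the OpenAI base URL (LangChain, CrewAI, AutoGen, LangGraph) can be redirected the same way.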
Questions about enterprise quota or deployment? Email enterprise@maximlabs.co