Feature · In Development

Deploy on your terms

Orchestrator is being built for organizations that cannot compromise on data sovereignty. Bring your own integrations, choose your AI provider, and run fully airgapped — with no dependency on external services.

Capability tiers

Full control, at every layer

Three progressive levels of infrastructure ownership — start with the tooling you already run and progress all the way to a fully isolated deployment.

In Development
Tier 1

Bring Your Own Integrations

Connect the tools your team already uses

Orchestrator connects to the issue trackers, version control systems, and project management tools already in your stack — no forced migrations, no vendor lock-in.

  • Version control: GitLab, GitHub, Bitbucket, and self-hosted instances
  • Issue trackers: Jira, Linear, GitHub Issues, GitLab Issues
  • CI/CD pipelines: GitLab CI, GitHub Actions, Jenkins, and more
  • Custom integrations: Open webhook and API layer for internal tooling
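Custom webhook integrations like the one above are usually authenticated by signing the payload. A minimal Python sketch of the HMAC-SHA256 signing pattern common to webhook layers; the event shape, field names, and shared secret here are illustrative assumptions, not Orchestrator's published API:

```python
import hashlib
import hmac
import json

def sign_payload(payload: dict, secret: str) -> tuple[bytes, str]:
    """Serialize a webhook payload and compute an HMAC-SHA256 signature.

    The receiver recomputes the signature over the raw body and compares
    the two before trusting the event.
    """
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    signature = hmac.new(secret.encode("utf-8"), body, hashlib.sha256).hexdigest()
    return body, signature

# Hypothetical event name and secret, for illustration only.
body, sig = sign_payload({"event": "task.created", "task_id": "T-123"}, "shared-secret")
print(sig)  # hex digest, typically sent in a signature header alongside the body
```

Signing over a canonical serialization (here `sort_keys=True`) keeps the signature stable regardless of key ordering on either side.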
Coming Soon
Tier 2

Bring Your Own AI Model

Your data, your model, your choice

Choose the AI provider that fits your organization's compliance posture and cost structure. Use hosted frontier models or run open-source models entirely on your own hardware — Orchestrator adapts to your setup.

  • Anthropic: Claude Sonnet, Claude Opus — via API or private deployment
  • OpenAI: GPT-4o, o1 — via API or Azure OpenAI Service
  • Self-hosted open-source: Llama 3, Mistral, Kimi, and compatible GGUF/vLLM models
  • Local inference: Ollama-compatible endpoints for fully on-prem inference
```
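In a setup like this, switching providers often amounts to pointing at a different OpenAI-compatible endpoint (Ollama and vLLM both expose one). A minimal Python sketch under that assumption; the provider table, endpoints, and model identifiers are illustrative, not Orchestrator's actual configuration schema:

```python
# Provider names, base URLs, and model identifiers below are illustrative
# assumptions, not a real configuration format.
PROVIDERS = {
    "anthropic": {"base_url": "https://api.anthropic.com", "model": "claude-sonnet"},
    "openai": {"base_url": "https://api.openai.com/v1", "model": "gpt-4o"},
    "ollama": {"base_url": "http://localhost:11434/v1", "model": "llama3"},
}

def resolve_provider(name: str) -> dict:
    """Look up connection settings for a provider; unknown names fail loudly."""
    try:
        return PROVIDERS[name]
    except KeyError:
        raise ValueError(f"unknown provider: {name!r}") from None

# The workflow stays the same; only the endpoint and model name change.
print(resolve_provider("ollama")["base_url"])  # → http://localhost:11434/v1
```

Keeping provider details in one table is what makes "no AI vendor lock-in" practical: the rest of the pipeline never hard-codes an endpoint.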
On Roadmap
Tier 3

Full Airgapped Deployment

Zero external dependencies, total isolation

Deploy the entire Orchestrator stack inside your private network with no outbound internet access required. Purpose-built for defence, finance, healthcare, and other regulated industries where data cannot leave the perimeter.

  • No telemetry: Telemetry and usage reporting fully disabled by default
  • Offline install: Helm chart and container images distributed via private registry
  • On-prem AI: Pairs with self-hosted model endpoints — no cloud AI calls
  • Audit & compliance: Full audit log retained locally, RBAC enforced on all actions
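Offline distributions of this kind typically ship a checksum manifest alongside the chart and container images, so the bundle can be verified before it is pushed to the private registry. A minimal Python sketch of that verification step; the file names and manifest format are hypothetical:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_bundle(manifest: dict[str, str], root: Path) -> list[str]:
    """Return the names of artifacts whose digests do not match the manifest.

    An empty list means every file in the (hypothetical) manifest checks out
    and the bundle is safe to load into the private registry.
    """
    return [
        name
        for name, digest in manifest.items()
        if sha256_of(root / name) != digest
    ]
```

Verifying digests inside the perimeter, rather than trusting the transfer medium, keeps the supply chain auditable end to end.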

Why it matters

Built for enterprises that can't compromise

Most AI developer tools were designed for individual developers using SaaS. Orchestrator is being built for teams with compliance requirements, data residency constraints, and zero-trust network policies.

Data stays in your perimeter

Source code, task descriptions, and AI prompts never leave your network. No third-party telemetry, no cloud processing of sensitive intellectual property.

Meet your compliance requirements

SOC 2, ISO 27001, HIPAA, FedRAMP — self-hosted deployments give your compliance team full visibility and control over the entire software supply chain.

No AI vendor lock-in

Switch between model providers without changing your workflow. As the AI landscape evolves, your team stays in control of which models power your development process.

Predictable infrastructure costs

Run inference on your own hardware to avoid per-token cloud costs at scale. Right-size your capacity without being subject to external pricing changes.

These features are actively in development

Self-hosting and airgapped capabilities are part of our enterprise roadmap. BYO integrations are furthest along; BYO AI model and full airgapped deployment are coming in subsequent releases. Register your interest below to receive updates and influence prioritization.

Interested in self-hosting?

Tell us about your infrastructure requirements and we'll keep you updated as these capabilities become available. Early registrants help shape the roadmap.