astrolabe.run + app shell launch
Astrolabe Cloud is a hosted control plane for fast onboarding, managed access requests, and future key issuance. The router is designed to improve model economics while keeping compatibility and observability first.
Minimal integration changes
Keep your OpenAI-compatible client flow. Update the base URL and credential path, then let Astrolabe route based on policy.
No infrastructure to bootstrap. OpenAI-compatible endpoint semantics. Routing metadata surfaced through response headers.

agent.ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.astrolabe.run/v1",
  apiKey: process.env.ASTROLABE_API_KEY,
});

const result = await client.chat.completions.create({
  model: "ignored-by-router",
  messages: [{ role: "user", content: "Summarize this issue." }],
});

One account for model access
Astrolabe Cloud gives your team one managed integration path while preserving the OpenAI-compatible request shape.
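The routing metadata mentioned above arrives in response headers, which the OpenAI SDK exposes through its `.withResponse()` helper. A minimal sketch of reading it; the header names (`x-astrolabe-model`, `x-astrolabe-tier`) are hypothetical placeholders, not documented Astrolabe headers:

```typescript
// Hypothetical header names -- substitute whatever Astrolabe actually emits.
interface RouteInfo {
  model: string | null;
  tier: string | null;
}

// Extract routing metadata from a fetch Response's headers.
function routeInfo(headers: Headers): RouteInfo {
  return {
    model: headers.get("x-astrolabe-model"),
    tier: headers.get("x-astrolabe-tier"),
  };
}

// With the openai SDK, the raw Response is available via .withResponse():
//   const { data, response } = await client.chat.completions
//     .create({ model: "ignored-by-router", messages })
//     .withResponse();
//   const info = routeInfo(response.headers);

// Standalone demonstration with a constructed Headers object:
const demo = new Headers({
  "x-astrolabe-model": "some-model",
  "x-astrolabe-tier": "economy",
});
console.log(routeInfo(demo));
```

Keeping the extraction in a small helper means the rest of your code does not depend on the exact header names if they change.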
Policy-aware cost controls
Requests can be routed to less expensive tiers when complexity is low and escalated when confidence drops.
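One way to picture this kind of policy is a scoring function that sends simple prompts to a cheap tier and escalates otherwise. This is an illustrative sketch only, not Astrolabe's actual routing algorithm; the tier names, complexity heuristic, and thresholds are all invented for the example:

```typescript
type Tier = "economy" | "standard" | "premium";

// Toy complexity signal: longer prompts and code fences suggest harder
// requests. Astrolabe's real scoring is not described here.
function estimateComplexity(prompt: string): number {
  let score = Math.min(prompt.length / 2000, 1);
  if (prompt.includes("```")) score += 0.3;
  return Math.min(score, 1);
}

// Pick a tier from estimated complexity; a low-confidence prior attempt
// blocks the cheapest tier, modeling the "escalate when confidence drops" idea.
function pickTier(prompt: string, priorConfidence = 1): Tier {
  const complexity = estimateComplexity(prompt);
  if (complexity < 0.3 && priorConfidence > 0.8) return "economy";
  if (complexity < 0.7) return "standard";
  return "premium";
}

console.log(pickTier("Summarize this issue.")); // economy
```

The point of the sketch is the shape of the decision, not the numbers: cheap by default, with explicit triggers that push a request up a tier.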
Open-source routing core
Astrolabe remains transparent. Core routing behavior stays inspectable and self-hostable from the public repo.
Astrolabe Cloud Phase 1: access workflow and app shell.