Integrate RoutePlex in under 5 minutes — from zero to your first AI response.
1. Create an Account
Sign up at routeplex.com/auth/signup for a free evaluation plan. No credit card required.
2. Get Your API Key
After signing in, navigate to the API Keys page in your dashboard and create a new key. Keys start with rp_live_ and are shown only once, so save yours to a secrets manager immediately.
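To keep the key out of source code, you can read it from an environment variable at startup. A minimal sketch — the ROUTEPLEX_API_KEY variable name and the load_api_key helper are conventions for this example, not part of the SDK:

```python
import os

def load_api_key(env_var: str = "ROUTEPLEX_API_KEY") -> str:
    """Fetch the RoutePlex key from the environment, failing fast if it is missing."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"{env_var} is not set; export your rp_live_ key first")
    return key
```

Pass the result to the SDK constructor instead of a hard-coded string.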
3. Install an SDK
```shell
pip install routeplex        # Python 3.8+
npm install @routeplex/node  # Node 18+
```
4. Make Your First Request
Python
```python
from routeplex import RoutePlex

client = RoutePlex(api_key="rp_live_YOUR_KEY")

# Auto-routing — RoutePlex analyzes your prompt and picks the best model
response = client.chat("Explain quantum computing")

print(response.output)
print(f"Model: {response.model_used}")
print(f"Cost: ${response.usage.cost_usd:.6f}")
```
Node.js
```javascript
import { RoutePlex } from "@routeplex/node";

const client = new RoutePlex({ apiKey: "rp_live_YOUR_KEY" });

const response = await client.chat("Explain quantum computing");
console.log(response.output);
console.log(`Model: ${response.modelUsed}`);
console.log(`Cost: $${response.usage.costUsd.toFixed(6)}`);
```
curl
```shell
curl -X POST https://api.routeplex.com/api/v1/chat \
  -H "Authorization: Bearer rp_live_YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{"role": "user", "content": "Hello!"}],
    "mode": "routeplex-ai"
  }'
```
That's it. RoutePlex picked the best model for your prompt automatically.
5. Strategy Routing
Override auto-selection with a fixed priority when you know what you want:
```python
response = client.chat("Write a haiku", strategy="speed")          # fastest model
response = client.chat("Review this code", strategy="quality")     # most capable
response = client.chat("Summarize this article", strategy="cost")  # cheapest
```
```javascript
await client.chat("Write a haiku", { strategy: "speed" });
```
See Smart Routing for all strategies and when to use each.
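A common pattern is to decide the strategy in one place rather than scattering string literals through the codebase. A sketch of that idea — the task labels and the mapping are illustrative, not part of the SDK:

```python
# Illustrative mapping from task type to a RoutePlex routing strategy
STRATEGY_BY_TASK = {
    "draft": "speed",     # quick iterations
    "review": "quality",  # most capable model
    "bulk": "cost",       # large batch jobs
}

def routed_chat(client, task: str, prompt: str):
    """Pick a routing strategy from the task label, defaulting to auto-routing."""
    strategy = STRATEGY_BY_TASK.get(task)
    if strategy is None:
        return client.chat(prompt)  # fall back to RoutePlex auto-selection
    return client.chat(prompt, strategy=strategy)
```

Unknown task labels deliberately fall through to auto-routing, so new call sites degrade gracefully instead of raising.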
6. Manual Mode
Pin a specific model when you need deterministic selection. If it's unavailable, RoutePlex automatically falls back:
```python
response = client.chat("Hello!", model="gpt-4o")
```
```javascript
await client.chat("Hello!", { model: "gpt-4o" });
```
7. Free Endpoints — no auth needed
```python
# Estimate cost before running a request
estimate = client.estimate("Write a blog post about AI")
print(f"${estimate.estimated_cost_usd:.6f}")

# Enhance a weak prompt
result = client.enhance("tell me about kubernetes")
print(result.enhanced_prompt)

# List all available models
for m in client.list_models():
    print(f"{m.id} ({m.provider}) — {m.tier}")
```
OpenAI SDK Compatibility
Already using the OpenAI SDK? Just change the base_url:
```python
from openai import OpenAI

client = OpenAI(
    api_key="rp_live_YOUR_KEY",
    base_url="https://api.routeplex.com/v1",
)

response = client.chat.completions.create(
    model="routeplex-ai",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```
Streaming, function calling, and all standard parameters work unchanged.
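Streaming goes through the same compatibility layer using the standard OpenAI streaming interface. A hedged sketch, wrapped in a helper so it works with any client that speaks that interface — the stream_reply name is this example's, not the SDK's:

```python
def stream_reply(client, prompt: str) -> str:
    """Print a completion token-by-token as it streams, then return the full text."""
    stream = client.chat.completions.create(
        model="routeplex-ai",
        messages=[{"role": "user", "content": prompt}],
        stream=True,
    )
    parts = []
    for chunk in stream:
        delta = chunk.choices[0].delta.content
        if delta:  # some chunks carry no text (e.g. role or finish markers)
            print(delta, end="", flush=True)
            parts.append(delta)
    return "".join(parts)
```

This follows the standard OpenAI chunk shape, where each streamed chunk exposes its text at `choices[0].delta.content`.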
Next Steps
- Python & Node.js SDKs — Full SDK reference
- Smart Routing — How auto-selection picks models
- Routing Modes — Auto vs manual mode
- Cost Control — Caps, alerts, and budget guards
- Examples — Real-world code patterns
- API Reference — Full endpoint documentation
- Playground — Try models interactively