# Integrations

Using AltLLM with LLM gateways, proxies, and other tools.

## Configuration Checklist
AltLLM works with any LLM gateway or proxy tool that supports OpenAI-compatible providers. When integrating through a gateway, configure the following:
| Requirement | Details |
|---|---|
| API format | openai-completions (chat completions endpoint) |
| Base URL | https://api.altllm.ai/v1 |
| Auth | Bearer token in Authorization header |
| Tools | Supported, but keep schemas minimal for best performance |
| Streaming | Supported via SSE (stream: true) |
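The checklist above maps directly onto a plain HTTP request. As a minimal sketch (the `build_chat_request` helper is hypothetical, not part of any AltLLM SDK), here is how the base URL, bearer auth, and chat-completions body fit together:

```python
import json

API_BASE = "https://api.altllm.ai/v1"  # base URL from the checklist


def build_chat_request(api_key, model, messages, stream=False):
    """Assemble the URL, headers, and JSON body for a chat completions
    call per the checklist: bearer token auth, OpenAI chat format."""
    url = f"{API_BASE}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",  # Auth row of the checklist
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": messages,
        "stream": stream,  # set True for SSE streaming
    })
    return url, headers, body
```

Any HTTP client can then send the result, e.g. `requests.post(url, headers=headers, data=body)`.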
## OpenClaw

```json
{
  "models": {
    "providers": {
      "altllm": {
        "baseUrl": "https://api.altllm.ai/v1",
        "apiKey": "YOUR_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "altllm-basic", "name": "AltLLM Basic" },
          { "id": "altllm-standard", "name": "AltLLM Standard" },
          { "id": "altllm-mega", "name": "AltLLM Mega" }
        ]
      }
    }
  }
}
```

## LiteLLM
```yaml
model_list:
  - model_name: altllm-basic
    litellm_params:
      model: openai/altllm-basic
      api_base: https://api.altllm.ai/v1
      api_key: YOUR_API_KEY
```

## curl
```bash
curl https://api.altllm.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "altllm-basic",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
```
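With `"stream": true`, the response arrives as server-sent events: `data:` lines carrying JSON chunks, terminated by `data: [DONE]`. A minimal sketch of consuming such a stream, assuming chunks follow the usual OpenAI-compatible shape with deltas under `choices[0].delta` (the `parse_sse_stream` helper is illustrative, not a provided API):

```python
import json


def parse_sse_stream(lines):
    """Collect assistant text from an OpenAI-style SSE stream.

    Each event line looks like 'data: {json chunk}'; the stream ends
    with a 'data: [DONE]' sentinel. Non-data lines (blank keep-alives,
    comments) are skipped.
    """
    text = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            text.append(delta["content"])
    return "".join(text)


# Canned events shaped like OpenAI chat.completion.chunk objects:
events = [
    'data: {"choices":[{"delta":{"role":"assistant"}}]}',
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print(parse_sse_stream(events))  # → Hello
```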