Integrations

Using AltLLM with LLM gateways, proxies, and other tools.

Configuration Checklist

AltLLM works with any LLM gateway or proxy that supports OpenAI-compatible providers. When integrating through a gateway, configure the following:

API format: openai-completions (chat completions endpoint)
Base URL: https://api.altllm.ai/v1
Auth: Bearer token in the Authorization header
Tools: Supported, but keep schemas minimal for best performance
Streaming: Supported via SSE (stream: true)
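The checklist above maps directly onto a raw HTTP request. The following sketch builds such a request with Python's standard library, assuming only what the checklist states (base URL, bearer auth, chat completions payload); the send itself is left commented out so you can inspect the request first.

```python
# Sketch of a direct chat-completions request to AltLLM, based on the
# checklist above. YOUR_API_KEY is a placeholder, not a real credential.
import json
import urllib.request

BASE_URL = "https://api.altllm.ai/v1"
API_KEY = "YOUR_API_KEY"  # replace with a real key

def build_request(model: str, prompt: str, stream: bool = False) -> urllib.request.Request:
    """Construct (but do not send) a chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",  # Bearer token auth
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("altllm-basic", "Hello")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```

Any HTTP client works the same way; the only AltLLM-specific parts are the base URL, the Authorization header, and the model id.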

OpenClaw

{
  "models": {
    "providers": {
      "altllm": {
        "baseUrl": "https://api.altllm.ai/v1",
        "apiKey": "YOUR_API_KEY",
        "api": "openai-completions",
        "models": [
          { "id": "altllm-basic", "name": "AltLLM Basic" },
          { "id": "altllm-standard", "name": "AltLLM Standard" },
          { "id": "altllm-mega", "name": "AltLLM Mega" }
        ]
      }
    }
  }
}

LiteLLM

model_list:
  - model_name: altllm-basic
    litellm_params:
      model: openai/altllm-basic
      api_base: https://api.altllm.ai/v1
      api_key: YOUR_API_KEY

curl

curl https://api.altllm.ai/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "altllm-basic",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
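With stream: true, the endpoint responds with Server-Sent Events: each event is a line of the form data: {json chunk}, and the stream ends with data: [DONE]. A minimal parsing sketch, assuming OpenAI-style streaming chunks (text deltas under choices[0].delta.content; the sample lines below are illustrative, not captured output):

```python
import json

def parse_sse_line(line: str):
    """Extract the text delta from one SSE data line, or None.

    Assumes OpenAI-style streaming chunks: each event line is
    'data: <json>' and the stream terminates with 'data: [DONE]'.
    """
    line = line.strip()
    if not line.startswith("data:"):
        return None  # blank keep-alive lines, SSE comments, etc.
    payload = line[len("data:"):].strip()
    if payload == "[DONE]":
        return None  # end-of-stream sentinel
    chunk = json.loads(payload)
    return chunk["choices"][0]["delta"].get("content")

# Example: reassemble text from a few hypothetical stream lines
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
text = "".join(t for t in map(parse_sse_line, sample) if t)
```

In practice you would iterate over the response body line by line instead of a list, but the per-line logic is the same.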