NullClaw supports 50+ AI providers through a unified vtable-based interface. This enables seamless model switching without changing your agent configuration.

## Documentation Index
Fetch the complete documentation index at: https://mintlify.com/nullclaw/nullclaw/llms.txt
Use this file to discover all available pages before exploring further.
## Provider Interface
All providers implement the `Provider` vtable interface defined in `src/providers/root.zig`.
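The actual interface lives in `src/providers/root.zig`; as a rough sketch of what a Zig vtable-based provider interface can look like (all names below are illustrative, not NullClaw's real API):

```zig
// Hypothetical sketch of a vtable-style provider interface in Zig.
// `Provider`, `chat`, and `EchoProvider` are illustrative names only.
pub const Provider = struct {
    ptr: *anyopaque,
    vtable: *const VTable,

    pub const VTable = struct {
        chat: *const fn (ptr: *anyopaque, prompt: []const u8) anyerror![]const u8,
        supportsStreaming: *const fn (ptr: *anyopaque) bool,
    };

    // Thin wrappers dispatch through the vtable, erasing the concrete type.
    pub fn chat(self: Provider, prompt: []const u8) anyerror![]const u8 {
        return self.vtable.chat(self.ptr, prompt);
    }

    pub fn supportsStreaming(self: Provider) bool {
        return self.vtable.supportsStreaming(self.ptr);
    }
};

// A minimal concrete implementation: echoes the prompt back.
const EchoProvider = struct {
    fn chat(ptr: *anyopaque, prompt: []const u8) anyerror![]const u8 {
        _ = ptr;
        return prompt;
    }

    fn supportsStreaming(ptr: *anyopaque) bool {
        _ = ptr;
        return false;
    }

    const vtable: Provider.VTable = .{
        .chat = chat,
        .supportsStreaming = supportsStreaming,
    };

    pub fn provider(self: *EchoProvider) Provider {
        return .{ .ptr = self, .vtable = &vtable };
    }
};
```

This is the same type-erasure pattern Zig's standard library uses for `std.mem.Allocator`: a `*anyopaque` instance pointer plus a static vtable, so any concrete provider can be passed around behind one uniform type.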
## Core Providers

NullClaw includes dedicated implementations for these major providers:

- OpenAI — GPT-4o, GPT-5, o1, o3, o4
- Anthropic — Claude 4, Sonnet, Opus
- OpenRouter — 200+ model aggregator
- Ollama — Local LLMs (Llama, Mistral, Qwen)
- Gemini — Google Gemini 2.0, 1.5 Pro
- Claude CLI — Reuses `~/.claude/credentials`
- Codex CLI — GitHub Copilot integration
- OpenAI Codex — Legacy Codex models
## Compatible Providers (41)

These providers use the OpenAI-compatible API format:

### Major Cloud Providers
- Groq, Mistral, DeepSeek, xAI, Cerebras, Perplexity, Cohere
### Gateways & Aggregators
- Venice, Vercel AI Gateway, Together AI, Fireworks AI, Hugging Face
- AIHubMix, SiliconFlow, Chutes, Synthetic, Poe
### China Providers
- Moonshot (Kimi), GLM (Zhipu), Z.AI, MiniMax, Qwen, Qianfan, Doubao
### Infrastructure
- Amazon Bedrock, Cloudflare AI Gateway, GitHub Copilot, NVIDIA NIM, OVHcloud
### Local Servers
- LM Studio, vLLM, llama.cpp, SGLang, Osaurus, LiteLLM
## Provider Selection

Select a provider in `~/.nullclaw/config.json`.
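A minimal config might look like the following (the exact schema, including the `provider` and `model` field names, is an assumption; consult the config reference for the real format):

```json
{
  "provider": "openrouter",
  "model": "anthropic/claude-sonnet-4"
}
```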
## Custom Endpoints
Use the `custom:` prefix for arbitrary OpenAI-compatible endpoints.
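For example, pointing `custom:` at a self-hosted OpenAI-compatible server might look like this (the config field names and URL are illustrative assumptions):

```json
{
  "provider": "custom:http://localhost:8000/v1",
  "model": "my-local-model"
}
```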
Use the `anthropic-custom:` prefix for Anthropic-format endpoints.
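By the same pattern, an Anthropic-format proxy might be configured like this (again, field names and URL are illustrative assumptions):

```json
{
  "provider": "anthropic-custom:https://proxy.example.com",
  "model": "claude-sonnet-4"
}
```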
## Capabilities
| Capability | Description |
|---|---|
| `supportsNativeTools()` | Function calling / tool use |
| `supportsStreaming()` | SSE streaming responses |
| `supportsVision()` | Image/multimodal input |
| `warmup()` | Pre-warm TLS connections |
## Provider-Specific Notes
- OpenAI: Supports reasoning models (o1, o3, gpt-5) with the `reasoning_effort` parameter
- Anthropic: OAuth tokens (`sk-ant-oat01-`) use Bearer auth instead of `x-api-key`
- Gemini: Supports API keys, OAuth, and Gemini CLI (`~/.gemini/oauth_creds.json`)
- Ollama: No authentication required for local servers
- OpenRouter: Requires `HTTP-Referer` and `X-Title` headers