NullClaw supports 50+ AI providers through a unified configuration interface. Configure provider credentials, default models, and temperature settings.

Documentation Index
Fetch the complete documentation index at: https://mintlify.com/nullclaw/nullclaw/llms.txt
Use this file to discover all available pages before exploring further.
Basic Model Configuration
Provider Setup
Configure providers in the models.providers section:
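As a sketch, a providers block might look like the following. The page names only the models.providers path, so the JSON layout and the field names (api_key, base_url, tool_calls) are illustrative assumptions for the three settings described below:

```json
{
  "models": {
    "providers": {
      "openrouter": {
        "api_key": "your-key-here",
        "base_url": "https://openrouter.ai/api/v1",
        "tool_calls": true
      }
    }
  }
}
```

Each key under providers is a provider identifier; the nested object holds that provider's credentials and options.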
- Map of provider name to provider configuration. Each key is a provider identifier (e.g., openrouter, openai, anthropic).
- API key for the provider. Can also be set via environment variables like OPENROUTER_API_KEY.
- Optional custom base URL for API requests. Useful for proxies or compatible providers.
- Whether the provider supports native OpenAI-style tool calls. Set to false to use the XML tool format via the system prompt.

Default Model Selection
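A minimal default-model entry might look like this. The agents.defaults.model.primary path is documented on this page; the provider key and the placeholder model identifier are assumptions:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "openrouter/model-name",
        "provider": "openrouter"
      }
    }
  }
}
```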
Configure the default model used by agents:
- Primary model identifier. Format: provider/model-name, or just model-name if using the default provider.
- Default provider used when the model identifier doesn't specify one.
- Legacy fallback model name (prefer agents.defaults.model.primary).

Temperature Configuration
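A sketch of the two sampling settings described below; the field names temperature and reasoning_effort are assumed, while the value ranges (0.0 to 1.0; low/medium/high) come from this page:

```json
{
  "temperature": 0.2,
  "reasoning_effort": "medium"
}
```

A low temperature such as 0.2 favors deterministic output; raise it toward 1.0 for more varied responses.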
Control model creativity and randomness:
- Sampling temperature (0.0 to 1.0). Lower values are more deterministic, higher values more creative.
- Optional reasoning effort level for compatible models (e.g., low, medium, high).

Multiple Provider Configuration
You can configure multiple providers for fallback and redundancy:
- Ordered list of fallback provider names to try if the primary fails.
- Number of retry attempts per provider before moving to the fallback.
- Milliseconds to wait between retry attempts (exponential backoff).
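The three fallback settings above can be sketched as follows; the field names (fallback_providers, retry_attempts, retry_delay_ms) are assumed, not taken from this page:

```json
{
  "fallback_providers": ["openai", "anthropic"],
  "retry_attempts": 3,
  "retry_delay_ms": 1000
}
```

With this setup, each provider would be tried up to three times, with an exponentially growing delay starting at one second, before moving down the fallback list.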
Supported Providers
NullClaw supports these providers out of the box:

- openrouter — Multi-provider router (recommended)
- openai — GPT-4, GPT-3.5, and embedding models
- anthropic — Claude 3/4 family
- google — Gemini models
- mistral — Mistral AI models
- groq — Fast inference for open models
- ollama — Local model hosting
- cohere — Cohere models
- together — Together AI models
- And 40+ more compatible providers
Environment Variable Overrides
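For example, a key can be exported in the shell before launching NullClaw, using the variable name documented above for OpenRouter:

```shell
# Make the OpenRouter API key available to NullClaw for this shell session.
# OPENROUTER_API_KEY is the documented variable for the openrouter provider.
export OPENROUTER_API_KEY="your-key-here"
```

This keeps secrets out of the config file, which is useful when the file is checked into version control.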
You can set API keys via environment variables instead of the config file.

Example: Local Model with Ollama
Run models locally using Ollama:
No API key is required for Ollama; just ensure the Ollama server is running locally.
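A sketch of such a provider entry, using the same assumed field names as above; http://localhost:11434 is Ollama's default local endpoint:

```json
{
  "models": {
    "providers": {
      "ollama": {
        "base_url": "http://localhost:11434"
      }
    }
  }
}
```

Because requests go to the local server, the api_key field can simply be omitted here.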