


NullClaw

Null overhead. Null compromise. 100% Zig. 100% Agnostic. The smallest fully autonomous AI assistant infrastructure — a static Zig binary that fits on any $5 board, boots in milliseconds, and requires nothing but libc.
678 KB binary · <2 ms startup · 3,230+ tests · 22+ providers · 18 channels · Pluggable everything

What is NullClaw?

NullClaw is a complete AI assistant runtime written entirely in Zig, designed to run anywhere with minimal resources. From edge devices to servers, NullClaw delivers the full AI assistant stack in a binary smaller than most JavaScript frameworks.

Quick Start

Get up and running in under 5 minutes with our quick start guide

Installation

Detailed installation instructions and build optimization options

Core Concepts

Learn about NullClaw’s vtable architecture and design principles

API Reference

Comprehensive API documentation and examples

Key Features

Impossibly Small

678 KB static binary — no runtime, no VM, no framework overhead. The entire AI assistant infrastructure fits in less space than a single image file.
It delivers complete AI assistant functionality, including multi-provider support, memory systems, and tool execution, in less space than most JavaScript frameworks.

Near-Zero Memory

~1 MB peak RSS. Runs comfortably on the cheapest ARM single-board computers and microcontrollers. No garbage collector, no allocator overhead.

Instant Startup

  • <2 ms on Apple Silicon
  • <8 ms on a 0.8 GHz edge core
Boot times measured from process start to ready state. No warm-up period required.

True Portability

Single self-contained binary across ARM, x86, and RISC-V. Drop it anywhere and it just runs. No dependencies beyond libc.

Feature-Complete

Despite its tiny size, NullClaw includes:
  • 22+ AI Providers: OpenRouter, Anthropic, OpenAI, Ollama, Venice, Groq, Mistral, xAI, DeepSeek, Together, Fireworks, Perplexity, Cohere, AWS Bedrock, and more
  • 18 Channels: CLI, Telegram, Signal, Discord, Slack, iMessage, Matrix, WhatsApp, Webhook, IRC, and more
  • 18+ Tools: shell, file operations, memory systems, browser control, HTTP requests, hardware access, and more
  • Hybrid Memory: Vector + FTS5 keyword search with SQLite backend
  • Multi-layer Sandbox: Landlock, Firejail, Bubblewrap, Docker support
  • Hardware Peripherals: Arduino, Raspberry Pi GPIO, STM32/Nucleo
  • MCP Support: Model Context Protocol for external tools
  • Streaming & Voice: Real-time streaming responses and voice transcription

Why NullClaw?

Lean by Default

Zig compiles to a tiny static binary with zero overhead. No allocator waste, no garbage collector pauses, no runtime dependencies.

Secure by Design

Comprehensive security model:
  • Pairing-based authentication
  • Strict sandboxing (landlock, firejail, bubblewrap, docker)
  • Explicit allowlists for commands and paths
  • Workspace scoping
  • Encrypted secrets with ChaCha20-Poly1305
By default, NullClaw operates in supervised mode with workspace-only access. All privileged operations require explicit configuration.

Fully Swappable

Core systems are vtable interfaces. Every major component is pluggable:
  • Providers: Swap AI models with a config change
  • Channels: Add new messaging platforms without touching core code
  • Tools: Extend capabilities through the tool interface
  • Memory: Choose between SQLite, markdown, or custom backends
  • Tunnels: Cloudflare, Tailscale, ngrok, or custom solutions
  • Peripherals: Support for any hardware interface
  • Runtimes: Native, Docker, or WASM execution
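
As a rough illustration of the pattern (names here are hypothetical, not NullClaw's actual API), a vtable interface in Zig typically pairs an opaque pointer with a table of function pointers:

```zig
// Hypothetical sketch of a vtable-style interface in Zig; the real
// Provider interface in NullClaw may differ in names and signatures.
pub const Provider = struct {
    ptr: *anyopaque,
    vtable: *const VTable,

    pub const VTable = struct {
        chat: *const fn (ptr: *anyopaque, prompt: []const u8) anyerror![]const u8,
    };

    // Dispatch through the vtable; callers never see the concrete type.
    pub fn chat(self: Provider, prompt: []const u8) anyerror![]const u8 {
        return self.vtable.chat(self.ptr, prompt);
    }
};
```

Concrete implementations hide their state behind `ptr` and supply their own `VTable`, so swapping one implementation for another means constructing a different `Provider` value; the calling code never changes. Zig's standard library uses the same idiom for `std.mem.Allocator`.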

No Lock-In

OpenAI-compatible provider support plus pluggable custom endpoints. Bring your own models, run local inference, or use any cloud provider.
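
For instance, pointing the runtime at a local OpenAI-compatible server such as Ollama (which listens on port 11434 by default) might look like this — the field names below are illustrative, not NullClaw's actual configuration schema:

```json
{
  "provider": {
    "kind": "openai-compatible",
    "base_url": "http://localhost:11434/v1",
    "model": "llama3.1:8b"
  }
}
```

Because the endpoint is just a URL, the same configuration shape covers local inference, self-hosted gateways, and any cloud provider that speaks the OpenAI API.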

Benchmark Comparison

Local machine benchmark (macOS arm64, Feb 2026), normalized for 0.8 GHz edge hardware.
                    OpenClaw        NanoBot         PicoClaw    ZeroClaw    NullClaw
Language            TypeScript      Python          Go          Rust        Zig
RAM                 > 1 GB          > 100 MB        < 10 MB     < 5 MB      ~1 MB
Startup (0.8 GHz)   > 500 s         > 30 s          < 1 s       < 10 ms     < 8 ms
Binary Size         ~28 MB          N/A             ~8 MB       3.4 MB      678 KB
Tests               —               —               —           1,017       3,230+
Source Files        ~400+           —               —           ~120        ~110
Cost                Mac Mini $599   Linux SBC ~$50  Board $10   Any $10     Any $5
Measured with /usr/bin/time -l on ReleaseSmall builds. NullClaw is a static binary with zero runtime dependencies.

Reproduce Benchmarks

# Build a size-optimized static binary and check its size
zig build -Doptimize=ReleaseSmall
ls -lh zig-out/bin/nullclaw

# Measure startup time and peak RSS (-l is the BSD/macOS flag; GNU time uses -v)
/usr/bin/time -l zig-out/bin/nullclaw --help
/usr/bin/time -l zig-out/bin/nullclaw status

Architecture Overview

Every subsystem in NullClaw is a vtable interface — swap implementations with a config change, zero code modifications required.
Subsystem     Interface        Default Implementation
AI Models     Provider         22+ providers (OpenRouter, Anthropic, OpenAI, etc.)
Channels      Channel          18 channels (CLI, Telegram, Discord, etc.)
Memory        Memory           SQLite with hybrid vector + FTS5 search
Tools         Tool             18+ tools (shell, files, memory, browser, etc.)
Security      Sandbox          Auto-detect (Landlock, Firejail, Bubblewrap, Docker)
Runtime       RuntimeAdapter   Native, Docker, WASM
Tunnels       Tunnel           Cloudflare, Tailscale, ngrok, custom
Peripherals   Peripheral       Serial, Arduino, RPi GPIO, STM32

Project Stats

Language:     Zig 0.15.2
Source files: ~110
Lines of code: ~45,000
Tests:        3,230+
Binary:       678 KB (ReleaseSmall)
Peak RSS:     ~1 MB
Startup:      <2 ms (Apple Silicon)
Dependencies: 0 (besides libc + optional SQLite)

Next Steps

Quick Start Guide

Build and run your first NullClaw agent in minutes

Configuration

Learn how to configure providers, channels, and tools

Security Model

Understand NullClaw’s security architecture

Development

Contribute to NullClaw or extend it with custom providers

Community

NullClaw is MIT licensed and fully open source. Join the community.
NullClaw — Null overhead. Null compromise. Deploy anywhere. Swap anything.