How to Install MaxClaw: Complete Beginner Guide (2026)

What's up, fellow AI tinkerers — if you've spent any time in the OpenClaw ecosystem lately, you've probably noticed that "how do I actually install this thing without breaking something" is still one of the most Googled questions in the space.

I'm Hanks. Three-plus years testing automation tools in real workflows, not demos. When MaxClaw dropped as MiniMax's official cloud-hosted agent on February 25, 2026, I went through every install path — and I want to save you the hour I lost staring at a warn path missing error.

Here's the honest breakdown.


Quick note on what "MaxClaw" means in 2026: MaxClaw is MiniMax's one-click cloud deployment of OpenClaw. If you want the managed version (zero infrastructure), you skip this whole guide and just hit agent.minimax.io. If you want the self-hosted version — OpenClaw running on your own machine or server — that's what this guide covers. Both run the same underlying framework.


System Requirements

Before you touch a terminal, let's make sure your machine won't fight you. This is where most failed installs start.

macOS

macOS 13 (Ventura) or later is required. Both Intel and Apple Silicon (M1 through M4) are fully supported. The single most important dependency is Node.js — OpenClaw requires Node.js 22.12.0 or newer. If you're on an older Node, the install either fails silently or the gateway crashes under load.

Check what you have:

node --version
# Must return v22.x.x or higher
npm --version
# Must return 9+
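
If you script your setup, you can turn that check into a guard that fails fast. This is a small sketch using only standard POSIX tools; nothing here is OpenClaw-specific:

```shell
# Extract the major version from `node --version` (empty if node is missing)
major=$(node --version 2>/dev/null | sed 's/^v\([0-9]*\).*/\1/')
if [ "${major:-0}" -lt 22 ]; then
  echo "Node ${major:-not found} is too old - OpenClaw needs 22+" >&2
else
  echo "Node $major is fine"
fi
```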

Recommended: use nvm (Node Version Manager) so you can switch Node versions without pain:

curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.1/install.sh | bash
# Restart your terminal, then:
nvm install 22
nvm use 22

If you're on macOS and have libvips installed via Homebrew, you may hit a sharp module conflict during install. Fix it with:

SHARP_IGNORE_GLOBAL_LIBVIPS=1 npm install

Windows

Native Windows is not supported. Full stop. The official OpenClaw docs are clear on this: install WSL2 with Ubuntu and follow the Linux path inside it.

# Run in PowerShell as Administrator
wsl --install -d Ubuntu-22.04

Once WSL2 is running, follow the Linux instructions below from inside your Ubuntu shell. Everything else in this guide applies identically.
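
A quick way to confirm you're actually inside WSL (and not, say, a plain PowerShell window or a regular VM) is to check the kernel version string, which mentions Microsoft under WSL:

```shell
# Inside WSL the kernel version string mentions Microsoft
if grep -qi microsoft /proc/version 2>/dev/null; then
  echo "running inside WSL"
else
  echo "not WSL"
fi
```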

Linux

Any modern distribution with glibc 2.31+ works — Ubuntu 22.04 LTS, Debian 12, Fedora 39+, Arch. Same Node.js 22+ requirement applies.

Disk space: a fresh install needs roughly 500 MB (including node_modules). Budget at least 5 GB free total once you account for skill packages, logs, and conversation history stored in SQLite under ~/.openclaw/.

RAM baseline: OpenClaw idles at 200–400 MB. If you're only using cloud models (MiniMax, Claude, OpenAI), a machine with 4–8 GB of RAM is fine. Running local models on-device? 16 GB is the safe baseline.
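
To check those numbers before installing, the standard command-line tools are enough (on macOS, `free` doesn't exist, hence the `vm_stat` fallback):

```shell
# Free RAM: `free -h` on Linux, `vm_stat` on macOS
free -h 2>/dev/null || vm_stat 2>/dev/null || echo "no memory tool found"
# Free disk space on the partition where ~/.openclaw/ will live
df -h ~
```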

| Platform | Supported | Method |
|---|---|---|
| macOS 13+ (Intel / Apple Silicon) | ✅ Native | Installer script or npm |
| Linux (Ubuntu 22.04+, Debian 12+, Fedora 39+) | ✅ Native | Installer script or npm |
| Windows (WSL2 + Ubuntu) | ✅ Via WSL2 | Linux path inside WSL |
| Native Windows (no WSL) | ❌ Not supported | N/A |
| Raspberry Pi 4/5 (4GB+ RAM) | ✅ Supported | Linux ARM path |

Download & Installation Steps

Two real options here. I'll tell you which one to use based on your situation.

Git Clone Method

This is the path for anyone who wants to stay current with the repo, contribute, or needs the absolute latest build. It requires a bit more manual work.

# Clone the repository
git clone https://github.com/openclaw/openclaw.git
cd openclaw
# Install dependencies
npm install
# Run the onboarding wizard
npm run onboard

The onboarding wizard walks you through: choosing your LLM provider, configuring the gateway port (default 18789), setting up your daemon, and optionally connecting a channel like Telegram or WhatsApp.

One thing I learned the hard way: when the wizard asks "How do you want to hatch your bot?", selecting "Open the Web UI" gives you a dashboard URL you can access in your browser — much easier for first-timers than the terminal UI. Do that.

Installer Script (Recommended for Most Users)

The faster, more beginner-friendly path is the single-line installer script. Per the official install documentation, this handles Node detection, installs the CLI globally, and launches the onboarding wizard automatically:

curl -fsSL https://openclaw.ai/install.sh | bash

Or if you prefer npm directly:

npm install -g openclaw
openclaw onboard

Both pnpm and bun are also supported as package managers if you prefer those.

Security reminder before you proceed: OpenClaw is experimental software. The official docs and multiple security guides (including DigitalOcean's hardening guide) strongly recommend running it on a VPS, VM, or dedicated machine — not your personal laptop where your browser passwords and documents live. Never run it as root.
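
Part of that advice is easy to enforce mechanically. Here's a minimal sketch of a pre-install guard; the root check is standard POSIX and the function name is mine, not part of OpenClaw:

```shell
# Guard to run before launching the installer: refuse to proceed as root
require_non_root() {
  if [ "$(id -u)" -eq 0 ]; then
    echo "refusing to install as root - create a dedicated user first" >&2
    return 1
  fi
  echo "ok: running as uid $(id -u)"
}
# In a real setup: require_non_root && curl -fsSL https://openclaw.ai/install.sh | bash
require_non_root || true
```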


First Launch & Basic Config

The onboarding wizard covers the basics, but here's what you actually need to configure before your first real task.

Connecting an LLM Provider

OpenClaw connects to your model provider via the openclaw.json config file at ~/.openclaw/openclaw.json. For MiniMax M2.1 (the model line behind MaxClaw's cloud version), here's the working config block:

{
  "env": {
    "MINIMAX_API_KEY": "sk-..."
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "minimax/MiniMax-M2.1"
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "minimax": {
        "baseUrl": "https://api.minimax.io/anthropic",
        "apiKey": "${MINIMAX_API_KEY}",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "MiniMax-M2.1",
            "name": "MiniMax M2.1",
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      }
    }
  }
}

Get your MiniMax API key from platform.minimax.io. The MiniMax Coding Plan gives 300 prompts over 5 hours — useful for initial testing before committing to a paid plan.
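
Because the config above reads the key through its env block (`${MINIMAX_API_KEY}`), you can keep the raw key out of openclaw.json entirely and export it from your shell profile instead. A sketch, assuming a POSIX shell that reads ~/.profile (adjust to ~/.zshrc on zsh):

```shell
# Keep the key in your shell profile instead of hardcoding it in openclaw.json
echo 'export MINIMAX_API_KEY="sk-..."' >> ~/.profile
. ~/.profile
# prints a confirmation only if the variable is actually set
echo "${MINIMAX_API_KEY:+MINIMAX_API_KEY is set}"
```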

Alternatively, if you want to keep Claude Opus 4.5 as primary with MiniMax M2.1 as a fallback:

{
  "agents": {
    "defaults": {
      "models": {
        "anthropic/claude-opus-4-5": { "alias": "opus" },
        "minimax/MiniMax-M2.1": { "alias": "minimax" }
      },
      "model": {
        "primary": "anthropic/claude-opus-4-5",
        "fallbacks": ["minimax/MiniMax-M2.1"]
      }
    }
  }
}

Memory and Authentication

Memory is enabled by default in OpenClaw — conversations are stored locally in SQLite at ~/.openclaw/. The context window for MiniMax M2.1 is 200,000 tokens, which covers extensive session history.

To configure the MiniMax OAuth plugin (which lets you authenticate without pasting an API key manually):

openclaw plugins enable minimax-portal-auth
openclaw gateway restart
openclaw onboard --auth-choice minimax

For Telegram integration specifically, MiniMax published a step-by-step Telegram setup guide on February 5, 2026 covering bot token creation via BotFather and the pairing flow in detail. That's the clearest walkthrough I've seen for the channel connection step.

After first launch, always run:

openclaw doctor

This checks your Node version, config file, gateway status, and dependency health. Fix everything it flags before you test any real workflow.


Common Installation Errors

These are the ones I hit, and the ones I see most in community threads.

Error: Node.js 22+ required. Found: v18.x.x
You're on an old Node version. Install nvm and switch:

nvm install 22 && nvm use 22

Run node --version to confirm before retrying the install.

warn path missing after install
Common on Linux VPS setups. Re-run:

openclaw onboard

This resets the path configuration without wiping your existing setup.

Error: address already in use :::18789
Something else is already occupying the default gateway port. Kill it:

# Linux/macOS
lsof -i :18789
kill -9 <PID>
# Windows (WSL2)
netstat -ano | findstr :18789
taskkill /PID <PID> /F

Or change the gateway port in your openclaw.json.
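
If you'd rather change the port than kill the other process, the override lives in openclaw.json. Note the key name below is my assumption, not confirmed from the official config reference, so check the docs (or the file the onboarding wizard generated) for the authoritative spelling:

```json
{
  "gateway": {
    "port": 18790
  }
}
```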

sharp: Please add node-gyp to your dependencies
macOS-specific. Either install Xcode Command Line Tools or set the env var:

SHARP_IGNORE_GLOBAL_LIBVIPS=1 npm install -g openclaw

Gateway connects but bot doesn't respond in Telegram
Check your credentials file first:

cat ~/.openclaw/credentials/telegram.json

If it's empty or malformed, re-run the pairing flow with openclaw pairing approve telegram <code>. Also confirm your allowFrom config includes your Telegram ID.
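
For reference, an allowFrom block might look like the sketch below. allowFrom itself is the key named above; the surrounding nesting is my assumption, so mirror whatever structure your generated openclaw.json already uses:

```json
{
  "channels": {
    "telegram": {
      "allowFrom": ["123456789"]
    }
  }
}
```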

Skill install fails or returns malware warning
This is a real risk — the OpenClaw skill ecosystem has had reported malicious packages. Use Cisco's Skill Scanner to vet community skills before installing. Stick to official or well-reviewed skills until you know what you're doing.


At Macaron, we built our agent around exactly this gap: the moment between "I have an idea" and "this actually got done." If you want to test a personal AI that handles tasks and remembers your preferences, with none of the local setup or dependency headaches above, try Macaron free at macaron.im and run it against something real from your workflow today.


Bookmark this page — refer back to it after each OpenClaw update or reinstall, since dependency requirements shift with each release.


Hey, I’m Hanks — a workflow tinkerer and AI tool obsessive with over a decade of hands-on experience in automation, SaaS, and content creation. I spend my days testing tools so you don’t have to, breaking down complex processes into simple, actionable steps, and digging into the numbers behind “what actually works.”
