Setting Up OpenClaw in an Isolated VM
February 2, 2026
This is the first in a series of posts documenting my experience setting up OpenClaw, an AI assistant platform, in an isolated virtual machine. The goal was to run it with minimal disruption to my host system and easy teardown if needed.
Goals for this setup:
- Run OpenClaw in an isolated VM
- Minimal disruption to host system, easy teardown
- VM needs internet access
- VM needs access to Ollama on host (port 11434)
- Host needs access to OpenClaw interface (port 18789)
- Discord bot integration
Why Multipass?
I chose Multipass for its minimal host footprint and ease of setup/removal. Other options I considered included VirtualBox, QEMU/KVM, and GNOME Boxes, but Multipass offered the simplest path to a working Ubuntu VM.
Creating the VM
multipass launch --name ai-assistant --cpus 2 --memory 4G --disk 20G 24.04
This creates an Ubuntu 24.04 VM with 2 CPUs, 4GB RAM, and 20GB disk space.
Networking Details
| Item | Value |
|---|---|
| VM IP | 10.119.193.216 |
| Host IP (from VM) | 10.119.193.1 |
| Gateway | Default Multipass bridge |
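Neither address needs to be written down; both can be recovered at any time. Assuming the default Multipass bridge (where the host appears as the VM's default gateway), a quick way to find each:

```shell
# From the host: the VM's address
multipass info ai-assistant | grep IPv4

# From inside the VM: the host's bridge address is the default gateway
ip route | awk '/^default/ {print $3}'
```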
DNS Fix Required
DNS resolution failed out of the box. I fixed it by configuring Google DNS:
sudo mkdir -p /etc/systemd/resolved.conf.d
echo -e "[Resolve]\nDNS=8.8.8.8 8.8.4.4" | sudo tee /etc/systemd/resolved.conf.d/dns.conf
sudo systemctl restart systemd-resolved
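To confirm the fix took, check what the resolver actually picked up (these commands assume systemd-resolved, which is standard on Ubuntu 24.04):

```shell
# The drop-in systemd-resolved now reads
cat /etc/systemd/resolved.conf.d/dns.conf

# Current DNS servers and a live lookup
resolvectl dns
resolvectl query github.com
```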
Useful Multipass Commands
multipass shell ai-assistant # Access VM
multipass info ai-assistant # Check VM status
multipass stop ai-assistant # Stop VM
multipass delete ai-assistant && multipass purge # Full cleanup
OpenClaw Configuration
Ollama Provider Setup
To use local Ollama models running on the host, I needed explicit provider configuration in openclaw.json:
{
"models": {
"providers": {
"ollama": {
"baseUrl": "http://10.119.193.1:11434/v1",
"apiKey": "ollama",
"api": "openai-completions",
"models": [
{
"id": "qwen2.5:32b",
"name": "Qwen 2.5 32B",
"contextWindow": 131072,
"maxTokens": 16384
}
]
}
}
},
"agents": {
"defaults": {
"model": {
"primary": "ollama/qwen2.5:32b"
}
}
}
}
Key discovery: the "api": "openai-completions" field is required for Ollama integration.
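One prerequisite on the host side: by default Ollama listens only on 127.0.0.1, so the VM can't reach it at the bridge address. If Ollama was installed via its official script (which sets up a systemd service), a drop-in override makes it listen on all interfaces; this is a sketch, adjust to however your Ollama is actually run:

```ini
# /etc/systemd/system/ollama.service.d/override.conf (on the host)
# Created with: sudo systemctl edit ollama
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```

After `sudo systemctl daemon-reload && sudo systemctl restart ollama` on the host, `curl http://10.119.193.1:11434/v1/models` from inside the VM should return the model list.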
Discord Configuration
{
"channels": {
"discord": {
"enabled": true,
"token": "YOUR_BOT_TOKEN",
"groupPolicy": "allowlist",
"dm": {
"enabled": true,
"policy": "pairing"
},
"guilds": {
"*": {
"requireMention": true,
"channels": {
"general": { "allow": true }
}
}
}
}
}
}
Required Discord Developer Portal settings:
- Enable Message Content Intent
- Enable Server Members Intent (for allowlists)
- OAuth2 scopes: bot, applications.commands
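With the intents enabled, the bot invite link can be assembled from the application's client ID (the CLIENT_ID below is a placeholder; the URL shape is Discord's standard OAuth2 authorize endpoint with the two scopes above):

```shell
# Substitute your application's client ID from the Developer Portal
CLIENT_ID="YOUR_APPLICATION_ID"
echo "https://discord.com/api/oauth2/authorize?client_id=${CLIENT_ID}&scope=bot+applications.commands"
```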
Gateway Network Binding
To access OpenClaw from the host, set LAN binding:
{
"gateway": {
"bind": "lan",
"port": 18789
}
}
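With the gateway bound to the LAN interface, reachability can be sanity-checked from the host. The exact path and response will depend on your OpenClaw version; this only checks that the port answers at all:

```shell
# From the host: expect an HTTP status line rather than "connection refused"
curl -sI http://10.119.193.216:18789/ | head -n1
```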
Part 2: Local Models vs Cloud: A Tool-Calling Reality Check
Part 3: Running OpenClaw: Security, Automation & Maintenance