OpenClaw AI

Host your own AI interface, connect any model provider via API, and keep full control of your data. One install, infinite intelligence.

// 01 — Overview

What Is OpenClaw AI?

OpenClaw is a self-hosted AI gateway that lets you connect to any large language model provider through a single unified interface. Run it on your own server, laptop, or cloud VM — your conversations never leave your infrastructure.

One Interface, Every Model

Switch between Claude, GPT-4o, Gemini, DeepSeek, Llama, Mistral, and more with a single dropdown. No need for separate tabs or subscriptions — just API keys.

Full Data Sovereignty

Your prompts, conversations, and files stay on your hardware. Nothing is logged to third-party dashboards, and API calls go directly to the provider and back to you.

Docker-First Architecture

A single docker run command gets you running in under 60 seconds, with auto-updates, persistent storage, and reverse-proxy support out of the box.

// 02 — Installation

Quick Start Guide

Get OpenClaw running locally or on a remote server in three steps.

Step 1 — Install Docker

Docker is the only prerequisite. Install it on any OS.

Terminal — Install Docker
# macOS
$ brew install --cask docker

# Ubuntu / Debian
$ curl -fsSL https://get.docker.com | sh
$ sudo usermod -aG docker $USER

# Windows: Download Docker Desktop from docker.com
Step 2 — Launch OpenClaw

Pull and run the container. Data persists between restarts.

Terminal — Launch OpenClaw AI
$ docker run -d -p 3000:8080 \
    -v openclaw-data:/app/backend/data \
    --name openclaw --restart always \
    ghcr.io/open-webui/open-webui:main

# OpenClaw is now running at: http://localhost:3000
# Create your admin account on first visit
Step 3 — Connect Your First AI

Go to Settings → Connections and add any provider API key. Models will auto-populate.

OpenClaw UI — Add Your First Provider
# Navigate to: Settings → Connections
# Click "Add New Connection"

Name:     Anthropic Claude
Base URL: https://api.anthropic.com/v1
API Key:  sk-ant-your-key-here

# Models will auto-populate in the model dropdown
# Select a model and start chatting!
Bonus — With Local AI (Ollama)

Run completely offline by connecting to a local Ollama instance. Zero cloud dependency.

Terminal — OpenClaw + Ollama (Fully Local)
# Step 1: Install and start Ollama
$ curl -fsSL https://ollama.com/install.sh | sh
$ ollama pull llama3.2

# Step 2: Launch OpenClaw with Ollama connection
$ docker run -d -p 3000:8080 \
    --add-host=host.docker.internal:host-gateway \
    -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
    -v openclaw-data:/app/backend/data \
    --name openclaw --restart always \
    ghcr.io/open-webui/open-webui:main

# All conversations stay 100% on your machine

// 03 — Connect Any AI Provider

Supported Providers

OpenClaw supports the OpenAI-compatible API format. Any provider that uses this format works out of the box. Here's how to connect each one.
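Every provider in this list speaks the same wire format, so it helps to see the one request shape they all accept. A minimal sketch with placeholder base URL, API key, and model name — swap in any provider's values from its setup card:

```shell
# The request body is identical for every OpenAI-compatible provider;
# only BASE_URL, API_KEY, and the model name change.
BASE_URL="https://api.openai.com/v1"   # placeholder — any provider works
API_KEY="sk-your-key-here"             # placeholder

curl -s "$BASE_URL/chat/completions" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```

This is exactly the call OpenClaw makes on your behalf when you pick a model from the dropdown.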

Anthropic Claude

Claude Opus, Sonnet, and Haiku. Get your key from console.anthropic.com

Claude Setup
Base URL: https://api.anthropic.com/v1
API Key:  sk-ant-api03-xxxxx

# Models: claude-opus-4-6, claude-sonnet-4-5, claude-haiku-4-5
OpenAI (GPT-4o, o1)

Native support. Get your key from platform.openai.com

OpenAI Setup
Base URL: https://api.openai.com/v1
API Key:  sk-proj-xxxxx

# Models: gpt-4o, gpt-4-turbo, o1-preview, gpt-4o-mini
Google Gemini

Free tier available. Get your key from aistudio.google.com

Gemini Setup
Base URL: https://generativelanguage.googleapis.com/v1beta/openai
API Key:  AIzaSyXXXXXX

# Models: gemini-2.0-flash, gemini-1.5-pro
DeepSeek (R1, V3)

Ultra-cheap reasoning models. Key from platform.deepseek.com

DeepSeek Setup
Base URL: https://api.deepseek.com/v1
API Key:  sk-xxxxx

# Models: deepseek-reasoner, deepseek-chat
Groq (Ultra-Fast)

Lightning-fast inference on LPU chips. Free tier at console.groq.com

Groq Setup
Base URL: https://api.groq.com/openai/v1
API Key:  gsk_xxxxx

# Models: llama-3.3-70b, mixtral-8x7b, gemma2-9b
Mistral AI

European AI. Key from console.mistral.ai

Mistral Setup
Base URL: https://api.mistral.ai/v1
API Key:  xxxxx

# Models: mistral-large, codestral, mistral-small
Perplexity (Search AI)

AI with real-time web search. Key from perplexity.ai

Perplexity Setup
Base URL: https://api.perplexity.ai
API Key:  pplx-xxxxx

# Models: sonar-pro, sonar, sonar-reasoning-pro

// 04 — Advanced Configuration

Production Hosting

Deploy OpenClaw to a cloud server and access it from anywhere, securely.

Cloud Server Deployment

Host on any VPS (Hetzner, DigitalOcean, Linode, AWS) with a custom domain and SSL.

Server — Nginx Reverse Proxy + SSL
# Install Nginx and Certbot
$ sudo apt install nginx certbot python3-certbot-nginx

# Create Nginx config: /etc/nginx/sites-available/openclaw
server {
    listen 80;
    server_name ai.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        # HTTP/1.1 is required for the WebSocket upgrade below
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

# Enable site and get SSL certificate
$ sudo ln -s /etc/nginx/sites-available/openclaw /etc/nginx/sites-enabled/
$ sudo certbot --nginx -d ai.yourdomain.com
$ sudo systemctl reload nginx
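When Nginx and the container run on the same host, a common hardening step — optional, and an assumption about your setup rather than part of the original instructions — is to publish OpenClaw's port on loopback only, so the unencrypted port is reachable solely through the proxy:

```shell
# Bind the published port to 127.0.0.1 so raw HTTP is not exposed to
# the network; all external traffic must pass through Nginx over TLS.
$ docker run -d -p 127.0.0.1:3000:8080 \
    -v openclaw-data:/app/backend/data \
    --name openclaw --restart always \
    ghcr.io/open-webui/open-webui:main
```

The `proxy_pass http://localhost:3000;` directive keeps working unchanged, since Nginx connects over loopback anyway.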
Docker Compose (Multi-Service)

Run OpenClaw with Ollama and auto-updates in a single compose file.

docker-compose.yml
version: "3.8"

services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
    restart: always

  openclaw:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - openclaw-data:/app/backend/data
    depends_on:
      - ollama
    restart: always

  # Watchtower provides the auto-updates: it checks for new images
  # once a day and restarts containers on a fresh release
  watchtower:
    image: containrrr/watchtower:latest
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command: --interval 86400
    restart: always

volumes:
  ollama:
  openclaw-data:

# Launch: docker compose up -d
💡 Pro Tip — Multi-Provider Strategy

Add all your provider keys at once. Use a fast local model (Llama 3.2 3B) for quick questions, Claude Sonnet for complex reasoning, and GPT-4o for image analysis. OpenClaw lets you switch with a single dropdown — every conversation stays in one place.
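The same multi-provider switch works from scripts. A hedged sketch, assuming your instance exposes the upstream Open WebUI API endpoints (`/api/models` and `/api/chat/completions`) and that you have generated a key under Settings → Account; `$OPENCLAW_API_KEY` is a placeholder variable, so verify both against your version:

```shell
# List every model aggregated from all connected providers
curl -s http://localhost:3000/api/models \
  -H "Authorization: Bearer $OPENCLAW_API_KEY"

# Route a prompt to a specific provider simply by naming its model
curl -s http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": "Summarize this repo."}]
      }'
```

Because the gateway normalizes every provider to one API, switching backends in a script is a one-word change to the `model` field.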