Getting Started

Understand the repos, then start the core runtime.

AtlasClaw gives teams a practical way to build a single conversational execution layer on top of existing enterprise systems.

Repository layout

The three public AtlasClaw repositories.

atlasclaw

Core runtime: API layer, agent engine, session/memory, built-in tools, workflow orchestration, and the main docs.


atlasclaw-providers

Provider packages, starter patterns, and reference implementations such as Jira and SmartCMP.


atlasclaw-web

The public website for atlasclaw.ai, built as a static Astro site.

Shortest path

Minimal local startup flow.

  1. Create a Python virtual environment in the `atlasclaw` repo.
  2. Install the core dependencies from `requirements.txt`.
  3. Point `providers_root` at the external providers repo.
  4. Configure an LLM provider in `atlasclaw.json`.
  5. Start the FastAPI service and open the web UI.

Commands

Core runtime commands.

```shell
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
uvicorn app.atlasclaw.main:app --reload --host 0.0.0.0 --port 8000
```
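
The sample configuration in the next section reads provider credentials from environment variables, so it can help to verify they are set before launching uvicorn. This is a hypothetical preflight helper, not part of AtlasClaw:

```python
import os

def missing_env(required):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

# Variables referenced by the sample atlasclaw.json below (assumption:
# the runtime does not supply defaults for them).
REQUIRED = ["ANTHROPIC_BASE_URL", "ANTHROPIC_API_KEY"]

for name in missing_env(REQUIRED):
    print(f"warning: {name} is not set")
```

`missing_env(REQUIRED)` returns an empty list when everything is configured, so it is easy to wire into a startup script or CI check.
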

atlasclaw.json

The central configuration file for the core runtime.

`providers_root` is what connects the core repo to the external provider repo.

```json
{
  "providers_root": "../atlasclaw-providers/providers",
  "model": {
    "primary": "kimi/kimi-k2.5",
    "temperature": 0.7,
    "providers": {
      "kimi": {
        "base_url": "${ANTHROPIC_BASE_URL}",
        "api_key": "${ANTHROPIC_API_KEY}",
        "api_type": "anthropic"
      }
    }
  }
}
```
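
The `${VAR}` placeholders above suggest environment-variable substitution at load time. A minimal sketch of how such expansion could work (the runtime's actual mechanism may differ):

```python
import json
import os
import re

def expand_env(text: str) -> str:
    """Replace ${VAR} placeholders with values from the environment (empty if unset)."""
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

# Expand placeholders in the raw JSON before parsing it.
os.environ["ANTHROPIC_BASE_URL"] = "https://example.invalid/v1"
raw = '{"base_url": "${ANTHROPIC_BASE_URL}"}'
cfg = json.loads(expand_env(raw))
print(cfg["base_url"])  # https://example.invalid/v1
```

Expanding before `json.loads` keeps the substitution logic independent of the config schema; unset variables collapse to empty strings rather than raising, which is a deliberate (and debatable) choice in this sketch.
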
