<div align="center">
# jcode

CI · License: MIT · Built with Rust
A blazing-fast, fully autonomous AI coding agent with a gorgeous TUI,
multi-model support, swarm coordination, persistent memory, and 30+ built-in tools -
all running natively in your terminal.
<br>
<img src="https://github.com/1jehuang/jcode/releases/download/v0.3.1/jcode_demo_jaguar.avif" alt="jcode demo" width="800">
<br>
Features · Install · Usage · Architecture · Tools
</div>
---
## Features
<div align="center">
| Feature | Description |
|---|---|
| **Blazing Fast TUI** | Sub-millisecond rendering at 1,400+ FPS. No flicker. No lag. Ever. |
| **Multi-Provider** | Claude, OpenAI, GitHub Copilot, OpenRouter - 200+ models, switch on the fly |
| **No API Keys Needed** | Works with your Claude Max, ChatGPT Pro, or GitHub Copilot subscription via OAuth |
| **Persistent Memory** | Learns about you and your codebase across sessions |
| **Swarm Mode** | Multiple agents coordinate in the same repo with conflict detection |
| **30+ Built-in Tools** | File ops, search, web, shell, memory, sub-agents, parallel execution |
| **MCP Support** | Extend with any Model Context Protocol server |
| **Server / Client** | Daemon mode with multi-client attach, session persistence |
| **Sub-Agents** | Delegate tasks to specialized child agents |
| **Self-Updating** | Built-in self-dev mode with hot-reload and canary deploys |
| **Featherweight** | ~28 MB idle client, single native binary - no runtime, no VM, no Electron |
| **OpenClaw** | Always-on ambient agent - gardens memory, does proactive work, responds via Telegram |
</div>
---
<div align="center">
## Performance & Resource Efficiency
*A single native binary. No Node.js. No Electron. No Python. Just Rust.*
</div>
jcode is engineered to be absurdly efficient. While other coding agents spin up
Electron windows, Node.js runtimes, and multi-hundred-MB processes, jcode runs
as a single compiled binary that sips resources.
<div align="center">
| Metric | jcode | Typical AI IDE / Agent |
|---|---|---|
| **Idle client memory** | **~28 MB** | 300–800 MB |
| **Server memory** | **~40 MB** (base) | N/A (monolithic) |
| **Active session** | **~50–65 MB** | 500 MB+ |
| **Frame render time** | **0.67 ms** (1,400+ FPS) | 16 ms (60 FPS, if lucky) |
| **Startup time** | **Instant** | 3–10 seconds |
| **CPU at idle** | **~0.3%** | 2–5% |
| **Runtime dependencies** | **None** | Node.js, Python, Electron, … |
| **Binary** | **Single 66 MB executable** | Hundreds of MB + package managers |
</div>
**Real-world proof:** Right now on the dev machine there are **10+ jcode sessions**
running simultaneously - clients, servers, sub-agents - all totaling less memory
than a single Electron app window.
The secret is Rust. No garbage collector pausing your UI. No JS event loop
bottleneck. No interpreted overhead. Just zero-cost abstractions compiled
to native code with jemalloc for memory-efficient long-running sessions.
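For reference, this is how Rust binaries commonly opt into jemalloc via the `tikv-jemallocator` crate. This is a sketch of the usual pattern, not jcode's actual configuration, which is not shown in this excerpt:

```toml
# Cargo.toml (hypothetical dependency entry)
[dependencies]
tikv-jemallocator = "0.5"
```

```rust
// main.rs: install jemalloc as the global allocator for the whole binary
#[global_allocator]
static GLOBAL: tikv_jemallocator::Jemalloc = tikv_jemallocator::Jemalloc;
```

jemalloc's arena-based design keeps fragmentation low in long-running processes, which is why it is a common choice for daemons like the jcode server.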
---
<div align="center">
## Installation

### Quick Install

### macOS via Homebrew

### From Source (all platforms)

Then symlink to your PATH:

### Prerequisites
You need at least one of:
| Provider | Setup |
|---|---|
| **Claude** (recommended) | Run `/login claude` inside jcode (opens a browser for OAuth) |
| **GitHub Copilot** | Run `/login copilot` inside jcode (GitHub device flow) |
| **OpenAI** | Run `/login openai` inside jcode (opens a browser for OAuth) |
| **OpenRouter** | Set `OPENROUTER_API_KEY=sk-or-v1-...` |
| **Direct API Key** | Set `ANTHROPIC_API_KEY=sk-ant-...` |
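For the API-key providers, configuration is a matter of exporting the variable before launching jcode. The key values below are placeholders, not real credentials:

```shell
# Substitute your real key; the values here are placeholders.
export OPENROUTER_API_KEY="sk-or-v1-your-key-here"
# or, for a direct Anthropic key:
export ANTHROPIC_API_KEY="sk-ant-your-key-here"
```

Add the export to your shell profile (e.g. `~/.bashrc`) to make it persistent.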
Platform Support
| Platform | Status |
|---|---|
| **Linux** x86_64 / aarch64 | Fully supported |
| **macOS** Apple Silicon & Intel | Supported |
| **Windows** x86_64 | Supported (native + WSL2) |
</div>
---
<div align="center">
## Usage
</div>
---
<div align="center">
## Tools
30+ tools available out of the box - and extensible via MCP.
| Category | Tools | Description |
|---|---|---|
| **File Ops** | `read` `write` `edit` `multiedit` `patch` `apply_patch` | Read, write, and surgically edit files |
| **Search** | `glob` `grep` `ls` `codesearch` | Find files, search contents, navigate code |
| **Execution** | `bash` `task` `batch` `bg` | Shell commands, sub-agents, parallel & background execution |
| **Web** | `webfetch` `websearch` | Fetch URLs, search the web via DuckDuckGo |
| **Memory** | `memory` `remember` `session_search` `conversation_search` | Persistent cross-session memory and RAG retrieval |
| **Coordination** | `communicate` `todo_read` `todo_write` | Inter-agent messaging, task tracking |
| **Meta** | `mcp` `skill` `selfdev` | MCP servers, skill loading, self-development |
</div>
---
<div align="center">
## Architecture
</div>
<details>
<summary><strong>High-Level Overview</strong></summary>
<br>
**Data Flow:**
1. User input enters via the TUI or CLI
2. The Server routes the request to the appropriate Agent session
3. The Agent sends messages to the Provider and receives a streaming response
4. Tool calls are executed via the Registry
5. Session state is persisted to `~/.jcode/sessions/`
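The session directory resolution in the last step can be sketched as follows. The helper name is hypothetical; only the `~/.jcode/sessions/` path comes from the flow above:

```rust
use std::path::PathBuf;

/// Resolve the session storage directory described above
/// (hypothetical helper; only the path itself is from the docs).
fn sessions_dir() -> PathBuf {
    // Fall back to the current directory if HOME is unset.
    let home = std::env::var("HOME").unwrap_or_else(|_| ".".into());
    PathBuf::from(home).join(".jcode").join("sessions")
}

fn main() {
    let dir = sessions_dir();
    // Path::ends_with compares whole components, so this checks
    // that the last two components are `.jcode/sessions`.
    assert!(dir.ends_with(".jcode/sessions"));
    println!("{}", dir.display());
}
```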
</details>
<details>
<summary><strong>Provider System</strong></summary>
<br>
**Key Design:**
- `MultiProvider` detects available credentials at startup
- Seamless runtime switching between providers with the `/model` command
- Claude direct API with OAuth: no API key needed with a subscription
- GitHub Copilot OAuth: access Claude, GPT, Gemini, and more through your Copilot subscription
- OpenRouter gives access to 200+ models from all major providers
</details>
<details>
<summary><strong>Tool System</strong></summary>
<br>
**Tool Trait:**
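The original trait definition was not preserved in this excerpt; as a hypothetical sketch, a registry-friendly tool trait might look like this:

```rust
/// Hypothetical sketch of a Tool trait; jcode's actual trait
/// definition is not shown in this excerpt.
pub trait Tool {
    /// Tool name as exposed to the model (e.g. "read", "bash").
    fn name(&self) -> &str;
    /// One-line description shown in the tool registry.
    fn description(&self) -> &str;
    /// Execute the tool with its arguments, returning output text.
    fn run(&self, args: &str) -> Result<String, String>;
}

/// Minimal example implementation (hypothetical).
struct Echo;

impl Tool for Echo {
    fn name(&self) -> &str { "echo" }
    fn description(&self) -> &str { "Echoes its arguments back" }
    fn run(&self, args: &str) -> Result<String, String> {
        Ok(args.to_string())
    }
}

fn main() {
    let tool = Echo;
    assert_eq!(tool.name(), "echo");
    assert_eq!(tool.run("hi").unwrap(), "hi");
    println!("{}: {}", tool.name(), tool.description());
}
```

A trait-object registry (`Vec<Box<dyn Tool>>`) would let the agent dispatch tool calls by name at runtime.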
</details>
<details>
<summary><strong>Server & Swarm Coordination</strong></summary>
<br>
**Protocol (newline-delimited JSON over Unix socket):**
**Requests:** `Message`, `Cancel`, `Subscribe`, `ResumeSession`, `CycleModel`, `SetModel`, `CommShare`, `CommMessage`, ...
**Events:** `TextDelta`, `ToolStart`, `ToolResult`, `TurnComplete`, `TokenUsage`, `Notification`, `SwarmStatus`, ...
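The framing can be sketched as below. Only "newline-delimited JSON" and the request names come from the protocol description above; the field layout is an assumption:

```rust
/// Frame a request as one newline-delimited JSON line (hypothetical
/// field layout; the real wire format is not shown in this excerpt).
fn frame(kind: &str, payload: &str) -> String {
    format!("{{\"type\":\"{kind}\",\"payload\":\"{payload}\"}}\n")
}

fn main() {
    let req = frame("Message", "run the tests");
    // Exactly one request per line; the server splits the stream on '\n'.
    assert!(req.ends_with('\n'));
    assert!(req.starts_with("{\"type\":\"Message\""));
    println!("{}", req.trim_end());
}
```

Newline delimiting keeps the protocol trivially streamable: a client reads the Unix socket line by line and parses each line independently.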
</details>
<details>
<summary><strong>TUI Rendering</strong></summary>
<br>
**Rendering Performance:**
| Mode | Avg Frame Time | FPS | Memory |
|…