Nanocoder
A local-first CLI coding agent that brings the power of agentic coding tools like Claude Code and Gemini CLI to local models or controlled APIs like OpenRouter. Built with privacy and control in mind, Nanocoder supports multiple AI providers with tool support for file operations and command execution.
---
Table of Contents
FAQs
Installation
For Users
For Development
Usage
Interactive Mode
Non-Interactive Mode
Configuration
AI Provider Setup
MCP (Model Context Protocol) Servers
User Preferences
Application Data Directory
Commands
Built-in Commands
Custom Commands
Features
Multi-Provider Support
Advanced Tool System
Custom Command System
Enhanced User Experience
Keyboard Shortcuts
Developer Features
VS Code Extension
Community
FAQs
What is Nanocoder?
Nanocoder is a local-first CLI coding agent that brings the power of agentic coding tools like Claude Code and Gemini CLI to local models or controlled APIs like OpenRouter. Built with privacy and control in mind, Nanocoder supports any AI provider with an OpenAI-compatible endpoint, including both tool-calling and non-tool-calling models.
How is this different from OpenCode?
This comes down to philosophy. OpenCode is a great tool, but it's owned and managed by a venture-backed company that restricts community and open-source involvement to the outskirts. With Nanocoder, the focus is on building a true community-led project where anyone can contribute openly and directly. We believe AI is too powerful to be left in the hands of big corporations, and that everyone should have access to it.
We also strongly believe in the "local-first" approach, where your data, models, and processing stay on your machine whenever possible to ensure maximum privacy and user control. Beyond that, we're actively pushing to develop advancements and frameworks for small, local models to be effective at coding locally.
Not everyone will agree with this philosophy, and that's okay. We believe in fostering an inclusive community that's focused on open collaboration and privacy-first AI coding tools.
I want to be involved, how do I start?
Firstly, we would love for you to be involved. You can get started contributing to Nanocoder in several ways; check out the Community section of this README.
Installation
For Users
NPM
Install globally and use anywhere:
Then run in any directory:
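For example (the scoped package name below is an assumption; check the npm registry for the published name):

```shell
# Install globally (package name assumed)
npm install -g @nanocollective/nanocoder

# Then run in any directory
nanocoder
```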
Homebrew (macOS/Linux)
First, tap the repository:
Then install:
Run in any directory:
To update:
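The full sequence looks something like this (the tap name is an assumption; check the project's Homebrew instructions for the exact tap):

```shell
# Tap the repository (tap name assumed)
brew tap nano-collective/nanocoder

# Install
brew install nanocoder

# Run in any directory
nanocoder

# To update, refresh the tap cache first, then upgrade
brew update
brew upgrade nanocoder
```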
**Note**: If brew upgrade nanocoder shows the old version is already installed, run brew update first. Homebrew caches tap formulas locally and only refreshes them during brew update. Without updating the tap cache, you'll see the cached (older) version even if a newer formula exists in the repository.
Nix Flakes
Run Nanocoder directly using:
Or install from packages output:
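A sketch of both approaches, assuming the flake lives at the project's GitHub repository (the `github:` path is an assumption):

```shell
# Run Nanocoder directly from the flake
nix run github:Nano-Collective/nanocoder

# Or install it from the packages output into your profile
nix profile install github:Nano-Collective/nanocoder
```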
For Development
If you want to contribute or modify Nanocoder:
**Prerequisites:**
Node.js 20+
pnpm
**Setup:**
Clone and install dependencies:
Build the project:
Run locally:
Or build and run in one command:
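A typical setup sequence, assuming standard pnpm script names (`build`, `start`, `dev` are assumptions; check `package.json` for the actual scripts):

```shell
# Clone and install dependencies (repository URL assumed)
git clone https://github.com/Nano-Collective/nanocoder.git
cd nanocoder
pnpm install

# Build the project
pnpm build

# Run locally
pnpm start

# Or build and run in one command
pnpm dev
```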
Usage
CLI Options
Nanocoder supports standard CLI arguments for quick information and help:
**CLI Options Reference:**
| Option | Short | Description |
|--------|-------|-------------|
| --version | -v | Display the installed version number |
| --help | -h | Show usage information and available options |
| --vscode | | Run in VS Code mode (for extension) |
| --vscode-port | | Specify VS Code server port |
| run | | Run in non-interactive mode |
**Common Use Cases:**
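Based on the options table above, typical invocations look like:

```shell
# Print the installed version
nanocoder --version

# Show usage information
nanocoder --help

# Start an interactive session in the current project
nanocoder

# Run a one-off prompt without entering interactive mode
nanocoder run "fix the failing tests"
```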
Interactive Mode
To start Nanocoder in interactive mode (the default), simply run:
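Assuming the installed binary is named `nanocoder`:

```shell
nanocoder
```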
This will open an interactive chat session where you can:
Chat with the AI about your code
Use slash commands (e.g., /help, /model, /status)
Execute bash commands with !
Tag files with @
Review and approve tool executions
Switch between different models and providers
Non-Interactive Mode
For automated tasks, scripting, or CI/CD pipelines, use the run command:
**Examples:**
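A couple of illustrative invocations (the prompts are placeholders):

```shell
# Run a single task and exit when complete
nanocoder run "add unit tests for the utils module"

# Useful in scripts or CI/CD pipelines
nanocoder run "update the CHANGELOG for the latest release"
```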
**Non-interactive mode behavior:**
Automatically executes the given prompt
Runs in auto-accept mode (tools execute without confirmation)
Displays all output and tool execution results
Exits automatically when the task is complete
**Note:** When using non-interactive mode with VS Code integration, place any flags (like --vscode or --vscode-port) before the run command:
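For example (the port value is illustrative):

```shell
# Flags must come before the run command
nanocoder --vscode --vscode-port 3000 run "explain this file"
```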
Configuration
AI Provider Setup
Nanocoder supports any OpenAI-compatible API through a unified provider configuration.
**Configuration Methods:**
**Interactive Setup (Recommended for new users)**: Run /setup-providers inside Nanocoder for a guided wizard with provider templates. The wizard allows you to:
Choose between project-level or global configuration
Select from common provider templates (Ollama, OpenRouter, LM Studio, Kimi Code, etc.)
Add custom OpenAI-compatible providers manually
Edit or delete existing providers
Fetch available models automatically from your provider
**Manual Configuration**: Create an agents.config.json file (see below for locations)
**Note**: The /setup-providers wizard requires at least one provider to be configured before saving. You cannot exit without adding a provider.
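A minimal agents.config.json might look something like this. The exact schema is an assumption; the field names below follow common OpenAI-compatible provider configurations, so verify them against the project's documentation or the output of the /setup-providers wizard:

```json
{
  "providers": [
    {
      "name": "ollama",
      "baseUrl": "http://localhost:11434/v1",
      "models": ["qwen2.5-coder:14b"]
    },
    {
      "name": "openrouter",
      "baseUrl": "https://openrouter.ai/api/v1",
      "apiKey": "YOUR_API_KEY",
      "models": ["anthropic/claude-3.5-sonnet"]
    }
  ]
}
```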
**Configuration File Locations:**
Nanocoder looks for configuration in the following order (first found wins):
**Project-level** (highest priority): agents.config.json in your current working directory
Use this for project-specific providers, models, or API keys
Perfect for team sharing or repository-specific configurations
**User-level (preferred)**: Platform-specific configuration directory
**macOS**: ~/Library/Preferences/nanocoder/agents.config.json
**Linux/Unix**: ~/.config/nanocoder/agents.config.json
**Windows**: %APPDATA%\nanocoder\agents.config.json
Your global default configuration
Used when no project-level config exists
You can override this global configuration directory by setting NANOCODER_CONFIG_DIR. When set, Nanocoder will l…