otto

AI-powered coding assistant. CLI, desktop app, embeddable server, ACP agent — one tool, multiple interfaces.



What is otto?

otto is an AI coding assistant that runs locally. It connects to AI providers (Anthropic, OpenAI, Google, etc.), gives the model access to your filesystem via built-in tools (read, write, bash, git, ripgrep, etc.), and streams responses back to you.

It ships as:

  • CLI — run otto in your terminal for interactive or one-shot usage
  • Server + Web UI — run otto serve to get a local HTTP API and browser interface
  • Desktop App — Tauri app that embeds the CLI binary and web UI
  • Embeddable SDK — use @ottocode/server and @ottocode/sdk in your own apps

The CLI binary is self-contained — it bundles the server, database, web UI, and tools into a single executable built with bun build --compile.


Install

curl -fsSL https://install.ottocode.io | sh

Or via npm/bun:

npm install -g @ottocode/install
# or with bun:
bun install -g @ottocode/install

This downloads the prebuilt binary for your platform (macOS arm64/x64, Linux arm64/x64) and puts it in ~/.local/bin.
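If the otto command isn't found after installing, make sure ~/.local/bin is on your PATH. A quick check and fix (the profile file to persist it in depends on your shell; ~/.bashrc and ~/.zshrc below are assumptions):

```shell
# Add ~/.local/bin to PATH for the current shell session;
# persist the export line in ~/.bashrc or ~/.zshrc to make it permanent.
export PATH="$HOME/.local/bin:$PATH"

# Verify the binary is now resolvable
command -v otto >/dev/null 2>&1 && echo "otto is on PATH" || echo "otto not found; re-check the install output"
```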


Usage

otto                           # start server + web UI (opens browser)
otto --no-desktop              # skip desktop app and serve
otto "explain this error"      # one-shot question
otto "write tests" --agent build
otto "follow up" --last        # continue last session
otto serve                     # start server without desktop check
otto serve --port 3000         # custom port
otto serve --network           # bind to 0.0.0.0 for LAN access

When you run otto with no arguments, it checks for the desktop app first. If installed, it opens it. Otherwise it starts the local server and opens the web UI in your browser.

Other Commands

otto setup                     # interactive provider setup
otto auth login                # configure provider credentials
otto auth list                 # list configured providers
otto sessions                  # browse session history
otto models                    # list available models
otto agents                    # list/configure agents
otto tools                     # list available tools
otto mcp list                  # list MCP servers
otto mcp add <name>            # add an MCP server
otto mcp auth <name>           # authenticate OAuth server
otto doctor                    # check configuration
otto share <session-id>        # share a session publicly
otto upgrade                   # upgrade to latest version
otto scaffold                  # generate agents, tools, or commands

Providers

otto supports multiple AI providers via AI SDK v6:

Provider     Models                                              Auth
Anthropic    Claude 4.5 Sonnet, Claude Sonnet 4, Claude Opus     API key
OpenAI       GPT-4o, GPT-4o-mini, o1, Codex Mini                 API key
Google       Gemini 2.5 Pro, Gemini 2.0 Flash                    API key
OpenRouter   100+ models                                         API key
OpenCode     Free-tier Anthropic access                          OAuth
Setu         OpenAI/Anthropic proxy with Solana USDC payments    Solana wallet
Moonshot     Moonshot AI models                                  API key

otto "refactor this" --provider anthropic --model claude-sonnet-4
otto "explain generics" --provider openai --model gpt-4o

Environment Variables

export ANTHROPIC_API_KEY="sk-ant-..."
export OPENAI_API_KEY="sk-..."
export GOOGLE_GENERATIVE_AI_API_KEY="..."
export OPENROUTER_API_KEY="sk-or-..."
export SETU_PRIVATE_KEY="..."           # Solana wallet (base58)

Agents

Four built-in agents, each with a curated toolset:

Agent      Purpose                                Key Tools
build      Code generation, bug fixes, features   read, write, bash, git, terminal, apply_patch, ripgrep, websearch
plan       Architecture planning, analysis        read, ls, tree, ripgrep, update_todos, websearch
general    Mixed tasks, conversational            read, write, bash, ripgrep, glob, websearch, update_todos
research   Deep research across sessions          read, ripgrep, websearch, query_sessions, search_history

All agents also get: progress_update, finish, skill.

otto "create auth component" --agent build
otto "design API architecture" --agent plan
otto "research how this works" --agent research

Agents are configurable per-project (.otto/agents.json) or globally (~/.config/otto/agents.json).
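A minimal sketch of what an .otto/agents.json override might contain (the field names and shape here are assumptions for illustration; check the Agents & Tools docs for the real schema):

```json
{
  "build": {
    "model": "claude-sonnet-4",
    "tools": ["read", "write", "bash", "ripgrep"],
    "prompt": ".otto/agents/build.md"
  }
}
```

A project-level file like this would take precedence over the global ~/.config/otto/agents.json for that agent.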


Tools

15+ built-in tools:

Category      Tools
File System   read, write, ls, tree, glob
Search        grep, ripgrep, websearch
Editing       edit, apply_patch
Shell         bash, terminal
Git           git_status, git_diff, git_commit
Agent         progress_update, finish, update_todos, skill

MCP (Model Context Protocol)

otto supports MCP servers — the open standard for connecting AI agents to external tools and data sources. Connect to local or remote MCP servers to extend your agent's capabilities.

Quick Start

From the Web UI: Open the MCP panel (sidebar) → click + → add a server.

From the CLI:

# Add a local (stdio) server
otto mcp add helius --command npx --args -y helius-mcp@latest

# Add a remote (HTTP) server
otto mcp add linear --transport http --url https://mcp.linear.app/mcp

# List configured servers
otto mcp list

# Authenticate with an OAuth server
otto mcp auth linear

From config (.otto/config.json):

{
  "mcp": {
    "servers": [
      {
        "name": "github",
        "command": "npx",
        "args": ["-y", "@modelcontextprotocol/server-github"],
        "env": { "GITHUB_TOKEN": "${GITHUB_TOKEN}" }
      },
      {
        "name": "linear",
        "transport": "http",
        "url": "https://mcp.linear.app/mcp"
      }
    ]
  }
}

MCP tools appear as servername__toolname (e.g., helius__getBalance) and are automatically available to all agents. Start servers from the sidebar panel — they stay alive for the duration of the session.
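The servername__toolname convention amounts to simple prefix namespacing, which keeps tools from different servers from colliding. A trivial sketch of the idea (illustrative only, not otto's actual implementation):

```typescript
// Namespace an MCP tool name under its server, servername__toolname style.
function namespaceTool(server: string, tool: string): string {
  return `${server}__${tool}`;
}

namespaceTool("helius", "getBalance"); // "helius__getBalance"
```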

Supported Transports

Transport   Use Case                             Example
stdio       Local servers (npx, node, python)    npx -y helius-mcp@latest
HTTP        Remote servers (recommended)         https://mcp.linear.app/mcp
SSE         Remote servers (legacy)              https://mcp.asana.com/sse

OAuth

Remote servers like Linear, Notion, and Sentry use OAuth. otto handles the full flow automatically:

  1. Click Start on a remote server → browser opens for authorization
  2. Authorize in the browser → otto receives the callback
  3. Server reconnects with tokens → tools become available

Tokens are stored securely in ~/.config/otto/oauth/ and refresh automatically.


Custom Tools

Add project-specific tools in .otto/tools/:

// .otto/tools/deploy.ts
import { tool } from "@ottocode/sdk";
import { z } from "zod";

export default tool({
  name: "deploy",
  description: "Deploy to production",
  parameters: z.object({
    environment: z.enum(["staging", "production"]),
  }),
  execute: async ({ environment }) => {
    // ... run your deploy for `environment` here ...
    return { success: true, url: "https://app.example.com" };
  },
});

Configuration

File Locations

~/.config/otto/
├── auth.json            # API keys (0600 permissions)
└── config.json          # Global defaults

.otto/                    # Project-specific
├── config.json          # Project config
├── agents.json          # Agent overrides
├── agents/              # Custom agent prompts
├── commands/            # Custom CLI commands
├── tools/               # Custom tools
└── otto.sqlite          # Local session database

Project Config

{
  "defaults": {
    "provider": "anthropic",
    "model": "claude-sonnet-4",
    "agent": "build"
  }
}

Priority: CLI flags > Environment variables > Project .otto/ > Global ~/.config/otto/ > Defaults
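That precedence order can be sketched as a first-wins merge over the config layers, highest priority first (this is an illustration of the rule, not otto's actual code):

```typescript
// Illustrative sketch of layered config precedence: earlier layers win.
type Config = { provider?: string; model?: string; agent?: string };

function resolveConfig(...layers: Array<Config | undefined>): Config {
  const result: Config = {};
  for (const layer of layers) {
    if (!layer) continue;
    for (const [key, value] of Object.entries(layer)) {
      // Only fill a field if no higher-priority layer has set it yet.
      if (result[key as keyof Config] === undefined && value !== undefined) {
        result[key as keyof Config] = value as string;
      }
    }
  }
  return result;
}

const merged = resolveConfig(
  { model: "claude-sonnet-4" },             // CLI flags
  { provider: "anthropic" },                // environment variables
  { agent: "build", model: "gpt-4o" },      // project .otto/config.json
  { provider: "openai", agent: "general" }, // global ~/.config/otto/config.json
);
// merged: { model: "claude-sonnet-4", provider: "anthropic", agent: "build" }
```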


Architecture

Bun workspace monorepo. Infrastructure managed with SST.

Apps

App               Description                               Stack
apps/cli          Main CLI binary                           Commander; compiles to a single binary via bun build --compile
apps/web          Web UI (client for the server)            React 19, Vite, TanStack Router/Query, Tailwind, Zustand
apps/desktop      Desktop app (embeds CLI binary + web UI)  Tauri v2, React
apps/setu         AI provider proxy with Solana payments    Hono, Cloudflare Worker
apps/preview-api  Session sharing API                       Hono, Cloudflare Worker + D1
apps/preview-web  Public session viewer                     Astro, AWS

Packages

Package             Description
@ottocode/sdk       Core SDK: tools, agents, auth, config, providers, prompts. Tree-shakable.
@ottocode/server    HTTP API server (Hono): routes, SSE streaming, agent runtime
@ottocode/database  SQLite + Drizzle ORM for local persistence
@ottocode/api       Type-safe API client (generated from the OpenAPI spec)
@ottocode/web-sdk   React components, hooks, and stores for building web UIs
@ottocode/web-ui    Pre-built static web UI assets (embedded in the CLI binary)
@ottocode/install   npm installer package (downloads the binary on postinstall)

Dependency Graph

Level 0 (no deps)    install, api, web-ui
Level 1              sdk (auth, config, providers, prompts, tools)
Level 2              database (depends on sdk for paths)
Level 3              server (depends on sdk, database)
Level 4              web-sdk (depends on api, sdk)
Level 5              cli (depends on sdk, server, database)

Infrastructure (SST)

All infra is defined as code with SST, deploying to AWS and Cloudflare:

Resource        Platform                   Domain
Setu proxy      Cloudflare Worker          setu.ottocode.io
Preview API     Cloudflare Worker + D1     api.share.ottocode.io
Preview Web     AWS (Astro SSR)            share.ottocode.io
Install Script  Cloudflare Worker          install.ottocode.io
OG Image        AWS Lambda (function URL)  —

bun sst dev                    # local dev with live infra
bun sst deploy --stage prod    # deploy to production

Embedding

Use otto as a library in your own applications:

import { createEmbeddedApp } from "@ottocode/server";

const app = createEmbeddedApp({
  provider: "anthropic",
  model: "claude-sonnet-4",
  apiKey: process.env.ANTHROPIC_API_KEY,
  agent: "build",
});

Bun.serve({
  port: 9100,
  fetch: app.fetch,
  idleTimeout: 240,
});

Or use the SDK directly:

import { generateText, resolveModel, discoverProjectTools } from "@ottocode/sdk";

const model = await resolveModel("anthropic", "claude-sonnet-4");
const tools = await discoverProjectTools(process.cwd());

const result = await generateText({
  model,
  prompt: "List all TypeScript files and count lines",
  tools: Object.fromEntries(tools.map((t) => [t.name, t.tool])),
  maxSteps: 10,
});

See Embedding Guide for full details including custom agents, multi-provider auth, web UI serving, and CORS configuration.


Development

Prerequisites

  • Bun (the repo uses Bun for everything; no npm/yarn/pnpm needed)

Setup

git clone https://github.com/nitishxyz/otto.git
cd otto
bun install

Commands

bun run cli ask "hello"        # run CLI from source
bun test                       # run tests (bun:test)
bun lint                       # lint (Biome)
bun run typecheck              # type check all packages
bun run compile                # build standalone binary

Dev Servers

bun run dev:cli                # CLI dev mode
bun run dev:web                # Web UI (Vite dev server)
bun run dev:desktop            # Desktop app (Tauri)
bun sst dev                    # SST dev (setu, preview-api, preview-web)

Cross-Compilation

bun run build:bin:darwin-arm64
bun run build:bin:darwin-x64
bun run build:bin:linux-x64
bun run build:bin:linux-arm64

Database

bun run db:generate            # generate Drizzle migrations
bun run db:reset               # reset local database

Tech Stack

Layer           Technology
Runtime         Bun
AI              AI SDK v6
Server          Hono
Database        SQLite + Drizzle ORM
Web UI          React 19, Vite, TanStack, Tailwind CSS, Zustand
Desktop         Tauri v2
Infrastructure  SST (AWS + Cloudflare)
Linting         Biome
Testing         bun:test

Docs

Document           Description
Getting Started    Installation and first steps
Usage Guide        Commands and workflows
Configuration      Settings reference
Agents & Tools     Built-in agents and tools
MCP Servers        Connecting to external MCP servers
Architecture       Monorepo structure, packages, infra
Development Guide  Dev workflows for all components
Embedding Guide    Embedding otto in your apps
API Reference      REST endpoints and SSE
Troubleshooting    Common issues
All Docs           Full documentation index

Contributing

See AGENTS.md for conventions.

  • Bun for everything (no npm/yarn/pnpm)
  • Biome for linting (bun lint)
  • bun:test for tests
  • TypeScript strict mode
  • Conventional commits (feat:, fix:, docs:, etc.)

License

MIT


GitHub · Issues · npm
