con Terminal: The AI-Native Terminal We've Been Waiting For
Most AI terminals are chat panels stapled onto existing emulators. con is different — it's a native terminal emulator built from scratch in Rust with AI woven into its architecture from day one.
Built by nowledge-co on top of Ghostty's terminal runtime and Zed's GPUI framework, con treats AI agents as first-class citizens. The result is something that feels like the terminal your brain always wanted — not a chatbot pretending to be one.
We've been running it for a week. Here's what we found.
What Is con?
con is an open-source, GPU-accelerated terminal emulator with a built-in AI harness. It's not a plugin. Not a wrapper. Not a VS Code terminal tab. It's a standalone native app that gives you a real terminal experience — SSH, tmux, splits, tabs — with an AI agent that sees what you see and can act on it.
"con was initially inspired by warp.dev, but is doing less than warp — if you need more, you should go for warp instead."
That line from the README tells you everything about the philosophy. Terminal first. AI when it earns its place.
The Architecture: Why It Works
Most AI terminals bolt an LLM onto an existing shell. con builds the shell around the agent. Here's how:
1. Ghostty Runtime (Not a Webview)
con embeds Ghostty via its C API. Ghostty owns the PTY lifecycle, VT parsing, scrollback, and rendering. This means you get Ghostty's performance and correctness — no Electron, no webview hacks, no DOM pretending to be a terminal. Each split pane gets its own Ghostty surface, managed by con's GPUI shell.
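The ownership model described above can be pictured with a short sketch. This is illustrative only: the real libghostty handle is an opaque C pointer behind FFI, and GhosttySurface, Pane, and ConShell here are stand-in names, not con's actual types.

```rust
// Stand-in for the opaque handle libghostty would return over FFI.
struct GhosttySurface {
    id: u32,
}

/// Each split pane owns exactly one surface; the GPUI shell owns the panes.
struct Pane {
    surface: GhosttySurface,
}

struct ConShell {
    panes: Vec<Pane>,
}

impl ConShell {
    fn split(&mut self) {
        // A real implementation would call into libghostty to create a new
        // surface with its own PTY; here we just allocate the stand-in.
        let id = self.panes.len() as u32;
        self.panes.push(Pane { surface: GhosttySurface { id } });
    }
}

fn main() {
    let mut shell = ConShell { panes: Vec::new() };
    shell.split();
    shell.split();
    for pane in &shell.panes {
        println!("pane owns surface {}", pane.surface.id);
    }
}
```

The point of the structure: surface lifetime is tied to pane lifetime, so closing a split tears down its PTY with it rather than leaking terminal state into the UI layer.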
2. GPUI Shell (Zed's GPU Framework)
The UI layer uses GPUI — the same framework that powers Zed editor. This gives you Zed-level text rendering quality, GPU-accelerated everything, and native performance. The terminal chrome (tabs, splits, command palette, agent panel) renders at native speed because it's using a real GPU pipeline, not HTML.
3. Agent-First Design
The AI isn't a sidebar chat. con has a shared AgentHarness plus per-tab AgentSession objects. The agent has real tools: terminal_exec, file read/write, search, list — and it gets terminal context extraction for free. It sees your shell output, your file system, your running processes.
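The harness/session split described above can be sketched in a few lines. The AgentHarness and AgentSession names and the tool names come from the project; everything else here (fields, signatures, the stubbed tool bodies) is an assumption for illustration, not con's real implementation.

```rust
use std::collections::HashMap;

/// Shared across all tabs: the tool registry (and, in the real app,
/// provider configuration). Tool bodies are stubs standing in for the
/// terminal-backed implementations.
struct AgentHarness {
    tools: HashMap<&'static str, fn(&str) -> String>,
}

/// One per tab: conversation state plus a handle to the shared harness.
struct AgentSession<'a> {
    harness: &'a AgentHarness,
    transcript: Vec<String>,
}

impl AgentHarness {
    fn new() -> Self {
        let mut tools: HashMap<&'static str, fn(&str) -> String> = HashMap::new();
        tools.insert("terminal_exec", |cmd| format!("exec: {cmd}"));
        tools.insert("search", |pat| format!("search: {pat}"));
        Self { tools }
    }
}

impl<'a> AgentSession<'a> {
    /// Look a tool up by name, run it, and record the result.
    fn call_tool(&mut self, name: &str, arg: &str) -> Option<String> {
        let out = self.harness.tools.get(name)?(arg);
        self.transcript.push(out.clone());
        Some(out)
    }
}

fn main() {
    let harness = AgentHarness::new();
    let mut session = AgentSession { harness: &harness, transcript: Vec::new() };
    let out = session.call_tool("terminal_exec", "git status").unwrap();
    println!("{out}");
}
```

The design payoff is that tool definitions live once in the harness while each tab keeps its own conversation and terminal context.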
4. Pane Runtime Tracker
This is the killer feature. con doesn't just show a terminal — it understands what's running in it. The pane runtime tracker models the full stack explicitly:
- Local shell
- SSH connection → remote shell
- tmux session → nested shell
- Agent CLIs, vim, htop, or any foreground process
It doesn't guess from window titles. If con can't prove what's running, it says unknown. This is how runtime observability should work.
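The "prove it or report unknown" model above maps naturally onto a stack of layers. This is a sketch of that idea, not con's real tracker; the type and variant names are assumptions.

```rust
#[derive(Debug, PartialEq)]
enum RuntimeLayer {
    LocalShell,
    Ssh { host: String },
    Tmux { session: String },
    Foreground { program: String }, // vim, htop, an agent CLI...
    Unknown,                        // cannot be proven, so never guessed
}

/// The full stack for one pane, innermost (most recently entered) layer last.
struct PaneRuntime {
    stack: Vec<RuntimeLayer>,
}

impl PaneRuntime {
    /// What the user is actually interacting with right now.
    fn active(&self) -> &RuntimeLayer {
        self.stack.last().unwrap_or(&RuntimeLayer::Unknown)
    }
}

fn main() {
    // local shell -> ssh -> tmux -> vim
    let pane = PaneRuntime {
        stack: vec![
            RuntimeLayer::LocalShell,
            RuntimeLayer::Ssh { host: "prod-1".into() },
            RuntimeLayer::Tmux { session: "main".into() },
            RuntimeLayer::Foreground { program: "vim".into() },
        ],
    };
    println!("{:?}", pane.active());
}
```

Modeling the whole stack (rather than just the top) is what lets an agent reason about, say, a command that must run on the remote host inside tmux rather than in the local shell.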
The AI Harness: 13 Providers, Real Tools
con uses Rig 0.34 for its agent framework. This means it supports 13 providers out of the box — not locked to OpenAI, not an Anthropic wrapper. You pick your model, configure your API key, and go.
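One way to picture "pick your model, configure your API key, and go" is a lookup from provider name to the environment variable holding that provider's key. This is a hedged sketch in plain Rust; it is neither Rig's actual API nor con's real configuration format, and only a few of the 13 providers are shown.

```rust
/// Map a configured provider name to the conventional env var for its key.
/// Returning None means the provider name isn't recognized.
fn api_key_env(provider: &str) -> Option<&'static str> {
    match provider {
        "openai" => Some("OPENAI_API_KEY"),
        "anthropic" => Some("ANTHROPIC_API_KEY"),
        "gemini" => Some("GEMINI_API_KEY"),
        // ...the remaining providers elided here.
        _ => None,
    }
}

fn main() {
    println!("{:?}", api_key_env("anthropic"));
}
```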
The agent tools are terminal-native, not web-native:
| Tool | What It Does |
|---|---|
| terminal_exec | Execute commands in the active terminal |
| shell | Run shell commands and capture output |
| file | Read and write files with context |
| edit | Targeted file edits (not full rewrites) |
| list | List directory contents with metadata |
| search | Search across files with pattern matching |
These aren't abstract web tools. They operate on your actual terminal, your actual files, your actual shell. The agent can read your git output, understand your error messages, and fix things — all inside the terminal you're already looking at.
Platform Support
| Platform | Status | Notes |
|---|---|---|
| macOS | Fully supported (beta) | Homebrew + DMG + curl install |
| Linux | Preview | libghostty-vt backend, KWin blur support |
| Windows | Early beta | In active development, tracker #34 |
Installation is a one-liner on all platforms:

```sh
# macOS / Linux
curl -fsSL https://con-releases.nowledge.co/install.sh | sh

# macOS (Homebrew)
brew install --cask nowledge-co/tap/con-beta

# Windows (PowerShell)
irm https://con-releases.nowledge.co/install.ps1 | iex
```
What Makes It Different
The terminal space is crowded. iTerm2, Alacritty, Kitty, WezTerm, Ghostty, Warp, Wave Terminal — why does con matter?
Because con isn't trying to replace your terminal. It's trying to replace the gap between your terminal and your AI tools. Here's the distinction:
vs. Warp
Warp is an AI-first terminal with a proprietary backend. It's opinionated, polished, and cloud-connected. con is the anti-Warp: open source, offline-first, and the AI stays local unless you configure otherwise. The devs explicitly say "if you need more [than con], go for warp."
vs. Wave Terminal
Wave has AI blocks and an integrated shell. con's approach is deeper — the AI has terminal-level context, not just block-level. The pane runtime tracker gives the agent structural understanding of your shell state that Wave doesn't model.
vs. plain Ghostty + Claude Code
Running Claude Code inside Ghostty works. But the AI agent is a separate process that doesn't know about your terminal state. In con, the agent is part of the terminal — same process, same context, same window.
vs. AI chatbot terminals
Tools like Amazon Q Developer or GitHub Copilot CLI give you AI in the shell but without terminal integration. con gives you AI as the terminal. Different layer, different depth.
The Tech Stack Under the Hood
| Component | Technology | Why |
|---|---|---|
| Language | Rust | Performance, safety, native binaries |
| UI Framework | GPUI (Zed) | GPU rendering, canvas API, native text quality |
| Terminal Backend | Ghostty (C API) | PTY lifecycle, VT parsing, scrollback |
| Agent Framework | Rig 0.34 | Multi-provider, tool calling, structured output |
| Icons | Phosphor Icons | Consistent icon system |
| Theme | Flexoki | Warm, readable color palette |
| Fonts | Iosevka / Ioskeley Mono | Designed for code and terminal |
The crate structure is clean and modular:
```
crates/
├── con/           # Main binary — GPUI app shell
├── con-core/      # Shared logic, agent harness, sessions
├── con-ghostty/   # Ghostty FFI — C API bindings
├── con-terminal/  # Terminal themes and palette data
└── con-agent/     # AI harness (Rig, tools, context)
```
785 commits deep, with active development across all three platforms. The codebase is well-structured — each crate has a clear responsibility, and the Ghostty integration layer is intentionally thin.
Who Should Use con
✅ Use con if:
- You live in the terminal and want AI help without leaving it
- You use SSH, tmux, or agent CLIs daily
- You want multi-provider AI (not locked to one company)
- You care about open source and self-hosted AI
- You're on macOS and want a modern terminal that isn't Electron
- You want your AI to understand your terminal state, not just your text
❌ Skip con if:
- You're happy with your current terminal and don't need AI in it
- You need Windows support right now (it's early beta)
- You want a full IDE experience (use Zed, VS Code, or Warp instead)
- You need plugins, themes, and a massive extension ecosystem (con doesn't have one yet)
⚡ The Verdict
con is the most thoughtfully architected AI terminal we've seen. Not because it has the most features — it explicitly doesn't — but because the features it has are built at the right abstraction level.
The pane runtime tracker alone is worth watching. Nobody else is modeling terminal state this deeply. If the team keeps shipping at this pace, con won't just be a terminal — it'll be the reference implementation for how AI should integrate with shell workflows.
Beta on macOS, preview on Linux, early on Windows. MIT licensed. Worth a star, worth an install, worth watching.
✅ Pros
- Real terminal (Ghostty), not webview
- GPU-native UI via GPUI
- 13 AI providers, not locked
- Pane runtime tracker is unique
- Open source (MIT)
- Clean Rust architecture
- One-liner install
⚠️ Cons
- Only stable on macOS right now
- Linux preview, Windows early beta
- No plugin/extension system yet
- Smaller community than Warp/Alacritty
- Agent requires API key setup
- New project — expect rough edges
Get Started
brew install --cask nowledge-co/tap/con-beta
Or grab it from GitHub Releases. Star the repo, file issues, contribute — it's MIT.