In Neon Genesis Evangelion, a plugsuit is the neural interface between a pilot and their Evangelion — form-fitting, minimal, and essential. Without it, synchronization doesn't happen.
plugsuits takes the same approach to AI agents: a lightweight TypeScript harness that connects any LLM to code editing, file operations, and shell execution — with nothing more than what's needed.
No framework overhead. No abstraction tax. Just the interface between model and tools.
- Any model, any provider — Drop in models via Vercel AI SDK's unified provider ecosystem
- Hashline edit engine — Deterministic file editing with hash-verified line anchors and autocorrect
- Interactive TUI — Full terminal UI with streaming, syntax highlighting, and runtime model switching
- Headless mode — JSONL event streaming for CI/CD, benchmarks, and automation
- Tool harness — File read/write/edit, glob, grep, shell execution — batteries included
- Repair escalation — Progressive error recovery for weaker models (validate → auto-repair → lenient fallback)
- Monorepo — Clean separation between the harness core and the agent implementation
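
To illustrate the hash-verified line anchors mentioned above, here is a minimal sketch (not the actual hashline engine code): each edit names a line number plus a short hash of that line's current content, and is rejected if the hash no longer matches, so a stale edit fails loudly instead of corrupting the wrong line.

```typescript
import { createHash } from "node:crypto";

// Illustrative sketch of hash-verified line anchors. The hash length and
// edit shape are assumptions, not plugsuits internals.
function lineHash(line: string): string {
  return createHash("sha256").update(line).digest("hex").slice(0, 8);
}

function applyAnchoredEdit(
  lines: string[],
  edit: { line: number; hash: string; replacement: string },
): string[] | null {
  const current = lines[edit.line];
  if (current === undefined || lineHash(current) !== edit.hash) {
    return null; // anchor mismatch: refuse rather than guess
  }
  const next = [...lines];
  next[edit.line] = edit.replacement;
  return next;
}

const file = ["const a = 1;", "const b = 2;"];

// Fresh anchor: the edit applies.
const ok = applyAnchoredEdit(file, {
  line: 1,
  hash: lineHash("const b = 2;"),
  replacement: "const b = 3;",
});
console.log(ok);

// Stale anchor (file drifted since the hash was taken): the edit is rejected.
const stale = applyAnchoredEdit(file, {
  line: 1,
  hash: lineHash("const b = 999;"),
  replacement: "const b = 3;",
});
console.log(stale); // null
```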
- Node.js >= 22
- pnpm >= 10
- A FriendliAI API token (or any Vercel AI SDK-compatible provider)
```bash
export FRIENDLI_TOKEN=your_token_here
```
```bash
pnpm dlx plugsuits
```

```bash
git clone https://github.com/minpeter/plugsuits.git
cd plugsuits
pnpm install
```
```bash
pnpm dev
```

```text
$ pnpm dev
Chat with AI (model: LGAI-EXAONE/K-EXAONE-236B-A23B)
Use '/help' for commands, 'ctrl-c' to quit

You: what files are in the src directory?
tool: read_file({"path":"src"})
AI: Here's what's in the src directory...

You: /help
Available commands:
  /help        Show this help message
  /clear       Clear conversation
  /model       Switch AI models
  /reasoning   Toggle reasoning mode
  /translate   Toggle translation mode
  /render      Render raw prompt
  /quit        Exit
```
```bash
pnpm run headless -- "Fix the type error in src/index.ts"
```

Outputs structured JSONL events (`user`, `tool_call`, `tool_result`, `assistant`, `error`) for programmatic consumption.
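
A headless run's output can be consumed line-by-line; the sketch below assumes only that each JSONL line is an object whose `type` field matches one of the event names above — the other payload fields shown here are hypothetical:

```typescript
// Minimal JSONL event consumer for headless mode. Only the event type
// names come from plugsuits; the payload shapes are assumptions.
type AgentEvent =
  | { type: "user"; text?: string }
  | { type: "tool_call"; name?: string }
  | { type: "tool_result"; name?: string }
  | { type: "assistant"; text?: string }
  | { type: "error"; message?: string };

function parseEvents(jsonl: string): AgentEvent[] {
  return jsonl
    .split("\n")
    .filter((line) => line.trim().length > 0) // skip blank lines
    .map((line) => JSON.parse(line) as AgentEvent);
}

const sample =
  '{"type":"tool_call","name":"read_file"}\n{"type":"assistant","text":"done"}\n';
const events = parseEvents(sample);
console.log(events.length); // 2
console.log(events[1].type); // "assistant"
```

In practice you would pipe `pnpm run headless` into a process and feed its stdout through a parser like this, reacting to each event as it arrives.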
```text
plugsuits/
├── packages/
│   ├── harness/                 @ai-sdk-tool/harness
│   │   └── src/                 Core agent loop, message history, tool management
│   │
│   └── cea/                     @ai-sdk-tool/cea
│       ├── src/
│       │   ├── entrypoints/     CLI (interactive) + headless (JSONL) runtimes
│       │   ├── tools/
│       │   │   ├── modify/      edit_file (hashline engine), write_file, delete_file
│       │   │   ├── explore/     read_file, grep, glob
│       │   │   └── execute/     shell_execute, shell_interact
│       │   └── interaction/     TUI renderer, streaming, spinner
│       └── benchmark/           Harbor terminal-bench adapter
│
└── scripts/                     Benchmark and test automation
```
| Package | Description |
|---|---|
| `@ai-sdk-tool/harness` | Reusable agent harness — model-agnostic loop, tool management, message history |
| `@ai-sdk-tool/cea` | Code editing agent — full implementation with TUI, tools, and FriendliAI integration |
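
Conceptually, a model-agnostic agent loop like the harness's alternates between calling the model and executing whatever tools it requests, appending both to the message history. The sketch below illustrates that shape with made-up types and a stub model — it is NOT the actual `@ai-sdk-tool/harness` API:

```typescript
// Illustrative agent loop, not the real harness API.
type Message = { role: "user" | "assistant" | "tool"; content: string };
type ModelStep = { text: string; toolCall?: { name: string; args: string } };

function runLoop(
  prompt: string,
  model: (history: Message[]) => ModelStep,
  tools: Record<string, (args: string) => string>,
  maxSteps = 5,
): Message[] {
  const history: Message[] = [{ role: "user", content: prompt }];
  for (let i = 0; i < maxSteps; i++) {
    const step = model(history);
    history.push({ role: "assistant", content: step.text });
    if (!step.toolCall) break; // model answered without requesting a tool
    const run = tools[step.toolCall.name];
    history.push({
      role: "tool",
      content: run ? run(step.toolCall.args) : "unknown tool",
    });
  }
  return history;
}

// Stub model: requests one glob call, then answers from the tool result.
const history = runLoop(
  "what files are in src?",
  (h) =>
    h.some((m) => m.role === "tool")
      ? { text: "src contains index.ts" }
      : { text: "", toolCall: { name: "glob", args: "src/*" } },
  { glob: () => "src/index.ts" },
);
console.log(history.map((m) => m.role).join(",")); // user,assistant,tool,assistant
```

The real harness adds streaming, context compaction, and repair escalation on top of this basic turn-taking structure.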
```bash
pnpm install                         # Install dependencies
pnpm dev                             # Interactive TUI (source mode)
pnpm run headless -- "Fix the bug"   # Headless JSONL mode
pnpm test                            # Run all tests
pnpm run typecheck                   # Type check all packages
pnpm run check                       # Lint — non-mutating
pnpm run lint                        # Lint — auto-fix
pnpm run build                       # Build (harness → cea)
```

The context compaction system can be debugged by setting environment variables:
```bash
# Enable compaction debug logging (stderr)
COMPACTION_DEBUG=1 pnpm dev

# Override the context limit to simulate a smaller context window
COMPACTION_DEBUG=1 CONTEXT_LIMIT_OVERRIDE=32768 pnpm -F plugsuits dev -- -m zai-org/GLM-5
```

`COMPACTION_DEBUG=1` enables:

- `[compaction-debug]` logs on stderr showing `needsCompaction`, `speculative?`, and `checkAndCompact` decisions each turn
- `CONTEXT_LIMIT_OVERRIDE` support — forces the context limit to the given value regardless of the model's actual limit, useful for triggering compaction with fewer messages
Both the TUI footer and the compaction engine reflect the overridden limit. `CONTEXT_LIMIT_OVERRIDE` has no effect without `COMPACTION_DEBUG=1`.
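
The gating described above amounts to logic along these lines — an illustrative sketch, not the actual plugsuits implementation:

```typescript
// Sketch of debug-gated context-limit resolution. Function name and
// signature are hypothetical; only the env var semantics come from the
// docs above: the override applies only when COMPACTION_DEBUG=1.
function effectiveContextLimit(
  modelLimit: number,
  env: Record<string, string | undefined>,
): number {
  const override = Number(env.CONTEXT_LIMIT_OVERRIDE);
  if (env.COMPACTION_DEBUG === "1" && Number.isFinite(override) && override > 0) {
    return override; // debug override wins over the model's real limit
  }
  return modelLimit;
}

const debugged = effectiveContextLimit(131072, {
  COMPACTION_DEBUG: "1",
  CONTEXT_LIMIT_OVERRIDE: "32768",
});
console.log(debugged); // 32768

const normal = effectiveContextLimit(131072, {
  CONTEXT_LIMIT_OVERRIDE: "32768", // ignored: no COMPACTION_DEBUG=1
});
console.log(normal); // 131072
```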
- Vercel AI SDK — Model provider abstraction and streaming
- FriendliAI — Default model provider
- pnpm — Workspace package manager
- Turborepo — Task orchestration and caching
- TypeScript — Strict mode throughout
MIT
The name plugsuits was suggested by Simon Kim of Hashed.
"Plug and go, native like a suit" — like a plugsuit synchronizing a pilot with their Eva,
this harness synchronizes AI models with the tools they need.
