LLM CLI Bridge is an MCP server that lets Claude orchestrate Gemini CLI or Codex CLI for tasks that benefit from larger context windows, backend-specific strengths, or multi-round review workflows.
## Why This Exists
Claude is strong at orchestration and reasoning. Gemini is useful when you want very large context handling. Codex is useful when you want GPT-based code review and coding workflows. This tool gives Claude a single MCP surface that can delegate to either backend without forcing you to change your workflow.
## Tools
### ask

Query Gemini or Codex with `@` file references. Supports sessions, change mode, and backend selection.

```
ask gemini to analyze @src/main.js
ask codex to analyze @src/main.js
use codex with high reasoning to review @package.json
```

### brainstorm
Structured ideation with multiple methodologies and optional backend selection.
- Auto - Let the tool pick the right approach
- Divergent - Generate many ideas quickly
- Convergent - Narrow and refine options
- SCAMPER - Systematic prompt-based ideation
- Design Thinking - Human-centered exploration
- Lateral - Unexpected angles and connections
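As with `ask`, invocation is by natural-language prompt. The examples below are illustrative only — the exact phrasing and methodology keywords are assumptions, not a fixed syntax:

```
brainstorm divergent ideas for speeding up our release process
use scamper to brainstorm alternatives to the design in @docs/architecture.md
```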
### review-code
Interactive, multi-round code review with git-aware sessions and comment decision tracking.
- Multiple review types: general, security, performance, quality, architecture
- Severity filters for tighter review passes
- Follow-up rounds with accept/reject/modify/defer decisions
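A multi-round session might look like the following prompts. The phrasing is illustrative, not a fixed syntax — comment numbering and decision wording are assumptions:

```
review the security of @src/auth.js
accept comments 1 and 3, reject comment 2, then run a follow-up round
```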
## Installation
```bash
# Recommended
claude mcp add llm-cli -- npx -y @maxanatsko/llm-cli-bridge@latest
```

To verify the installation, run `/mcp` in a Claude session.

## Claude Desktop Configuration
```json
{
  "mcpServers": {
    "llm-cli": {
      "command": "npx",
      "args": ["-y", "@maxanatsko/llm-cli-bridge@latest"]
    }
  }
}
```

## Backends
LLM CLI Bridge supports two backend families:
- Gemini CLI - large-context analysis and research workflows
- Codex CLI - GPT-based code review and coding workflows
You can switch backends at the prompt level instead of maintaining separate MCP servers.