Stop treating code as text. Give your AI agent a compiler's understanding.
mcpls is a universal bridge between AI coding assistants and language servers. It exposes the full power of LSP — type inference, cross-reference analysis, semantic navigation — through the Model Context Protocol, enabling AI agents to reason about code the way IDEs do.
AI coding assistants are remarkably capable, but they're working blind. They see code as text, not as the structured, typed, interconnected system it actually is. When Claude asks "what type is this variable?" it's guessing from context. When it refactors a function, it's hoping it found all the callers.
mcpls changes that. By bridging MCP and LSP, it gives AI agents access to:
- Type information — Know exactly what a variable is, not what it might be
- Cross-references — Find every usage of a symbol across your entire codebase
- Semantic navigation — Jump to definitions, implementations, type declarations
- Real diagnostics — See actual compiler errors, not hallucinated ones
- Intelligent completions — Get suggestions that respect scope and types
- Safe refactoring — Rename symbols with confidence, workspace-wide
- Graceful degradation — Use available language servers, even if some fail to initialize
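Each of these capabilities is exposed as an MCP tool. For illustration, a `tools/call` request for the hover tool might look like the following; the tool name `get_hover` is real, but the argument names (`file_path`, `line`, `character`) are assumptions rather than mcpls's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_hover",
    "arguments": {
      "file_path": "src/main.rs",
      "line": 46,
      "character": 12
    }
  }
}
```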
> [!TIP]
> Zero configuration for Rust projects. Just install mcpls and a language server — ready to go.

> [!TIP]
> mcpls uses graceful degradation — if one language server fails or isn't installed, it continues with available servers. You don't need all servers installed.
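The degradation pattern can be pictured as follows. This is a minimal sketch of the idea, not mcpls's actual implementation: each configured server is started independently, failures are logged and skipped, and an error is returned only when nothing starts.

```rust
// Sketch of the graceful-degradation pattern: start each configured
// server independently and collect failures instead of aborting.
fn start_servers(configured: &[(&str, bool)]) -> Result<Vec<String>, String> {
    let mut running = Vec::new();
    for (name, available) in configured {
        if *available {
            running.push(name.to_string());
        } else {
            eprintln!("warning: language server '{name}' failed to initialize, skipping");
        }
    }
    if running.is_empty() {
        // Mirrors the documented behavior: a clear error when no server starts.
        return Err("no language servers could be initialized".into());
    }
    Ok(running)
}

fn main() {
    // rust-analyzer starts; pyright is missing — mcpls-style behavior
    // is to continue with whatever is available.
    let servers = [("rust-analyzer", true), ("pyright", false)];
    match start_servers(&servers) {
        Ok(up) => println!("running with: {}", up.join(", ")),
        Err(e) => eprintln!("error: {e}"),
    }
}
```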
For Rust projects, install rust-analyzer:

```bash
# Via rustup (recommended)
rustup component add rust-analyzer

# Or via Homebrew (macOS)
brew install rust-analyzer

# Or via package manager (Linux)
# Ubuntu/Debian: sudo apt install rust-analyzer
# Arch: sudo pacman -S rust-analyzer
```

> [!IMPORTANT]
> At least one language server must be available. Without any configured servers, or if all fail to initialize, mcpls will return a clear error message.
```bash
cargo install mcpls
```

Download from GitHub Releases:
| Platform | Architecture | Download |
|---|---|---|
| Linux | x86_64 | mcpls-linux-x86_64.tar.gz |
| Linux | x86_64 (static) | mcpls-linux-x86_64-musl.tar.gz |
| macOS | Intel | mcpls-macos-x86_64.tar.gz |
| macOS | Apple Silicon | mcpls-macos-aarch64.tar.gz |
| Windows | x86_64 | mcpls-windows-x86_64.zip |
```bash
git clone https://github.com/bug-ops/mcpls
cd mcpls
cargo install --path crates/mcpls-cli
```

mcpls recognizes 30 programming languages by default. You can customize file extension mappings to:
- Add support for specialized file types (e.g., `.nu` for Nushell)
- Override default associations (e.g., use a custom Rust server)
- Reduce memory usage by including only languages you use
mcpls searches for configuration files in this order:
| Platform | Default Location | Alternative |
|---|---|---|
| Linux | `~/.config/mcpls/mcpls.toml` | `./mcpls.toml` |
| macOS | `~/.config/mcpls/mcpls.toml` | `~/Library/Application Support/mcpls/mcpls.toml` |
| Windows | `%APPDATA%\mcpls\mcpls.toml` | `.\mcpls.toml` |
| Any | `--config /path/to/file` | `$MCPLS_CONFIG` env var |
```toml
[workspace]
roots = [] # Auto-detect

# Add Nushell language support
[[language_extensions]]
extensions = ["nu"]
language_id = "nushell"

# Rust is recognized by default, but you can override
[[language_extensions]]
extensions = ["rs"]
language_id = "rust"
```

A minimal configuration that covers only the languages you use:

```toml
[workspace]
roots = []

# Only configure languages you actually use
[[language_extensions]]
extensions = ["rs"]
language_id = "rust"

[[language_extensions]]
extensions = ["py", "pyi"]
language_id = "python"

[[language_extensions]]
extensions = ["ts", "tsx"]
language_id = "typescript"
```

See the full example configuration for all supported languages and options.
Add mcpls to your MCP configuration (~/.claude/claude_desktop_config.json):
```json
{
  "mcpServers": {
    "mcpls": {
      "command": "mcpls",
      "args": []
    }
  }
}
```

For languages beyond Rust, create a configuration file. mcpls auto-creates a default config with 30 language mappings on first run.
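If your configuration file lives in a non-default location, you can pass the `--config` flag (from the configuration search order above) through the MCP server args; the path here is a placeholder:

```json
{
  "mcpServers": {
    "mcpls": {
      "command": "mcpls",
      "args": ["--config", "/path/to/mcpls.toml"]
    }
  }
}
```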
Linux/macOS:

```bash
mkdir -p ~/.config/mcpls
cat > ~/.config/mcpls/mcpls.toml << 'EOF'
[[lsp_servers]]
language_id = "python"
command = "pyright-langserver"
args = ["--stdio"]
file_patterns = ["**/*.py"]

[[lsp_servers]]
language_id = "typescript"
command = "typescript-language-server"
args = ["--stdio"]
file_patterns = ["**/*.ts", "**/*.tsx"]

[language_extensions]
# Custom file extension mappings (optional)
# Example: map .nushell files to nushell language
# nushell = ["nushell", ".nushell", ".nu"]
EOF
```

macOS (alternative to the XDG location):
```bash
mkdir -p ~/Library/Application\ Support/mcpls
# Copy or create mcpls.toml in ~/Library/Application Support/mcpls/
```

> [!NOTE]
> macOS users: Configuration is stored in ~/Library/Application Support/mcpls/ by default. You can also use ~/.config/mcpls/ or set $MCPLS_CONFIG to a custom path.
You: What's the return type of process_request on line 47?
Claude: [get_hover] It returns Result<Response, ApiError> where:
- Response is defined in src/types.rs:23
- ApiError is an enum with variants: Network, Parse, Timeout
You: Find everywhere ApiError::Timeout is handled
Claude: [get_references] Found 4 matches:
- src/handlers/api.rs:89 — retry logic
- src/handlers/api.rs:156 — logging
- src/middleware/timeout.rs:34 — wrapper
- tests/api_tests.rs:201 — test case
| Tool | What it does |
|---|---|
| `get_hover` | Type signatures, documentation, inferred types at any position |
| `get_definition` | Jump to where a symbol is defined — across files, across crates |
| `get_references` | Every usage of a symbol in your workspace |
| `get_completions` | Context-aware suggestions that respect types and scope |
| `get_document_symbols` | Structured outline — functions, types, constants, imports |
| `workspace_symbol_search` | Find symbols by name across the entire workspace |
| Tool | What it does |
|---|---|
| `get_diagnostics` | Real compiler errors and warnings, not guesses |
| `get_cached_diagnostics` | Fast access to push-based diagnostics from the LSP server |
| `get_code_actions` | Quick fixes, refactorings, and source actions at a position |
| Tool | What it does |
|---|---|
| `rename_symbol` | Workspace-wide rename with full reference tracking |
| `format_document` | Apply language-specific formatting rules |
| Tool | What it does |
|---|---|
| `prepare_call_hierarchy` | Get callable items at a position for call hierarchy |
| `get_incoming_calls` | Find all callers of a function (who calls this?) |
| `get_outgoing_calls` | Find all callees of a function (what does this call?) |
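As in LSP itself, call hierarchy is a two-step exchange: `prepare_call_hierarchy` resolves the callable item at a position, and its result is then passed to `get_incoming_calls` or `get_outgoing_calls`. A hedged sketch of the second request over MCP; the argument name `item` is an assumption, not mcpls's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "get_incoming_calls",
    "arguments": {
      "item": "<CallHierarchyItem returned by prepare_call_hierarchy>"
    }
  }
}
```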
| Tool | What it does |
|---|---|
| `get_server_logs` | Debug LSP issues with internal log messages |
| `get_server_messages` | User-facing messages from the language server |
| Variable | Description | Default |
|---|---|---|
| `MCPLS_CONFIG` | Path to configuration file | Auto-detected |
| `MCPLS_LOG` | Log level (trace, debug, info, warn, error) | `info` |
| `MCPLS_LOG_JSON` | Output logs as JSON | `false` |
```toml
[workspace]
roots = ["/path/to/project"]
position_encodings = ["utf-8", "utf-16"]

[[lsp_servers]]
language_id = "rust"
command = "rust-analyzer"
args = []
file_patterns = ["**/*.rs"]
timeout_seconds = 30

[lsp_servers.initialization_options]
cargo.features = "all"
checkOnSave.command = "clippy"

[lsp_servers.env]
RUST_BACKTRACE = "1"

[language_extensions]
# Custom extension mappings (optional)
# Format: extension_without_dot = ["language_id", ".ext1", ".ext2"]
nushell = ["nushell", ".nu", ".nushell"]
```

> [!NOTE]
> See Configuration Reference for all options including the 30 built-in language extension mappings.
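The `position_encodings` setting exists because LSP positions default to UTF-16 code-unit columns, while many servers also support UTF-8 byte offsets. A standalone Rust illustration (not mcpls code) of why the two encodings disagree on non-ASCII lines:

```rust
/// Column of the first occurrence of `target` in `line`,
/// measured in UTF-8 bytes and in UTF-16 code units.
fn columns(line: &str, target: char) -> (usize, usize) {
    let utf8_col = line.find(target).expect("target not in line");
    let utf16_col = line[..utf8_col].encode_utf16().count();
    (utf8_col, utf16_col)
}

fn main() {
    // 'é' is 2 bytes in UTF-8 but one UTF-16 code unit, so the two
    // encodings disagree on every column to the right of it.
    let (b, u) = columns("let café = 1;", '=');
    println!("utf-8 column: {b}, utf-16 column: {u}"); // 10 vs 9
}
```

A server and client that negotiate different encodings would point at different characters for the same numeric column, which is why the supported encodings are declared explicitly.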
mcpls works with any LSP 3.17 compliant server. Battle-tested with:
| Language | Server | Notes |
|---|---|---|
| Rust | rust-analyzer | Zero-config, built-in support |
| Python | pyright | Full type inference |
| TypeScript/JS | typescript-language-server | JSX/TSX support |
| Go | gopls | Modules and workspaces |
| C/C++ | clangd | compile_commands.json |
| Java | jdtls | Maven/Gradle projects |
| Nushell | nushell-lsp | Custom extension mapping |
| And 24+ others | Any LSP 3.17 server | See Configuration Reference for full list |
> [!TIP]
> By default, mcpls includes 30 language-to-extension mappings. To add custom mappings (e.g., `.nu` → nushell), edit the `language_extensions` section in your config.
```mermaid
flowchart TB
    subgraph AI["AI Agent (Claude)"]
    end

    subgraph mcpls["mcpls Server"]
        MCP["MCP Server<br/>(rmcp)"]
        Trans["Translation Layer"]
        LSP["LSP Clients<br/>Manager"]
        MCP --> Trans --> LSP
    end

    subgraph Servers["Language Servers"]
        RA["rust-analyzer"]
        PY["pyright"]
        TS["tsserver"]
        Other["..."]
    end

    AI <-->|"MCP Protocol<br/>(JSON-RPC 2.0)"| mcpls
    mcpls <-->|"LSP Protocol<br/>(JSON-RPC 2.0)"| Servers
```
Key design decisions:
- Single binary — No Node.js, Python, or other runtime dependencies
- Async-first — Tokio-based, handles multiple LSP servers concurrently
- Memory-safe — Pure Rust, zero `unsafe` blocks
- Resource-bounded — Configurable limits on documents and file sizes
- Getting Started — First-time setup
- Configuration Reference — All options explained
- Tools Reference — Deep dive into each tool
- Troubleshooting — Common issues and solutions
- Installation Guide — Platform-specific instructions
```bash
# Build
cargo build

# Test (uses nextest for speed)
cargo nextest run

# Run locally
cargo run -- --log-level debug
```

Requirements: Rust 1.85+ (Edition 2024)
Contributions welcome. See CONTRIBUTING.md for guidelines.
Dual-licensed under Apache 2.0 or MIT at your option.
mcpls — Because AI deserves to understand code, not just read it.