SourceBridge.ai

The field guide your codebase should have had.

What is SourceBridge?

SourceBridge is a requirement-aware code comprehension platform. Point it at any codebase and it generates field guides -- cliff notes, learning paths, code tours, architecture diagrams, and workflow stories -- so your team can understand how a system actually works. It also traces requirements to code, runs AI-powered reviews, and serves as an MCP server for AI agent integration.

Most tools help you search code. SourceBridge helps you understand systems.

Repository overview showing cliff notes, code tours, and learning paths

More screenshots

Cliff Notes

AI-generated cliff notes with hierarchical code summaries

Admin Monitor

LLM job queue with real-time generation progress

Semantic Search

Natural language search against the repository graph

Key Features

  • Code Indexing -- Tree-sitter based parsing for Go, Python, TypeScript, JavaScript, Java, Rust, and C++
  • Field Guides -- Cliff notes, learning paths, code tours, workflow stories, and system explanations at repository, file, and symbol levels
  • Requirement Tracing -- Import requirements from Markdown or CSV, auto-link to code, generate traceability matrices
  • Code Review -- AI-powered structured reviews (security, SOLID, performance, reliability, maintainability)
  • Code Discussion -- Conversational exploration with full codebase context
  • Architecture Diagrams -- Auto-generated Mermaid diagrams from code structure
  • Impact Analysis -- Simulate changes and see affected requirements and code paths
  • MCP Server -- Model Context Protocol support for AI agent integration (setup guide)
  • VS Code Extension -- First-class editor integration: inline requirement lenses, streaming AI chat (Cmd+I), create/edit requirements from the sidebar, one-keystroke field guides (install guide)
  • Multi-Provider LLM -- Works with cloud APIs (Anthropic, OpenAI, Gemini, OpenRouter) or fully local inference (Ollama, vLLM, llama.cpp, SGLang, LM Studio)
  • GraphQL API -- Full programmatic access to all platform capabilities
  • CLI -- Complete command-line interface for scripting and automation

Quick Start

Docker Hub (fastest)

No git clone needed. Just pull and run:

# Download the compose file
curl -O https://raw.githubusercontent.com/sourcebridge-ai/sourcebridge/main/docker-compose.hub.yml

# Start SourceBridge (uses Ollama by default — no API key required)
docker compose -f docker-compose.hub.yml up -d

Open http://localhost:3000 and create your admin account.

Security: set your secrets before exposing SourceBridge to a network.

The default compose file uses sentinel values (INSECURE-DEFAULT-CHANGE-ME-NOW) for the database password, gRPC shared secret, and JWT signing key. These are intentionally loud and self-identifying — the API server will emit a repeating warning until they are replaced.

Run the init script once before starting the stack:

# Generates .env with strong random values (chmod 0600)
curl -O https://raw.githubusercontent.com/sourcebridge-ai/sourcebridge/main/scripts/init-hub-secrets.sh
chmod +x init-hub-secrets.sh && ./init-hub-secrets.sh

# Then start the stack — it picks up .env automatically
docker compose -f docker-compose.hub.yml up -d

Environment variables the init script sets:

  • SURREAL_PASS -- SurrealDB admin password (the matching SURREAL_USER defaults to root)
  • SOURCEBRIDGE_GRPC_SECRET -- shared secret between the API server and the AI worker
  • SOURCEBRIDGE_JWT_SECRET -- JWT signing key for all user sessions

To rotate secrets later, run ./init-hub-secrets.sh --force. All active sessions will be invalidated.
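The resulting .env is just three assignments (a sketch -- the real values are long random strings generated by the script; placeholders shown here):

```shell
# Illustrative .env layout -- values elided; never commit this file
SURREAL_PASS=<random value>
SOURCEBRIDGE_GRPC_SECRET=<random value>
SOURCEBRIDGE_JWT_SECRET=<random value>
```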

Encryption key: auto-generated on first boot, persisted in a named volume.

The encryption-key-init service in docker-compose.hub.yml generates a unique 32-byte key on first boot and writes it to the sourcebridge-secrets named volume. Subsequent up calls reuse the existing key — no environment variable setup needed for encryption.

docker compose down -v deletes both data and the encryption key, which means all stored API keys (saved through /admin/llm) are unrecoverable without the key. Back up the sourcebridge-secrets volume before any -v teardown. See docs/admin/llm-config.md for the wipe-and-re-enter procedure when you need to start fresh.
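One way to archive the named volume before a -v teardown is a throwaway container -- a sketch (the volume name sourcebridge-secrets comes from the compose file; alpine is an arbitrary small image that ships tar):

```shell
# Archive the sourcebridge-secrets volume to ./sourcebridge-secrets-backup.tgz.
# Requires a running Docker daemon; defined as a function so it can be reused.
backup_secrets_volume() {
  docker run --rm \
    -v sourcebridge-secrets:/secrets:ro \
    -v "$PWD":/backup \
    alpine tar czf /backup/sourcebridge-secrets-backup.tgz -C /secrets .
}
```

To restore, reverse the mounts and extract the tarball into a fresh volume before the first up.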

Using a cloud LLM? Pass your provider config inline:

SOURCEBRIDGE_LLM_PROVIDER=anthropic \
SOURCEBRIDGE_LLM_API_KEY=sk-ant-... \
SOURCEBRIDGE_LLM_MODEL=claude-sonnet-4-20250514 \
docker compose -f docker-compose.hub.yml up -d
Full configuration examples

Anthropic (recommended for quality)

SOURCEBRIDGE_LLM_PROVIDER=anthropic \
SOURCEBRIDGE_LLM_API_KEY=sk-ant-api03-xxxxx \
SOURCEBRIDGE_LLM_MODEL=claude-sonnet-4-20250514 \
docker compose -f docker-compose.hub.yml up -d

OpenAI

SOURCEBRIDGE_LLM_PROVIDER=openai \
SOURCEBRIDGE_LLM_API_KEY=sk-xxxxx \
SOURCEBRIDGE_LLM_MODEL=gpt-4o \
docker compose -f docker-compose.hub.yml up -d

Ollama (local, free)

# Install Ollama first: https://ollama.com
ollama pull qwen3:32b

# No API key needed — Ollama is the default
docker compose -f docker-compose.hub.yml up -d

Quality gates: Living Wiki evaluates generated pages against tier-aware thresholds. qwen3:32b is classified as TierLocal automatically, which uses relaxed thresholds appropriate for open-weight models. No configuration required for the default install. To override the tier for a specific model, set it in Admin → Comprehension → Model Registry. See docs/admin/llm-config.md.

OpenRouter (access to 100+ models)

SOURCEBRIDGE_LLM_PROVIDER=openrouter \
SOURCEBRIDGE_LLM_API_KEY=sk-or-xxxxx \
SOURCEBRIDGE_LLM_MODEL=anthropic/claude-sonnet-4-20250514 \
docker compose -f docker-compose.hub.yml up -d

Production secrets

Run scripts/init-hub-secrets.sh (recommended) to generate a .env with strong random values for SURREAL_PASS, SOURCEBRIDGE_GRPC_SECRET, and SOURCEBRIDGE_JWT_SECRET. The compose file reads them automatically.

./scripts/init-hub-secrets.sh          # creates .env (chmod 0600)
docker compose -f docker-compose.hub.yml up -d

Or set them inline without a .env:

SURREAL_PASS=$(openssl rand -hex 32) \
SOURCEBRIDGE_JWT_SECRET=$(openssl rand -hex 32) \
SOURCEBRIDGE_GRPC_SECRET=$(openssl rand -hex 32) \
SOURCEBRIDGE_LLM_PROVIDER=anthropic \
SOURCEBRIDGE_LLM_API_KEY=sk-ant-api03-xxxxx \
SOURCEBRIDGE_LLM_MODEL=claude-sonnet-4-20250514 \
docker compose -f docker-compose.hub.yml up -d

Try the demo

SourceBridge includes a sample TypeScript project you can index immediately:

git clone https://github.com/sourcebridge-ai/sourcebridge.git
cd sourcebridge
./demo.sh

The demo starts SourceBridge, indexes a 44-file sample API, and generates cliff notes, code tours, and architecture diagrams. Open http://localhost:3000 to explore.

CLI — one line, any platform

curl -fsSL https://<your-sourcebridge-server>/install.sh | sh -s -- --server https://<your-sourcebridge-server>

Installs to ~/.local/bin/sourcebridge (no sudo), authenticates against your server, and you're ready to run sourcebridge setup claude in any indexed repository. See Installation for the trust model, alternate paths (brew install sourcebridge-ai/tap/sourcebridge, manual download, build from source), and upgrade/uninstall instructions.

Server — Docker compose

git clone https://github.com/sourcebridge-ai/sourcebridge.git
cd sourcebridge
cp .env.example .env   # configure your LLM provider
docker compose up -d

Helm / Kubernetes

For production deployments:

helm install sourcebridge deploy/helm/sourcebridge/ \
  --set llm.provider=anthropic \
  --set llm.apiKey=$ANTHROPIC_API_KEY

See Helm Guide for full configuration options, including air-gapped and local inference setups.

VS Code Extension

Use SourceBridge without leaving your editor. The extension is in plugins/vscode/ and talks to any SourceBridge server — local, Docker, Helm, or a shared team deployment.

Inline requirement lenses · streaming AI chat · sidebar CRUD · one-keystroke field guides

Install

From a pre-built VSIX:

# Build the VSIX from source
make package-vscode

# Install it into your local VS Code
make install-vscode

Or from inside VS Code: Cmd+Shift+P → Extensions: Install from VSIX… → pick plugins/vscode/sourcebridge-*.vsix.

Configure

  1. Cmd+Shift+P → SourceBridge: Sign In
  2. Enter your server URL (e.g. http://localhost:8080 for local dev, or your team's deployed URL)
  3. Pick sign-in method (browser OIDC or local password)

The status bar (bottom-left) reflects connection state: connected · <repo>, offline · retry in Ns, sign in required, etc. Click it for quick actions.

Highlights

  • Ask streaming -- Cmd+I on any selection: chat panel opens; tokens stream in live via MCP (fallback: non-streaming GraphQL if the server doesn't mount MCP)
  • Show linked requirements -- Cmd+. on a function: lightbulb menu lists linked requirements; click to open the detail panel
  • Create requirement -- Cmd+. on an unlinked symbol → Create requirement from this symbol…: the inline flow pre-fills the title from the symbol name and links the new requirement automatically
  • Edit / delete from sidebar -- hover a requirement row in the activity bar: pencil + trash icons; delete soft-deletes (30-day recycle bin)
  • Field guide for file -- Cmd+K N: generates cliff notes for the active file and opens the panel
  • Change Risk tree -- activity bar → Change Risk: shows changed files, affected requirements, and stale field guides from the latest impact report
  • Scoped palette -- Cmd+Shift+;: context-filtered picker that only shows actions valid for your current focus

Full feature list + troubleshooting in plugins/vscode/README.md.

Develop / contribute

cd plugins/vscode
npm install
npm run watch      # rebuilds on save
# Open the folder in VS Code, press F5 → "Run Extension" for a dev host

Tests: make test-vscode (or npm test from inside plugins/vscode/).

Architecture

                    ┌──────────────────────────────────┐
                    │           Clients                │
                    │   Web UI / CLI / MCP / GraphQL   │
                    └──────────────┬───────────────────┘
                                   │
                    ┌──────────────▼───────────────────┐
                    │        Go API Server             │
                    │   chi router + gqlgen GraphQL    │
                    │   JWT auth, OIDC SSO, REST       │
                    │   tree-sitter code indexer       │
                    └───────┬──────────────┬───────────┘
                            │              │
               ┌────────────▼──┐    ┌──────▼──────────┐
               │   SurrealDB   │    │  Python Worker  │
               │   (embedded   │    │  gRPC service   │
               │   or external)│    │  AI reasoning,  │
               └───────────────┘    │  linking,       │
                                    │  requirements,  │
               ┌───────────────┐    │  knowledge,     │
               │  Redis Cache  │    │  contracts      │
               │  (optional,   │    └──────┬──────────┘
               │  defaults to  │           │
               │  in-memory)   │    ┌──────▼──────────┐
               └───────────────┘    │  LLM Provider   │
                                    │  Cloud or Local │
                                    └─────────────────┘

Go API Server (internal/, cmd/) -- HTTP and GraphQL API, authentication, code indexing, and request routing. Handles tree-sitter parsing for 7 languages.

Python gRPC Worker (workers/) -- AI reasoning engine that communicates with LLM providers. Services include reasoning, linking, requirements analysis, knowledge extraction, and contract generation.

Next.js Web UI (web/) -- React 19, Tailwind CSS, CodeMirror 6 for code display, @xyflow/react for dependency graphs, recharts for metrics, Mermaid for architecture diagrams.

SurrealDB -- Primary data store. Runs embedded for single-node setups or connects to an external instance for production.

Redis -- Optional caching layer. Defaults to an in-memory cache when Redis is not configured.

Configuration

SourceBridge reads configuration from a TOML config file and environment variables. Environment variables use the SOURCEBRIDGE_ prefix and override file values.

See config.toml.example for a complete annotated example.
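Assuming the TOML sections mirror the SOURCEBRIDGE_<SECTION>_<KEY> environment-variable naming (an assumption -- config.toml.example is the authoritative reference), a minimal file might look like:

```toml
# Hypothetical sketch -- section and key names inferred from the
# environment variables below, not verified against the shipped example.
[llm]
provider = "anthropic"
model    = "claude-sonnet-4-20250514"

[server]
http_port = 8080
grpc_port = 50051

[storage]
surreal_mode = "embedded"
redis_mode   = "memory"
```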

Key Environment Variables

  • SOURCEBRIDGE_LLM_PROVIDER -- LLM provider name (default: ollama)
  • SOURCEBRIDGE_LLM_BASE_URL -- LLM API endpoint (default: provider-specific)
  • SOURCEBRIDGE_LLM_MODEL -- model name (default: provider-specific)
  • SOURCEBRIDGE_LLM_API_KEY -- API key for cloud providers (no default)
  • SOURCEBRIDGE_SERVER_HTTP_PORT -- API server port (default: 8080)
  • SOURCEBRIDGE_SERVER_GRPC_PORT -- gRPC port for worker communication (default: 50051)
  • SOURCEBRIDGE_STORAGE_SURREAL_MODE -- embedded or external (default: embedded)
  • SOURCEBRIDGE_STORAGE_SURREAL_URL -- SurrealDB connection URL (no default)
  • SOURCEBRIDGE_STORAGE_REDIS_MODE -- redis or memory (default: memory)
  • SOURCEBRIDGE_SECURITY_JWT_SECRET -- JWT signing secret (required for auth)
  • SOURCEBRIDGE_SECURITY_MODE -- security mode, oss or enterprise (default: oss)

OSS vs Enterprise tenancy

SourceBridge OSS uses single-tenant mode: all repositories are owned by the built-in tenant=default principal. Every authenticated user can access every repository — there is no per-user or per-team repo isolation.

This is the intended OSS posture for small teams and individual use.

Multi-user deployments that require repo isolation (i.e. user A cannot see user B's repositories) need the enterprise edition, which adds per-tenant RepoAccessMiddleware enforcement.

The API server emits a one-time oss_single_tenant_mode warning at boot as a reminder. This is informational, not an error — it confirms that the OSS posture is active and expected.

LLM Providers

SourceBridge supports both cloud-hosted and local inference providers. Configure per-operation models for cost optimization (e.g., a smaller model for summaries, a larger one for reviews).

Cloud Providers

  • Anthropic -- config value anthropic, key ANTHROPIC_API_KEY -- Claude Sonnet 4, Claude Haiku, etc.
  • OpenAI -- config value openai, key OPENAI_API_KEY -- GPT-4o, GPT-4o-mini, etc.
  • Google Gemini -- config value gemini, key GOOGLE_API_KEY -- Gemini 2.5 Pro, Flash, etc.
  • OpenRouter -- config value openrouter, key OPENROUTER_API_KEY -- any model on OpenRouter

Local Inference

  • Ollama (ollama) -- easiest local setup: pull a model and go
  • vLLM (vllm) -- high-throughput serving with PagedAttention
  • llama.cpp (llamacpp) -- CPU/GPU inference, GGUF models
  • SGLang (sglang) -- optimized serving with RadixAttention
  • LM Studio (lmstudio) -- desktop app with an OpenAI-compatible API

All local providers expose an OpenAI-compatible API. Set base_url to the local endpoint.
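For example, pointing SourceBridge at a vLLM server (a sketch: port 8000 is vLLM's default, and the model name is illustrative -- substitute whatever your server loads):

```shell
# Hypothetical vLLM configuration -- base_url targets the local
# OpenAI-compatible endpoint that vLLM exposes by default.
export SOURCEBRIDGE_LLM_PROVIDER=vllm
export SOURCEBRIDGE_LLM_BASE_URL=http://localhost:8000/v1
export SOURCEBRIDGE_LLM_MODEL=Qwen/Qwen2.5-Coder-32B-Instruct
```

Then start the stack as in the quick start (docker compose -f docker-compose.hub.yml up -d); the compose file picks up the exported variables.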

Using with Claude Code

After indexing a repository, generate a .claude/CLAUDE.md skill card that gives Claude Code a structured map of the codebase — per-subsystem sections, call-graph-derived warnings, and representative symbols — so the agent understands the architecture before refactoring:

sourcebridge setup claude --repo-id <id>

This writes three files into the repository:

  • .claude/CLAUDE.md — the skill card with ## Subsystem: sections derived from clustering data
  • .claude/sourcebridge.json — metadata for future refreshes (gitignored by default)
  • .mcp.json — Claude Code MCP server configuration so the agent can call SourceBridge tools directly

Re-run the command after re-indexing to refresh the skill card.

See Claude Code memory documentation for how Claude Code reads .claude/CLAUDE.md.

CLI Reference

  • sourcebridge serve -- start the API server
  • sourcebridge index <path> -- index a repository with tree-sitter
  • sourcebridge setup claude -- generate a .claude/CLAUDE.md skill card for Claude Code
  • sourcebridge import <file> -- import requirements from Markdown or CSV
  • sourcebridge trace <req-id> -- trace a requirement to linked code
  • sourcebridge review <path> -- run an AI-powered code review
  • sourcebridge ask <question> -- ask a question about the codebase

See CLI Reference for full flag documentation.

Development

Prerequisites

  • Go 1.25+
  • Python 3.12+ with uv
  • Node.js 22+
  • Git

Building from Source

# Clone the repository
git clone https://github.com/sourcebridge-ai/sourcebridge.git
cd sourcebridge

# Build the Go API server
make build-go

# Install Python worker dependencies
make build-worker

# Build the web UI
make build-web

# Or build everything at once
make build

Container Images

Pre-built images are published on every push to main (and on v* tags) to both registries below, with identical tags:

  • GitHub Container Registry -- ghcr.io/sourcebridge-ai/sourcebridge-{api,worker,web}
  • Docker Hub -- sourcebridge/sourcebridge-{api,worker,web}

Tag policy:

  • sha-<short> — the seven-character commit SHA of the build (e.g. sha-9d15856)
  • latest — moves with the latest main commit
  • vX.Y.Z and stable — set when a v* tag is pushed (no prerelease suffix)

For local image builds (mirroring the CI tag policy), see scripts/build-and-deploy.sh --help for usage.

Running Locally

SourceBridge runs three processes in development. The API server embeds SurrealDB by default, so you do not need a separate database container — just start the three processes below in their own terminals and you have a working stack.

# Terminal 1 — API server (builds first)
make dev

# Terminal 2 — Next.js web UI
make dev-web

# Terminal 3 — Python AI worker (required for agentic features,
# embeddings, and code review). Run after `make build-worker`.
make dev-worker

The worker can also be invoked directly via the registered console script: cd workers && uv run sourcebridge-worker. Both forms are equivalent; make dev-worker is the canonical entry point used throughout the docs.

Start the worker before (or shortly after) the API server. Agentic and embedding features activate when the worker is reachable on localhost:50051; without the worker the API server still serves indexing, browsing, and CRUD.
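A quick way to check whether anything is listening on the worker port -- a sketch using bash's /dev/tcp pseudo-device (the helper name worker_up is made up here; 50051 is the default from SOURCEBRIDGE_SERVER_GRPC_PORT):

```shell
# Succeeds when a TCP connection to the worker port can be opened.
# bash-only: /dev/tcp is a bash feature, not a real file.
worker_up() {
  local host=${1:-localhost} port=${2:-50051}
  (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null
}
```

This only proves the port is open, not that the gRPC service is healthy, but it is enough to tell "worker not started" apart from "worker misconfigured".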

Testing

# Run all tests (Go + web + worker)
make test

# Run linting (Go + web + worker)
make lint

# Run CI checks locally (lint + test)
make ci

See CONTRIBUTING.md for the full development workflow.

Deployment

Docker Compose

Best for evaluation and small teams. See Docker Compose quick start above.

Kubernetes with Helm

For production and multi-team deployments:

Migrating external uptime monitors

Prior to Phase 5, the paths /healthz and /readyz were exposed via the public Ingress. They have been removed from public routing — those paths still exist on the API pod (kubelet probes them directly via pod IP), but they are no longer reachable at https://<host>/healthz or https://<host>/readyz.

If you run an external uptime check against either of those paths, update it to:

https://<host>/api/health

This endpoint returns {"ok":true} with HTTP 200 and requires no authentication. It was added in the same release cycle and is the intended public health signal for external monitors, CDN origin checks, and load balancer health probes that operate outside the cluster.
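For a monitor that needs a pass/fail exit code rather than raw output, a minimal probe might look like this (a sketch; the helper name check_health is made up, while the endpoint path and response body are as documented above):

```shell
# Succeeds only when /api/health answers HTTP 200 with a body
# containing "ok":true; fails fast on timeout or non-2xx status.
check_health() {
  curl -fsS --max-time 5 "https://$1/api/health" | grep -q '"ok":true'
}
```

Usage: check_health your-host.example.com && echo up.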

Versioning

Every SourceBridge build carries a version string that's derived from git — no hand-editing, no manifest bumping. The same string is reported by the running API server, the Python worker, the Next.js web bundle, and the OCI image labels.

  • Tagged release (git tag v1.2.3) -- <tag>, e.g. v1.2.3
  • main between releases -- <tag>-dev.<N>+g<sha>, e.g. v0.9.0-rc.3-dev.216+g956607e
  • Pull-request build -- <tag>-pr<NUMBER>+g<sha>, e.g. v0.9.0-rc.3-pr147+g956607e
  • Local make build -- <tag>-local+g<sha>[.dirty], e.g. v0.9.0-rc.3-local+g956607e.dirty
  • No git context (source tarball) -- 0.0.0-unknown

The grammar is implemented by scripts/version.sh, which the Makefile, the Dockerfiles, and the GitHub Actions workflows all consume. To find your build:

  • Web UI: sidebar footer (always visible) or Admin → System status → Build info (full payload + copy-to-clipboard for support tickets).
  • HTTP: curl https://<your-server>/api/v1/version (no auth required).
  • CLI: sourcebridge --version.
  • Image: docker inspect <image> | jq '.[0].Config.Labels' — surfaces org.opencontainers.image.version and friends.

To cut a release: push a v* tag. The release workflow takes care of binary builds, image labels, the Homebrew tap formula, and the GitHub release page. There is no version file to bump by hand.

See docs/admin/build-info.md for the full build-info reference (debugging mismatches between the web bundle and the API server during a rolling deploy, etc.).

Documentation

Contributing

Contributions are welcome. See CONTRIBUTING.md for development setup, coding standards, and the pull request process.

First-time contributors must agree to the Contributor License Agreement before their PR can be merged.

License

SourceBridge is licensed under the GNU Affero General Public License v3.0.


Did it work?

If SourceBridge helped you understand a codebase, let us know.