philbudden/synapse

Synapse

Synapse is a personal memory inbox you can talk to.

  • You write a thought.
  • Synapse stores it in a database.
  • Later you can search it by meaning (semantic search).

This repo runs everything in Docker except Ollama (embeddings), which you run separately.

What you’ll run

  • Ollama (on your machine or a remote host) — creates embeddings
  • Synapse API (Docker) — stores and searches memories
  • Postgres (Docker) — database
  • Synapse MCP server (Docker) — lets LLM clients use Synapse as tools
  • Matrix homeserver (Synapse) (Docker, optional) — so you can capture memories from Element X
  • Matrix capture bot (Docker, optional) — turns Matrix room messages into stored memories

Prerequisites

  1. Install Docker Desktop (or Docker Engine + Compose).
  2. Install Ollama.
  3. Ensure the embedding model exists:
ollama pull nomic-embed-text

To use a remote Ollama instance (example: 100.104.36.96), set OLLAMA_HOST in .env (steps below).


1) Start Synapse (API + DB + MCP)

Step 1 — Create your config

cp .env.example .env

Edit .env if needed:

  • Local Ollama (default on macOS/Windows Docker Desktop):
    • OLLAMA_HOST=host.docker.internal
    • OLLAMA_PORT=11434
  • Remote Ollama example:
    • OLLAMA_HOST=100.104.36.96
    • OLLAMA_PORT=11434
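Putting it together, a minimal .env for the remote-Ollama case might look like this (the IP is the example from above; point it at your own host):

```shell
# .env — embedding backend (remote Ollama example)
OLLAMA_HOST=100.104.36.96
OLLAMA_PORT=11434
```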

Step 2 — Start the stack

docker compose up -d --build

Step 3 — Verify it’s running

curl -fsS http://localhost:8000/health | cat
curl -fsS http://localhost:8080/health | cat

You should have:

  • API: http://localhost:8000
  • MCP: http://localhost:8080/mcp

Step 4 — Quick API test (optional)

Capture a memory:

curl -sS -X POST http://localhost:8000/capture \
  -H 'Content-Type: application/json' \
  -d '{"content":"buy better coffee beans"}' | cat

Search:

curl -sS 'http://localhost:8000/search?query=coffee&limit=5' | cat

Structured memory (authoritative JSON):

curl -sS 'http://localhost:8000/structured_memory/infrastructure' | cat

2) Capture memories using Matrix (Element X)

This repo can run a full Matrix homeserver (Synapse) in Docker.

Important note about Element X “Sign up”

Many Element X builds won’t create accounts on homeservers that use classic password login unless you add Matrix Authentication Service (MAS).

This setup is still usable:

  • Create users on the server (one-time command)
  • Then use Sign in in Element X

Step 1 — Start Matrix services

docker compose --profile matrix up -d --build

Matrix will be available locally at:

  • http://localhost:8008

Step 2 — Create your Matrix user

Replace alice and change-me with your own username and password:

docker compose --profile matrix exec -T matrix-synapse \
  register_new_matrix_user -c /data/homeserver.yaml \
  -u alice -p 'change-me' http://localhost:8008

Step 3 — Connect from Element X (recommended: HTTP for LAN/VPN)

On your phone (same home network / VPN / Tailscale):

  1. Open Element X
  2. Choose Sign in (not Sign up)
  3. Homeserver:
    • http://<your-mac-or-server-hostname>:8008
    • Example: http://mac-workstation:8008
  4. Username:
    • @alice:<MATRIX_SERVER_NAME> (default @alice:localhost)
  5. Password:
    • the one you set

Step 4 — Turn Matrix messages into stored memories (Matrix bot)

The matrix-bot service watches one room and stores every message as a memory.

4a) Create a bot user

docker compose --profile matrix exec -T matrix-synapse \
  register_new_matrix_user -c /data/homeserver.yaml \
  -u synapsebot -p 'bot-password' http://localhost:8008

4b) Get the bot access token

Run on your host:

curl -sS -X POST http://localhost:8008/_matrix/client/v3/login \
  -H 'Content-Type: application/json' \
  -d '{"type":"m.login.password","user":"synapsebot","password":"bot-password"}' | cat

If login fails, try the full Matrix ID as the user:

curl -sS -X POST http://localhost:8008/_matrix/client/v3/login \
  -H 'Content-Type: application/json' \
  -d '{"type":"m.login.password","user":"@synapsebot:localhost","password":"bot-password"}' | cat

Copy the access_token from the response.
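If you’d rather not copy the token by hand, you can extract it with python3. A sketch against a hypothetical sample response — in practice, pipe the output of the login curl command above into the same one-liner:

```shell
# Hypothetical login response; the real one comes from the curl login call above
response='{"user_id":"@synapsebot:localhost","access_token":"syt_example_token","device_id":"ABCDEF"}'
# Pull out just the access_token field
echo "$response" | python3 -c 'import sys,json; print(json.load(sys.stdin)["access_token"])'
# prints syt_example_token
```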

4c) Create a room (your “inbox”) and get its Room ID

  1. Create a room (e.g. “Synapse Inbox”)
  2. Turn OFF encryption for this room (create an unencrypted room).

Why: Element X often creates encrypted rooms by default. This minimal matrix-bot does not support decrypting E2EE messages, so it won’t see your text (and nothing will be stored).

Note: if you already created an encrypted room, you generally can’t disable encryption later — create a new unencrypted room for your Synapse inbox.

Element X doesn’t always show the raw Room ID in the UI. You can fetch it from the homeserver.

First, log in to the homeserver as your normal user (replace username/password):

curl -sS --max-time 10 -X POST http://localhost:8008/_matrix/client/v3/login \
  -H 'Content-Type: application/json' \
  -d '{"type":"m.login.password","user":"alice","password":"change-me"}'

From that JSON response, copy the access_token.

Now list the rooms you’ve joined — the returned values are Room IDs:

  • If you have jq installed:
curl -sS --max-time 10 http://localhost:8008/_matrix/client/v3/joined_rooms \
  -H "Authorization: Bearer <PASTE_ACCESS_TOKEN>" \
| jq -r '.joined_rooms[]'
  • No jq installed (uses Python):
curl -sS --max-time 10 http://localhost:8008/_matrix/client/v3/joined_rooms \
  -H "Authorization: Bearer <PASTE_ACCESS_TOKEN>" \
| python3 -c 'import sys,json; print("\n".join(json.load(sys.stdin)["joined_rooms"]))'

Tip: send a message in your “Synapse Inbox” room first, then run the command — it’ll be easier to identify the right room.

4d) Configure + start the bot

Set these in .env (replace the room ID and token):

MATRIX_HOMESERVER=http://matrix-synapse:8008
MATRIX_USER_ID=@synapsebot:localhost
MATRIX_ACCESS_TOKEN=PASTE_TOKEN_HERE
MATRIX_ROOM_ID=!yourRoomId:localhost

Restart the bot:

docker compose --profile matrix up -d --build matrix-bot

Note: the bot can only accept invites when it’s running and has MATRIX_USER_ID + MATRIX_ACCESS_TOKEN configured.

4e) Invite the bot to the room

In Element X, invite @synapsebot:<MATRIX_SERVER_NAME> to the room.

  • If you already invited it earlier: after the bot is configured and restarted, it should accept automatically within ~30 seconds.
  • If it stays stuck under “Invited”, check the bot logs:
docker compose --profile matrix logs --tail=200 matrix-bot

4f) Test

Post a message in the room. The bot should reply with the stored memory ID (and category).


3) Connect Synapse MCP to other LLM services

Synapse exposes MCP Streamable HTTP at:

  • On the same machine: http://localhost:8080/mcp
  • From another device on your home network / VPN / Tailscale: http://<your-host-ip>:8080/mcp

  • Example (LAN): http://192.168.1.50:8080/mcp
  • Example (Tailscale): http://100.x.y.z:8080/mcp

Notes:

  • The container publishes port 8080 on your host. If another device can’t reach it, check your firewall/router rules.
  • This project has no authentication by default. Only expose MCP to trusted networks (LAN/VPN), not the public internet.
  • If you’re calling MCP from a browser-based app, you may need to set MCP_ALLOWED_ORIGINS in .env.
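For example, if your browser-based app is served at http://192.168.1.50:3000 (a hypothetical address), the matching .env entry would be:

```shell
# Hypothetical origin — copy the exact scheme://host:port from the address bar
MCP_ALLOWED_ORIGINS=http://192.168.1.50:3000
```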

The MCP server provides tools for capture, search, and structured context:

  • capture_memory — store a memory
  • search_memories — semantic search
  • get_structured_memory — fetch authoritative JSON by key
  • get_context — combine structured memory + semantic search for prompt injection
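Under the hood, tool discovery and invocation are plain JSON-RPC 2.0 messages POSTed to the /mcp endpoint. A sketch of the request bodies a client sends (the ids and search arguments are illustrative):

```shell
# JSON-RPC 2.0 bodies an MCP client POSTs to /mcp (illustrative values)
list_req='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
call_req='{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"search_memories","arguments":{"query":"coffee","limit":5}}}'
# Sanity-check that both bodies parse as JSON
echo "$list_req" | python3 -m json.tool > /dev/null && \
echo "$call_req" | python3 -m json.tool > /dev/null && echo valid
```

Streamable HTTP clients typically also send an Accept header covering both application/json and text/event-stream when POSTing these bodies.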

Option A — Use an HTTP MCP client (recommended)

If your client supports HTTP MCP, point it at:

  • http://localhost:8080/mcp

OpenWebUI notes

Quick clarification: GET /mcp/openapi.json will not list MCP tools. MCP tools are discovered via JSON-RPC tools/list.

If you’re using OpenWebUI and it looks like the Synapse tool is never called:

  1. Make sure OpenWebUI can reach the URL (from another machine, use http://<your-host-ip>:8080/mcp).
  2. In the chat, ensure the Synapse tool is enabled/active for the conversation.
  3. Use a model/config that supports tool calling.
  4. If OpenWebUI is running in a browser, it will send an Origin header. If MCP replies 403 Origin not allowed, add your OpenWebUI URL (exactly as shown in the browser address bar, including the http:// or https:// scheme and the port) to MCP_ALLOWED_ORIGINS in .env, then restart the MCP service.
  5. Verify the MCP server is being invoked by watching logs:
docker compose logs -f --tail=200 mcp

When it’s working, you should see MCP methods like initialize, tools/list, and tools/call.

Add the OpenWebUI config
  1. Go to Admin Panel -> Settings -> Tools
  2. Under General/Manage Tool Servers, click Import
  3. Select tool-server-0.json, which is included in this repository

Option B — Claude Desktop (stdio)

Claude Desktop uses MCP over a local command. This repo ships a stdio entrypoint inside the MCP Docker image.

Example claude_desktop_config.json snippet:

{
  "mcpServers": {
    "synapse": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "-e",
        "API_BASE_URL=http://host.docker.internal:8000",
        "synapse-mcp",
        "python",
        "/app/stdio_server.py"
      ]
    }
  }
}

Notes:

  • Ensure the stack is running (docker compose up -d --build).
  • On Linux, replace host.docker.internal with your host IP/hostname.

Troubleshooting

API says Ollama is unavailable

  • Confirm Ollama is running.
  • Confirm the API container can reach it:
    • macOS/Windows: OLLAMA_HOST=host.docker.internal
    • remote: set OLLAMA_HOST=<ip/hostname>

Element X can’t connect

  • Prefer HTTP for local-only: http://<host>:8008
  • Make sure you used Sign in, and created the user with register_new_matrix_user.

Matrix bot is “idling”

This means it isn’t configured yet. Set:

  • MATRIX_USER_ID
  • MATRIX_ACCESS_TOKEN
  • MATRIX_ROOM_ID

Then restart the bot:

docker compose --profile matrix up -d matrix-bot

License

MIT (see LICENSE).

About

🧠 Local-first persistent AI memory: capture thoughts, generate embeddings with Ollama, store in Postgres (pgvector), and retrieve via semantic search.
