KOS2


KOS2 is a Knowledge Operating System for Obsidian. Open, local-first, built for excellence in boring work.

It helps you move from note entropy to useful work:

  • organise messy intake into stabilised draft artifacts with traceability
  • extract real next steps from projects and areas
  • draft decisions from evidence
  • review outcomes and close loops

If you want the mental model behind the product, read KOS Philosophy.

Why KOS2

Most AI note tools are good at answering questions and bad at helping you run an operating system for your own work.

KOS2 keeps the loop simple:

  • Ollama-first for local chat and local embeddings
  • fully local operation if you want your note work, embeddings, and decisions to stay on your machine
  • Privacy (local) Mode to keep chat, embeddings, and vault work on the local path by default
  • workflow paths for Organise, Next steps, Decision, and Review
  • semantic search for your vault when you choose to enable it
  • optional Ollama Cloud only for web search and web fetch flows

Fast Start

  1. Install KOS2 from source and enable it in Obsidian.
  2. Start local Ollama.
  3. Pull one chat model and one embedding model.
  4. Open Settings -> KOS2 -> Setup.
  5. Turn on Privacy (local) Mode if you want the default path to stay local.
  6. Open Knowledge, sync models, and choose a local embedding model.
  7. Use the KOS starter paths: Organise, Next steps, Decision, Review.

What KOS2 Does Today

  • local Ollama model discovery and sync
  • local chat with verified Ollama models
  • local embedding path for vault search
  • KOS starter surface in the chat UI
  • first-pass workflow commands for organise, next-steps, decision, and review
  • stronger intake: Organise extracts intake signals, ranks stable routes, and previews a stabilised draft artifact before any write
  • optional cloud web tooling through Ollama Cloud
  • transcript setup guidance for Supadata and local tooling preparation

Install

Right now the supported install path is from source.

1. Clone the repo

git clone https://github.com/pdurlej/KOS2.git
cd KOS2

2. Build the plugin

npm install
npm run build

3. Copy the plugin into your Obsidian vault

Replace /path/to/YourVault with your vault path:

mkdir -p "/path/to/YourVault/.obsidian/plugins/kos2"
cp main.js manifest.json styles.css "/path/to/YourVault/.obsidian/plugins/kos2/"

4. Enable it in Obsidian

  1. Open Settings -> Community plugins
  2. Turn off Restricted mode if needed
  3. Reload plugins or restart Obsidian
  4. Enable KOS2

First Run

KOS2 is best when you start with local Ollama.

Install and start Ollama

See ollama.com for the official installer.

If you use Homebrew on macOS:

brew install --cask ollama
open -a Ollama

Pull at least one chat model

Pick one to start:

ollama pull qwen3:8b

or:

ollama pull gemma3:12b

Pull one embedding model

ollama pull bge-m3

Then in KOS2

  1. Open Settings -> KOS2 -> Setup
  2. Confirm local Ollama is reachable
  3. Turn on Privacy (local) Mode if you want KOS2 to stay on the local path by default
  4. Use KOS2 Local Agent or pick a specific local chat model
  5. Open Knowledge and sync models
  6. Choose a local embedding model if you want semantic search
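If you are unsure whether local Ollama is reachable, a quick command-line probe against the default API port (11434) looks like this. The function name is just for illustration; `/api/tags` is the Ollama endpoint that lists installed models:

```shell
# Probe the local Ollama API; /api/tags lists installed models.
check_ollama() {
  if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "ollama reachable"
  else
    echo "ollama not reachable; is Ollama running?"
  fi
}

check_ollama
```

If this reports the server as unreachable, start the Ollama app (or `ollama serve`) before returning to the KOS2 setup screen.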

Recommended Local Models

These are practical starting points, not hard requirements:

  • Fast: smaller Qwen or Gemma models
  • Balanced: qwen3:8b
  • Best local quality: larger Gemma or Qwen models if your machine can handle them
  • Embeddings: bge-m3

KOS2 also surfaces recommendations inside the plugin based on what is actually installed locally and what your machine appears capable of running.

Privacy Model

KOS2 separates two paths clearly:

  • Ollama Local: chat, embeddings, and vault work that can stay on your machine
  • Ollama Cloud: optional helper path for web search and web fetch

If you care about keeping your most valuable context local, use:

  • Privacy (local) Mode
  • KOS2 Local Agent
  • local embeddings in Knowledge

If you want, KOS2 can run fully local:

  • local Ollama for chat
  • local Ollama for embeddings
  • no Ollama Cloud key configured
  • no transcript API configured
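One rough way to sanity-check the environment side of a fully local setup is to confirm no cloud key is exported in your shell. This only covers the shell environment; plugin-level settings still need to be checked inside Obsidian, and the function name is illustrative:

```shell
# Report whether a cloud key is present in the current shell
# environment; plugin settings are checked separately in Obsidian.
cloud_key_status() {
  if [ -n "$OLLAMA_API_KEY" ]; then
    echo "OLLAMA_API_KEY is set in this shell"
  else
    echo "no cloud key in this shell environment"
  fi
}

cloud_key_status
```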

Workflow Paths

KOS2 is most useful when you use it as an operator for a note workflow instead of a generic chat box.

Current paths:

  • Organise: inspect intake, rank stable routes, and preview a cleaner artifact draft
  • Next steps: extract pending work from a project, area, or note
  • Decision: draft a decision from evidence and analysis
  • Review: capture what happened, what changed, and what should happen next

Intake Maturity

KOS2 now does more than safe routing.

Today Organise can:

  • inspect tasks, bullets, and selected excerpts from the current note
  • recognise inbox, project, area, resource, analysis, decision, review, and outcome material
  • rank candidate routes for raw intake
  • preview a stabilised draft artifact with traceability
  • avoid silent writes while still giving you something concrete to promote

What it does not claim yet is full parity with the stronger intake contract from the separate kos repo, where the north star is:

drop material -> organise -> stable work artifact

KOS2 is moving in that direction, but the current plugin is still intentionally more conservative.

Optional Cloud And Transcripts

KOS2 can use Ollama Cloud for web search and web fetch. This is optional.

For YouTube transcripts, the current setup path is:

  • Supadata for transcript API access
  • or local preparation with yt-dlp and whisper

Transcript UX is present in the plugin, but this is still an evolving capability rather than a fully finished media pipeline.

Local Smoke Check

npm run smoke:ollama

This checks:

  • local Ollama model discovery
  • local chat inference
  • local embeddings
  • Ollama Cloud web search

The cloud key is read in this order:

  1. plugin setting
  2. OLLAMA_API_KEY
  3. macOS Keychain item cos2-ollama-cloud
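The lookup order above can be sketched as a small shell function. `resolve_cloud_key` and its argument are illustrative, not part of the plugin; the first non-empty source wins:

```shell
# First non-empty source wins: plugin setting, then the
# OLLAMA_API_KEY environment variable, then the macOS Keychain.
resolve_cloud_key() {
  setting="$1"   # value from the plugin setting, if any
  if [ -n "$setting" ]; then
    echo "$setting"
  elif [ -n "$OLLAMA_API_KEY" ]; then
    echo "$OLLAMA_API_KEY"
  else
    # security(1) is macOS-only; elsewhere this prints nothing
    security find-generic-password -s cos2-ollama-cloud -w 2>/dev/null
  fi
}
```

In practice this means an explicit plugin setting always overrides the environment variable, which in turn overrides the Keychain item.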

Planning And Design Docs

License And Attribution

KOS2 remains licensed under AGPL-3.0.

This project starts from logancyang/obsidian-copilot and keeps its AGPL obligations. KOS2 is a soft fork with a different product direction, not a claim that the upstream project authored this roadmap.
