feat: Add Anthropic Claude and LM Studio provider support #36
Conversation
- Add Anthropic as LLM provider with full async support
- Add LM Studio provider for local model inference
- Fix JSON response format compatibility for local models
- Update .env.example with configuration examples
- Update docstrings with all supported providers

Tested with:
- Claude Sonnet 4 (claude-sonnet-4-20250514)
- Claude Haiku 4.5 (claude-haiku-4-5-20251001)
- Qwen 30B via LM Studio
nicoloboschi
left a comment
Thanks for your contribution! I've left 2 small comments
Add configurable timeout support for LLM API calls:
- Environment variable override via HINDSIGHT_API_LLM_TIMEOUT
- Dynamic heuristic for lmstudio/ollama: 20 minutes for large models (30b, 33b, 34b, 65b, 70b, 72b, 8x7b, 8x22b), 5 minutes for others
- Pass timeout to Anthropic, OpenAI, and local model clients

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
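The timeout resolution described in this commit can be sketched roughly as follows. This is a minimal illustration, not the project's actual code; the function name and the 120s remote default are assumptions (the env var name and the size buckets come from the commit message):

```python
import os

# Model-size markers the commit lists as "large" for local providers.
LARGE_MODEL_MARKERS = ("30b", "33b", "34b", "65b", "70b", "72b", "8x7b", "8x22b")


def resolve_llm_timeout(provider: str, model: str) -> float:
    """Pick a request timeout in seconds for an LLM call.

    An explicit HINDSIGHT_API_LLM_TIMEOUT env var always wins. Otherwise,
    local providers (lmstudio/ollama) get 20 minutes for large models and
    5 minutes for smaller ones; other providers use a default (120s here,
    an assumption for illustration).
    """
    override = os.environ.get("HINDSIGHT_API_LLM_TIMEOUT")
    if override:
        return float(override)
    if provider in ("lmstudio", "ollama"):
        name = model.lower()
        if any(marker in name for marker in LARGE_MODEL_MARKERS):
            return 20 * 60.0  # large local models can be very slow
        return 5 * 60.0
    return 120.0
```

A later commit in this PR removes the size heuristic as too fragile, keeping only the env-var override with a fixed default.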
- Remove CLAUDE.md from .gitignore (it should stay in the repository)
- Pass max_completion_tokens to _call_anthropic instead of hardcoding 4096
Provides project context and development commands for AI-assisted coding.
- Add docker-compose.yml for local development
- Add test_internal.py for local testing
- Sync uv.lock and llm_wrapper.py changes
- Move LLM config to config.py with HINDSIGHT_API_ prefix
- Add HINDSIGHT_API_LLM_MAX_CONCURRENT (default: 32)
- Add HINDSIGHT_API_LLM_TIMEOUT (default: 120s)
- Remove fragile model-size timeout heuristic
- Apply markdown JSON extraction to all providers, not just local
- Fix Anthropic markdown extraction bug (missing split)
- Change LLM request/response logs from info to debug level
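The "markdown JSON extraction" this commit applies to all providers can be sketched like this. A minimal illustration under stated assumptions (the function name is hypothetical; the behavior of stripping an optional fenced code block before parsing is what the commit describes):

```python
import json


def extract_json(raw: str):
    """Parse JSON from an LLM response that may be fenced in markdown.

    Local models in particular often wrap structured output in a
    ```json ... ``` code fence even when asked for raw JSON; this
    normalizes both the fenced and unfenced cases.
    """
    text = raw.strip()
    if text.startswith("```"):
        # Drop the opening fence, then an optional "json" language tag.
        text = text.split("```", 1)[1]
        if text.lower().startswith("json"):
            text = text[4:]
        # Keep only what precedes the closing fence, if one is present.
        text = text.split("```", 1)[0]
    return json.loads(text.strip())
```

The "missing split" Anthropic bug mentioned above would correspond to skipping one of the `split("```", 1)` steps, leaving fence characters in the string passed to `json.loads`.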
Thanks for the review! I've addressed all the feedback. Testing note: Hindsight is working well with qwen/qwen3-vl-8b via LM Studio.
nicoloboschi
left a comment
Thanks! We're almost there!
- Remove test_internal.py (debug file)
- Remove docker-compose.yml (to be moved to the hindsight-cookbook repo)
Summary
Adds support for Anthropic Claude and LM Studio as LLM providers for Hindsight.
Changes
- Add Anthropic as LLM provider with full async support
- Add LM Studio provider for local model inference
- Fix JSON response format compatibility for local models
- Update .env.example with configuration examples for all providers

Tested With
- Claude Sonnet 4 (claude-sonnet-4-20250514)
- Claude Haiku 4.5 (claude-haiku-4-5-20251001)
- Qwen 30B via LM Studio

Example Configuration
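A minimal .env sketch for the two new providers. HINDSIGHT_API_LLM_TIMEOUT and HINDSIGHT_API_LLM_MAX_CONCURRENT (with their defaults) come from the commits in this PR; the provider, model, base-URL, and API-key variable names are assumptions for illustration:

```shell
# Anthropic (hosted); provider/model/key variable names are assumed
HINDSIGHT_API_LLM_PROVIDER=anthropic
HINDSIGHT_API_LLM_MODEL=claude-sonnet-4-20250514
ANTHROPIC_API_KEY=your-key-here

# LM Studio (local, OpenAI-compatible server); uncomment to use instead
# HINDSIGHT_API_LLM_PROVIDER=lmstudio
# HINDSIGHT_API_LLM_BASE_URL=http://localhost:1234/v1

# Shared tuning knobs added in this PR (values shown are the defaults)
HINDSIGHT_API_LLM_TIMEOUT=120
HINDSIGHT_API_LLM_MAX_CONCURRENT=32
```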
Files Changed
- hindsight-api/pyproject.toml - Added anthropic dependency
- hindsight-api/hindsight_api/engine/llm_wrapper.py - Anthropic + LM Studio implementation
- hindsight-api/hindsight_api/config.py - LM Studio default URL
- hindsight/hindsight/server.py - Updated docstrings
- .env.example - Configuration examples