[Fix] enable native OpenAI-style tool calling for local model providers#966

Closed
h2balloon wants to merge 1 commit into voideditor:main from h2balloon:fix/local-native-tool-calling

Conversation

@h2balloon h2balloon commented Apr 30, 2026

big womp

Cherry-picked from voideditor/void#964 by octo-patch.

Adds `specialToolFormat: 'openai-style'` to the OSS model capability entries so that local providers (Ollama, vLLM, LM Studio, openAICompatible, liteLLM) send the native `tools` array instead of falling back to XML-based tool prompting. Fixes the "No Search/Replace blocks were received" class of edit/apply failures on qwen-coder, llama3.1+, devstral, codestral, and similar models.

Co-Authored-By: octo-patch <noreply@github.com>
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
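For readers unfamiliar with the change, a minimal sketch of what the capability entry and the resulting dispatch decision might look like. Only the `specialToolFormat: 'openai-style'` field is described by this PR; the other field names, the `ModelCapabilities` type, and the helper function are illustrative assumptions, not code from the Void repository:

```typescript
// Hypothetical capability-entry shape. Only `specialToolFormat` is the
// field this PR adds; `contextWindow` and `supportsSystemMessage` are
// assumed placeholders for whatever else an entry carries.
type SpecialToolFormat = 'openai-style';

interface ModelCapabilities {
  contextWindow: number;
  supportsSystemMessage: boolean;
  specialToolFormat?: SpecialToolFormat;
}

// Example entry for a local model, as the PR description suggests:
const qwenCoderCaps: ModelCapabilities = {
  contextWindow: 32_768,
  supportsSystemMessage: true,
  specialToolFormat: 'openai-style',
};

// An entry without the field would keep the XML-prompting fallback:
const legacyCaps: ModelCapabilities = {
  contextWindow: 8_192,
  supportsSystemMessage: true,
};

// Sketch of the dispatch decision: with 'openai-style' set, the request
// carries a native OpenAI-format `tools` array; otherwise the tool
// descriptions are injected into the prompt as XML.
function usesNativeTools(caps: ModelCapabilities): boolean {
  return caps.specialToolFormat === 'openai-style';
}
```

The point of the flag is that the decision is made per capability entry, so each local provider/model pair can opt into native tool calling without changing the shared prompting path.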
