
[fix] enable native OpenAI-style tool calling for local model providers#1

Merged
h2balloon merged 1 commit into main from fix/local-native-tool-calling on Apr 30, 2026
Conversation

@h2balloon
Owner

lol @ voideditor/void#966

why is this a pr, idk

Cherry-picked from voideditor/void#964 by octo-patch.

Adds specialToolFormat: 'openai-style' to OSS model capability entries so local providers (Ollama, vLLM, LM Studio, openAICompatible, liteLLM) send the native tools array instead of falling back to XML prompting. Fixes the "No Search/Replace blocks were received" class of edit/apply failures on qwen-coder, llama3.1+, devstral, codestral, and friends.
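A minimal sketch of what the capability-entry change looks like. Void's actual schema lives in its model-capabilities module; everything here other than the `specialToolFormat: 'openai-style'` field (and the model names from the description above) is an assumption for illustration, not the real interface.

```typescript
// Hedged sketch of the PR's change, NOT Void's actual schema.
// Only `specialToolFormat: 'openai-style'` is taken from the PR description;
// the interface shape and option names are illustrative assumptions.
type SpecialToolFormat = 'openai-style';

interface OSSModelCapabilities {
  // other capability fields omitted for brevity (assumed shape)
  specialToolFormat?: SpecialToolFormat;
}

// With the flag set, local providers (Ollama, vLLM, LM Studio, etc.) send the
// native `tools` array; without it, Void falls back to XML-in-prompt tooling,
// which weaker local models often fail to reproduce correctly.
const ossModelOptions: Record<string, OSSModelCapabilities> = {
  'qwen2.5-coder': { specialToolFormat: 'openai-style' },
  'llama3.1': { specialToolFormat: 'openai-style' },
  'devstral': { specialToolFormat: 'openai-style' },
  'codestral': { specialToolFormat: 'openai-style' },
};
```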


Co-Authored-By: octo-patch <noreply@github.com>
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
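To make the fix concrete, here is a sketch of what "native OpenAI-style tool calling" means on the wire. The request shape follows the OpenAI Chat Completions `tools` convention that these local providers emulate; the tool name, parameters, and prompt are invented for illustration and are not taken from the PR.

```typescript
// Sketch of an OpenAI-style chat request carrying a native `tools` array.
// The `edit_file` tool and its parameter schema are hypothetical examples.
const body = {
  model: 'qwen2.5-coder',
  messages: [{ role: 'user', content: 'Rename foo to bar in utils.ts' }],
  // With specialToolFormat: 'openai-style', tool definitions travel in this
  // structured array, and the model replies with structured tool_calls...
  tools: [
    {
      type: 'function',
      function: {
        name: 'edit_file',
        description: 'Apply a search/replace edit to a file',
        parameters: {
          type: 'object',
          properties: {
            uri: { type: 'string' },
            searchReplaceBlocks: { type: 'string' },
          },
          required: ['uri', 'searchReplaceBlocks'],
        },
      },
    },
  ],
};
// ...instead of the tool schema being flattened into the system prompt as XML
// that the model must echo back verbatim — the failure mode behind the
// "No Search/Replace blocks were received" errors.
```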
h2balloon merged this pull request into main on Apr 30, 2026
