
Fix ActiveRecord with_tool/with_tools resetting messages mid-conversation #689

Open
ebeigarts wants to merge 2 commits into crmne:main from ebeigarts:bugfix/fix-dynamic-tool-loading

Conversation

@ebeigarts

When a tool dynamically registers new tools during execution (e.g., a ToolSearch tool that calls chat.with_tool to make discovered tools available), the ActiveRecord with_tool/with_tools methods called to_llm, which reset all messages from the database. This inserted an extra empty assistant message between the tool_calls message and its tool result, violating the OpenAI API contract and raising a RubyLLM::BadRequestError:

An assistant message with 'tool_calls' must be followed by tool messages responding to each 'tool_call_id'. The following tool_call_ids did not have response messages: call_bcf3b9FOG3G3wxc2OW7Zg3CI (RubyLLM::BadRequestError)
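The contract the error describes can be made concrete with a small illustrative check (this is a sketch, not RubyLLM or OpenAI code): every tool_call id in an assistant message must be answered by a "tool" message before the next assistant turn begins.

```ruby
# Illustrative sketch of the OpenAI message-ordering contract from the error:
# each tool_call id must receive a "tool" response before the next assistant turn.
def unanswered_tool_call_ids(messages)
  pending  = []
  violated = []
  messages.each do |msg|
    case msg[:role]
    when "assistant"
      # A new assistant message while ids are still pending breaks the contract.
      violated.concat(pending)
      pending = (msg[:tool_calls] || []).map { |tc| tc[:id] }
    when "tool"
      pending.delete(msg[:tool_call_id])
    end
  end
  violated + pending
end

valid = [
  { role: "assistant", tool_calls: [{ id: "call_1" }] },
  { role: "tool", tool_call_id: "call_1", content: "result" }
]

# The bug inserted a stray empty assistant message between the tool_calls
# message and its tool result, leaving call_1 unanswered at that point.
corrupted = [
  { role: "assistant", tool_calls: [{ id: "call_1" }] },
  { role: "assistant", content: "" },
  { role: "tool", tool_call_id: "call_1", content: "result" }
]
```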

The solution is to add a reset_messages parameter to to_llm so that with_tool/with_tools can initialize the chat object without disrupting the in-flight message sequence.
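A minimal, self-contained sketch of that approach, using simplified stand-ins for the persisted chat record and the RubyLLM chat object (only the names to_llm, with_tool, and reset_messages come from this PR; everything else is illustrative):

```ruby
# Simplified stand-in for the RubyLLM chat object.
class FakeLlmChat
  attr_reader :messages, :tools

  def initialize
    @messages = []
    @tools = []
  end
end

# Simplified stand-in for the ActiveRecord-backed chat model.
class ChatRecord
  def initialize(persisted_messages)
    @persisted = persisted_messages
  end

  # Before the fix, with_tool always went through the resetting path,
  # clobbering the in-flight tool_calls/tool-result sequence.
  def to_llm(reset_messages: true)
    @chat ||= FakeLlmChat.new
    @chat.messages.replace(@persisted.dup) if reset_messages
    @chat
  end

  def with_tool(tool)
    to_llm(reset_messages: false).tools << tool
    self
  end
end

record = ChatRecord.new([{ role: "user", content: "find me a tool" }])
llm = record.to_llm                                  # loads persisted history
llm.messages << { role: "assistant", tool_calls: [{ id: "call_1" }] }
record.with_tool(:discovered_tool)                   # mid-execution registration
# llm.messages still holds both messages; the tool_calls turn survives.
```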

To reproduce the issue: https://gist.github.com/ebeigarts/dae372091c42756c3bdf1fa9e2d5f2ef

ebeigarts and others added 2 commits March 19, 2026 17:13
Verifies that calling chat.with_tool from within a tool execute method
(the ToolSearch pattern) does not corrupt the message sequence sent to
the provider. Before the fix, to_llm reset messages from the database
mid-conversation, inserting a stray empty assistant message between the
tool_calls and tool result.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Fix ActiveRecord with_tool/with_tools resetting messages mid-conversation

When a tool dynamically registers new tools during execution (e.g., a
ToolSearch tool that calls chat.with_tool to make discovered tools
available), the ActiveRecord with_tool/with_tools methods were calling
to_llm which reset all messages from the database. This inserted an
extra empty assistant message between the tool_calls message and its
tool result, violating the OpenAI API contract and causing a
BadRequestError.

Add a reset_messages parameter to to_llm so that with_tool/with_tools
can initialize the chat object without disrupting the in-flight message
sequence.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
-    to_llm.with_tools(...)
+    to_llm(reset_messages: false).with_tools(...)
     self
   end
ebeigarts (Author)

Maybe it would be better to just use this instead?

  def with_tool(...)
    (@chat || to_llm).with_tool(...)
    self
  end

  def with_tools(...)
    (@chat || to_llm).with_tools(...)
    self
  end

@jayelkaake
Contributor

Ran into this issue as well.

