fix(agentflow): preserve raw tool_calls metadata for OpenAI-compatible providers #6331
Conversation
Code Review
This pull request introduces logic to preserve raw OpenAI-compatible tool_calls within the additional_kwargs of LLM responses, ensuring that provider-specific fields like thought_signature are maintained for follow-up turns. Feedback was provided regarding code duplication, as the same logic was implemented in two separate locations; refactoring this into a private helper method was suggested to improve maintainability.
```typescript
// Ensure raw OpenAI-compatible tool_calls are preserved for providers
// that require provider-specific fields (e.g. thought_signature) on follow-up turns.
if (response.tool_calls?.length) {
    const existingRawToolCalls = (response.additional_kwargs as any)?.tool_calls
    const mergedRawToolCalls = response.tool_calls.map((toolCall: any) => {
        const existing = Array.isArray(existingRawToolCalls)
            ? existingRawToolCalls.find((raw: any) => raw?.id && toolCall?.id && raw.id === toolCall.id)
            : undefined
        if (existing) return existing
        return {
            id: toolCall.id,
            type: 'function',
            function: {
                name: toolCall.name,
                arguments: JSON.stringify(toolCall.args ?? {}),
                ...(toolCall?.thought_signature ? { thought_signature: toolCall.thought_signature } : {}),
                ...(toolCall?.thoughtSignature ? { thought_signature: toolCall.thoughtSignature } : {})
            }
        }
    })
    response.additional_kwargs = {
        ...(response.additional_kwargs ?? {}),
        tool_calls: mergedRawToolCalls
    }
}
```
The logic for preserving and rebuilding raw tool_calls metadata is duplicated here and in handleResumedToolCalls (lines 2637-2661). This repetition makes the code harder to maintain and increases the risk of bugs if the logic needs to be updated in the future.
Consider refactoring this into a private helper method within the Agent_Agentflow class to centralize the logic for OpenAI-compatible tool call preservation.
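The suggested refactor could look something like the following standalone sketch. The helper name, parameter types, and interfaces are assumptions for illustration, not code from the PR; in the actual codebase this would be a private method on `Agent_Agentflow` called from both sites:

```typescript
// Hypothetical helper sketch: rebuilds raw OpenAI-compatible tool_calls,
// preferring any existing raw entries so provider-specific fields survive.

interface ToolCall {
    id?: string
    name?: string
    args?: Record<string, unknown>
    thought_signature?: string
    thoughtSignature?: string
}

interface RawToolCall {
    id?: string
    type: 'function'
    function: {
        name?: string
        arguments: string
        thought_signature?: string
    }
}

function mergeRawToolCalls(toolCalls: ToolCall[], existingRaw?: unknown): RawToolCall[] {
    const existingList = Array.isArray(existingRaw) ? (existingRaw as RawToolCall[]) : []
    return toolCalls.map((toolCall) => {
        // Prefer an existing raw entry with a matching id: it already carries
        // provider-specific fields (e.g. thought_signature) verbatim.
        const existing = existingList.find((raw) => raw?.id && toolCall?.id && raw.id === toolCall.id)
        if (existing) return existing
        // Otherwise rebuild the raw shape, normalizing either signature key.
        const signature = toolCall.thought_signature ?? toolCall.thoughtSignature
        return {
            id: toolCall.id,
            type: 'function',
            function: {
                name: toolCall.name,
                arguments: JSON.stringify(toolCall.args ?? {}),
                ...(signature ? { thought_signature: signature } : {})
            }
        }
    })
}
```

Both call sites would then reduce to assigning `mergeRawToolCalls(response.tool_calls, (response.additional_kwargs as any)?.tool_calls)` into `response.additional_kwargs.tool_calls`, so any future change to the preservation rules happens in one place.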
Summary
Preserve raw OpenAI-compatible tool_calls metadata on assistant tool-call turns before re-invoking the model in AgentFlow.
This prevents provider-specific fields (for example thought_signature) from being dropped between tool-call round trips.
What changed
Why
Some OpenAI-compatible providers (notably Gemini-compatible gateways) require extra metadata in tool-call payloads for follow-up requests after tool execution. If this metadata is lost, the second request can fail.
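To illustrate, the follow-up request needs the assistant's earlier tool-call message replayed with its raw metadata intact. The values below are made up; the `thought_signature` placement matches the raw shape this PR rebuilds:

```json
{
  "role": "assistant",
  "tool_calls": [
    {
      "id": "call_abc123",
      "type": "function",
      "function": {
        "name": "get_weather",
        "arguments": "{\"city\":\"Paris\"}",
        "thought_signature": "sig-from-provider"
      }
    }
  ]
}
```

If the signature is stripped when tool calls are normalized into the framework's internal representation, the provider may reject the second request.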
Scope
Closes #6275