
fix(agentflow): preserve raw tool_calls metadata for OpenAI-compatible providers#6331

Open
fengfeng-zi wants to merge 1 commit into FlowiseAI:main from fengfeng-zi:bugfix/openai-custom-model-thought-signature-6275

Conversation

@fengfeng-zi

Summary

Preserve raw OpenAI-compatible tool_calls metadata on assistant tool-call turns before re-invoking the model in AgentFlow.

This prevents provider-specific fields (for example thought_signature) from being dropped between tool-call round trips.

What changed

  • In both AgentFlow tool-call handling paths, before pushing the assistant tool-call message back into messages, rebuild additional_kwargs.tool_calls by:
    • reusing existing raw tool call entries when available
    • otherwise synthesizing OpenAI-compatible raw entries from parsed tool_calls
  • Preserve provider extensions such as thought_signature/thoughtSignature when present.

Why

Some OpenAI-compatible providers (notably Gemini-compatible gateways) require extra metadata in tool-call payloads for follow-up requests after tool execution. If this metadata is lost, the second request can fail.
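Concretely, a raw assistant tool-call entry from such a gateway carries the provider extension alongside the standard OpenAI shape. The sketch below is illustrative only: the field names follow this PR, but the id, function name, and signature values are made up.

```typescript
// Illustrative raw tool_call entry as an OpenAI-compatible provider might
// return it. thought_signature is the provider-specific extension that must
// survive into the follow-up request after tool execution (values invented).
const rawToolCall = {
    id: 'call_abc123',
    type: 'function' as const,
    function: {
        name: 'get_weather',
        arguments: JSON.stringify({ city: 'Paris' }),
        thought_signature: 'opaque-provider-token'
    }
}
```

If `thought_signature` is stripped when the assistant turn is replayed, the provider can reject the second request; preserving the raw entry verbatim avoids that.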

Scope

  • No behavior change for providers that do not need extra tool-call metadata.
  • No changes to tool execution logic.

Closes #6275


@gemini-code-assist (Bot) left a comment


Code Review

This pull request introduces logic to preserve raw OpenAI-compatible tool_calls within the additional_kwargs of LLM responses, ensuring that provider-specific fields like thought_signature are maintained for follow-up turns. Feedback was provided regarding code duplication, as the same logic was implemented in two separate locations; refactoring this into a private helper method was suggested to improve maintainability.

Comment on lines +2230 to +2254
```typescript
// Ensure raw OpenAI-compatible tool_calls are preserved for providers
// that require provider-specific fields (e.g. thought_signature) on follow-up turns.
if (response.tool_calls?.length) {
    const existingRawToolCalls = (response.additional_kwargs as any)?.tool_calls
    const mergedRawToolCalls = response.tool_calls.map((toolCall: any) => {
        const existing = Array.isArray(existingRawToolCalls)
            ? existingRawToolCalls.find((raw: any) => raw?.id && toolCall?.id && raw.id === toolCall.id)
            : undefined
        if (existing) return existing
        return {
            id: toolCall.id,
            type: 'function',
            function: {
                name: toolCall.name,
                arguments: JSON.stringify(toolCall.args ?? {}),
                ...(toolCall?.thought_signature ? { thought_signature: toolCall.thought_signature } : {}),
                ...(toolCall?.thoughtSignature ? { thought_signature: toolCall.thoughtSignature } : {})
            }
        }
    })
    response.additional_kwargs = {
        ...(response.additional_kwargs ?? {}),
        tool_calls: mergedRawToolCalls
    }
}
```


Severity: medium

The logic for preserving and rebuilding raw tool_calls metadata is duplicated here and in handleResumedToolCalls (lines 2637-2661). This repetition makes the code harder to maintain and increases the risk of bugs if the logic needs to be updated in the future.

Consider refactoring this into a private helper method within the Agent_Agentflow class to centralize the logic for OpenAI-compatible tool call preservation.
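Such a shared helper might look like the sketch below. All names here (preserveRawToolCalls, ParsedToolCall, RawToolCall) are hypothetical and not taken from the Flowise codebase; the logic mirrors the duplicated block in this PR under those assumptions.

```typescript
// Hypothetical shared helper for the duplicated merge logic; type and
// function names are illustrative, not from the Flowise codebase.
interface ParsedToolCall {
    id: string
    name: string
    args?: Record<string, unknown>
    thought_signature?: string
    thoughtSignature?: string
}

interface RawToolCall {
    id: string
    type: 'function'
    function: { name: string; arguments: string; thought_signature?: string }
}

function preserveRawToolCalls(parsed: ParsedToolCall[], existingRaw?: unknown): RawToolCall[] {
    const existing: RawToolCall[] = Array.isArray(existingRaw) ? existingRaw : []
    return parsed.map((toolCall) => {
        // Reuse the provider's raw entry when one with a matching id exists.
        const match = existing.find((raw) => raw?.id && toolCall.id && raw.id === toolCall.id)
        if (match) return match
        // Otherwise synthesize an OpenAI-compatible entry, carrying over the
        // provider extension under either casing.
        const signature = toolCall.thought_signature ?? toolCall.thoughtSignature
        return {
            id: toolCall.id,
            type: 'function',
            function: {
                name: toolCall.name,
                arguments: JSON.stringify(toolCall.args ?? {}),
                ...(signature ? { thought_signature: signature } : {})
            }
        }
    })
}
```

Both call sites could then reduce to `response.additional_kwargs = { ...(response.additional_kwargs ?? {}), tool_calls: preserveRawToolCalls(response.tool_calls, (response.additional_kwargs as any)?.tool_calls) }`, keeping the preservation rules in one place.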


Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

OpenAI Custom Model node fails with Gemini/OpenAI-compatible providers due to thought_signature tool-call incompatibility

1 participant