
Conversation

@qandrew (Contributor) commented Dec 9, 2025

Purpose

This change enables streaming support for MCP tools when using GPT OSS. It extends the harmony utilities and response serving infrastructure to handle tool streaming, allowing tool calls and their results to be incrementally streamed back to clients rather than returned as a single batch.

Taken over from #30192; builds on top of #30054.

Test Plan

VLLM_GPT_OSS_SYSTEM_TOOL_MCP_LABELS=web_search_preview,container,code_interpreter CUDA_VISIBLE_DEVICES=2,3 \
  with-proxy vllm serve "openai/gpt-oss-120b" -tp 2 \
  --trust-remote-code \
  --tool-server=localhost:8081/container,localhost:8081/browser,localhost:8081/python

curl -X POST "http://localhost:8000/v1/responses" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer dummy-api-key" \
  -d '{
        "model": "openai/gpt-oss-120b",
        "input": "Multiply 64548*15151 using the python tool.",
        "tools": [
          {
            "type": "mcp",
            "server_label": "code_interpreter",
            "headers": {"test": "test"},
            "server_url": "IGNORED"
          }
        ], "stream": true
      }'
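
The same request can be issued from Python; a rough sketch, assuming the openai Python SDK is installed and pointed at the vLLM server started above:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="dummy-api-key")

stream = client.responses.create(
    model="openai/gpt-oss-120b",
    input="Multiply 64548*15151 using the python tool.",
    tools=[{
        "type": "mcp",
        "server_label": "code_interpreter",
        "headers": {"test": "test"},
        "server_url": "IGNORED",
    }],
    stream=True,
)

for event in stream:
    # MCP tool-call arguments arrive incrementally as *.delta events.
    if event.type == "response.mcp_call_arguments.delta":
        print("tool args delta:", event.delta)
    elif event.type == "response.output_text.delta":
        print(event.delta, end="", flush=True)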

Test Result

event: response.created
data: {"response":{"id":"resp_88dd18064b73b490","created_at":1765254912,"incomplete_details":null,"instructions":null,"metadata":null,"model":"openai/gpt-oss-120b","object":"response","output":[],"parallel_tool_calls":true,"temperature":1.0,"tool_choice":"auto","tools":[{"server_label":"code_interpreter","type":"mcp","allowed_tools":null,"authorization":null,"connector_id":null,"headers":{"test":"test"},"require_approval":null,"server_description":null,"server_url":"IGNORED"}],"top_p":1.0,"background":false,"max_output_tokens":130897,"max_tool_calls":null,"previous_response_id":null,"prompt":null,"reasoning":null,"service_tier":"auto","status":"in_progress","text":null,"top_logprobs":null,"truncation":"disabled","usage":null,"user":null,"input_messages":null,"output_messages":null},"sequence_number":0,"type":"response.created"}

event: response.in_progress
data: {"response":{"id":"resp_88dd18064b73b490","created_at":1765254912,"incomplete_details":null,"instructions":null,"metadata":null,"model":"openai/gpt-oss-120b","object":"response","output":[],"parallel_tool_calls":true,"temperature":1.0,"tool_choice":"auto","tools":[{"server_label":"code_interpreter","type":"mcp","allowed_tools":null,"authorization":null,"connector_id":null,"headers":{"test":"test"},"require_approval":null,"server_description":null,"server_url":"IGNORED"}],"top_p":1.0,"background":false,"max_output_tokens":130897,"max_tool_calls":null,"previous_response_id":null,"prompt":null,"reasoning":null,"service_tier":"auto","status":"in_progress","text":null,"top_logprobs":null,"truncation":"disabled","usage":null,"user":null,"input_messages":null,"output_messages":null},"sequence_number":1,"type":"response.in_progress"}

event: response.output_item.added
data: {"item":{"id":"msg_b57ec5160860fae4","summary":[],"type":"reasoning","content":null,"encrypted_content":null,"status":"in_progress"},"output_index":0,"sequence_number":2,"type":"response.output_item.added"}

event: response.reasoning_part.added
data: {"content_index":0,"item_id":"msg_b57ec5160860fae4","output_index":0,"part":{"text":"","type":"reasoning_text"},"sequence_number":3,"type":"response.reasoning_part.added"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":"We","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":4,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" need","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":5,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" to","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":6,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" compute","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":7,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" product","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":8,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":".","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":9,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" Use","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":10,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":" python","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":11,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":0,"delta":".","item_id":"msg_b57ec5160860fae4","output_index":0,"sequence_number":12,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.done
data: {"content_index":0,"item_id":"msg_b57ec5160860fae4","output_index":1,"sequence_number":13,"text":"We need to compute product. Use python.","type":"response.reasoning_text.done"}

event: response.reasoning_part.done
data: {"content_index":0,"item_id":"msg_b57ec5160860fae4","output_index":1,"part":{"text":"We need to compute product. Use python.","type":"reasoning_text"},"sequence_number":14,"type":"response.reasoning_part.done"}

event: response.output_item.done
data: {"item":{"id":"msg_b57ec5160860fae4","summary":[],"type":"reasoning","content":[{"text":"We need to compute product. Use python.","type":"reasoning_text"}],"encrypted_content":null,"status":"completed"},"output_index":1,"sequence_number":15,"type":"response.output_item.done"}

event: response.output_item.added
data: {"item":{"id":"mcp_aca9df8350bbd455","arguments":"","name":"python","server_label":"code_interpreter","type":"mcp_call","approval_request_id":null,"error":null,"output":null,"status":"in_progress"},"output_index":1,"sequence_number":16,"type":"response.output_item.added"}

event: response.mcp_call.in_progress
data: {"item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":17,"type":"response.mcp_call.in_progress"}

event: response.mcp_call_arguments.delta
data: {"delta":"645","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":18,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.delta
data: {"delta":"48","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":19,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.delta
data: {"delta":"*","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":20,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.delta
data: {"delta":"151","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":21,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.delta
data: {"delta":"51","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":22,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.delta
data: {"delta":"\n","item_id":"mcp_aca9df8350bbd455","output_index":1,"sequence_number":23,"type":"response.mcp_call_arguments.delta"}

event: response.mcp_call_arguments.done
data: {"arguments":"64548*15151\n","item_id":"mcp_aca9df8350bbd455","output_index":2,"sequence_number":24,"type":"response.mcp_call_arguments.done","name":"python"}

event: response.mcp_call.completed
data: {"item_id":"mcp_aca9df8350bbd455","output_index":2,"sequence_number":25,"type":"response.mcp_call.completed"}

event: response.output_item.done
data: {"item":{"id":"mcp_aca9df8350bbd455","arguments":"64548*15151\n","name":"python","server_label":"code_interpreter","type":"mcp_call","approval_request_id":null,"error":null,"output":null,"status":"completed"},"output_index":2,"sequence_number":26,"type":"response.output_item.done"}

event: response.output_item.added
data: {"item":{"id":"msg_ac5b93a10ca56637","summary":[],"type":"reasoning","content":null,"encrypted_content":null,"status":"in_progress"},"output_index":2,"sequence_number":27,"type":"response.output_item.added"}

event: response.reasoning_part.added
data: {"content_index":1,"item_id":"msg_ac5b93a10ca56637","output_index":2,"part":{"text":"","type":"reasoning_text"},"sequence_number":28,"type":"response.reasoning_part.added"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":"Result","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":29,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":" is","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":30,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":" ","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":31,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":"977","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":32,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":",","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":33,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":" ...","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":34,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":" let's","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":35,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":" see","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":36,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.delta
data: {"content_index":1,"delta":".","item_id":"msg_ac5b93a10ca56637","output_index":2,"sequence_number":37,"type":"response.reasoning_text.delta"}

event: response.reasoning_text.done
data: {"content_index":1,"item_id":"msg_ac5b93a10ca56637","output_index":3,"sequence_number":38,"text":"Result is 977, ... let's see.","type":"response.reasoning_text.done"}

event: response.reasoning_part.done
data: {"content_index":1,"item_id":"msg_ac5b93a10ca56637","output_index":3,"part":{"text":"Result is 977, ... let's see.","type":"reasoning_text"},"sequence_number":39,"type":"response.reasoning_part.done"}

event: response.output_item.done
data: {"item":{"id":"msg_ac5b93a10ca56637","summary":[],"type":"reasoning","content":[{"text":"Result is 977, ... let's see.","type":"reasoning_text"}],"encrypted_content":null,"status":"completed"},"output_index":3,"sequence_number":40,"type":"response.output_item.done"}

event: response.reasoning_text.done
data: {"content_index":1,"item_id":"msg_ac5b93a10ca56637","output_index":4,"sequence_number":41,"text":"","type":"response.reasoning_text.done"}

event: response.reasoning_part.done
data: {"content_index":1,"item_id":"msg_ac5b93a10ca56637","output_index":4,"part":{"text":"","type":"reasoning_text"},"sequence_number":42,"type":"response.reasoning_part.done"}

event: response.output_item.done
data: {"item":{"id":"msg_ac5b93a10ca56637","summary":[],"type":"reasoning","content":[{"text":"","type":"reasoning_text"}],"encrypted_content":null,"status":"completed"},"output_index":4,"sequence_number":43,"type":"response.output_item.done"}

event: response.output_item.added
data: {"item":{"id":"msg_852a42f88bd29b6e","content":[],"role":"assistant","status":"in_progress","type":"message"},"output_index":4,"sequence_number":44,"type":"response.output_item.added"}

event: response.content_part.added
data: {"content_index":2,"item_id":"msg_852a42f88bd29b6e","output_index":4,"part":{"annotations":[],"text":"","type":"output_text","logprobs":[]},"sequence_number":45,"type":"response.content_part.added"}

event: response.output_text.delta
data: {"content_index":2,"delta":"The","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":46,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" product","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":47,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" of","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":48,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" \\(","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":49,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"645","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":50,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"48","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":51,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" \\","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":52,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"times","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":53,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" ","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":54,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"151","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":55,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"51","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":56,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"\\","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":57,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":")","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":58,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" is","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":59,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":" **","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":60,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"977","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":61,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":",","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":62,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"204","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":63,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":",","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":64,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"548","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":65,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":"**","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":66,"type":"response.output_text.delta"}

event: response.output_text.delta
data: {"content_index":2,"delta":".","item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":4,"sequence_number":67,"type":"response.output_text.delta"}

event: response.output_text.done
data: {"content_index":2,"item_id":"msg_852a42f88bd29b6e","logprobs":[],"output_index":5,"sequence_number":68,"text":"The product of \\(64548 \\times 15151\\) is **977,204,548**.","type":"response.output_text.done"}

event: response.content_part.done
data: {"content_index":2,"item_id":"msg_852a42f88bd29b6e","output_index":5,"part":{"annotations":[],"text":"The product of \\(64548 \\times 15151\\) is **977,204,548**.","type":"output_text","logprobs":null},"sequence_number":69,"type":"response.content_part.done"}

event: response.output_item.done
data: {"item":{"id":"msg_852a42f88bd29b6e","content":[{"annotations":[],"text":"The product of \\(64548 \\times 15151\\) is **977,204,548**.","type":"output_text","logprobs":null}],"role":"assistant","status":"completed","type":"message"},"output_index":5,"sequence_number":70,"type":"response.output_item.done"}

event: response.completed
data: {"response":{"id":"resp_88dd18064b73b490","created_at":1765254912,"incomplete_details":null,"instructions":null,"metadata":null,"model":"openai/gpt-oss-120b","object":"response","output":[{"id":"rs_ace61c7b30b0b55a","summary":[],"type":"reasoning","content":[{"text":"We need to compute product. Use python.","type":"reasoning_text"}],"encrypted_content":null,"status":null},{"id":"rs_9fef8f5c5e03f239","summary":[],"type":"reasoning","content":[{"text":"64548*15151\n","type":"reasoning_text"}],"encrypted_content":null,"status":null},{"id":"rs_8e07af7206be9171","summary":[],"type":"reasoning","content":[{"text":"64548*15151\n","type":"reasoning_text"}],"encrypted_content":null,"status":null},{"id":"rs_8745bd7272ca28c4","summary":[],"type":"reasoning","content":[{"text":"Result is 977, ... let's see.","type":"reasoning_text"}],"encrypted_content":null,"status":null},{"id":"rs_b7237a3f58bdaf4e","summary":[],"type":"reasoning","content":[{"text":"","type":"reasoning_text"}],"encrypted_content":null,"status":null},{"id":"msg_a403e7e8524fd507","content":[{"annotations":[],"text":"The product of \\(64548 \\times 15151\\) is **977,204,548**.","type":"output_text","logprobs":null}],"role":"assistant","status":"completed","type":"message"}],"parallel_tool_calls":true,"temperature":1.0,"tool_choice":"auto","tools":[{"server_label":"code_interpreter","type":"mcp","allowed_tools":null,"authorization":null,"connector_id":null,"headers":{"test":"test"},"require_approval":null,"server_description":null,"server_url":"IGNORED"}],"top_p":1.0,"background":false,"max_output_tokens":130866,"max_tool_calls":null,"previous_response_id":null,"prompt":null,"reasoning":null,"service_tier":"auto","status":"completed","text":null,"top_logprobs":null,"truncation":"disabled","usage":{"input_tokens":381,"input_tokens_details":{"cached_tokens":352,"input_tokens_per_turn":[175,206],"cached_tokens_per_turn":[160,192]},"output_tokens":76,"output_tokens_details":{"reasoning_tokens":28,"tool_output_tokens":2,"output_tokens_per_turn":[29,47],"tool_output_tokens_per_turn":[0,2]},"total_tokens":457},"user":null,"input_messages":null,"output_messages":null},"sequence_number":71,"type":"response.completed"}

@gemini-code-assist bot left a comment


Code Review

This pull request introduces support for streaming events related to Model Context Protocol (MCP) tools within the OpenAI Responses API. The changes add new MCP-specific event types and modify the Harmony streaming event processing to correctly handle and differentiate between traditional function calls, legacy code-interpreter calls, and the new MCP tool calls. New tests verify the streaming event sequence for MCP tool calls and ensure multi-turn conversations with MCP tools work as expected. Review comments highlight issues in the new tests, specifically that assertions for event types are too broad or incorrect for MCP events, and question the logic for handling delta events during streaming. Additionally, a change in message processing for multi-turn conversations, which now includes 'analysis' channel messages, requires confirmation of its intended behavioral impact.


stack_of_event_types = []
async for event in stream_response:
    assert "mcp_call" in event.type

high

This assertion assert "mcp_call" in event.type is too broad. Not all event types in the stream will contain "mcp_call" (e.g., response.created, response.completed, response.reasoning_text.delta). This assertion will likely cause the test to fail for valid non-MCP events. Consider making this assertion more specific to actual MCP-related event types or removing it if it's not strictly necessary for all events.
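
A possible narrowing, sketched as it might appear in the test's async body (event-type names taken from the stream shown in the test result above):

mcp_event_types = set()

async for event in stream_response:
    # Record only the MCP-specific events instead of asserting on every event.
    if event.type.startswith("response.mcp_call"):
        mcp_event_types.add(event.type)

# The MCP call should go through its full streamed lifecycle.
assert "response.mcp_call.in_progress" in mcp_event_types
assert "response.mcp_call_arguments.delta" in mcp_event_types
assert "response.mcp_call_arguments.done" in mcp_event_types
assert "response.mcp_call.completed" in mcp_event_types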

Comment on lines +257 to +260
elif event.type.endswith("delta"):
    if stack_of_event_types[-1] == event.type:
        continue
    stack_of_event_types.append(event.type)

high

The logic for handling delta events might be flawed. If multiple delta events of the same type occur consecutively, only the first one is pushed to the stack, and subsequent ones are skipped (continue). However, pairs_of_event_types maps delta to done events, implying that each done event should correspond to a delta event. If intermediate delta events are skipped, the final done event might pop an incorrect delta event from the stack, leading to an assertion failure or an incorrect len(stack_of_event_types) == 0 at the end. Consider pushing all delta events or refining the stack logic to correctly match done events with their corresponding delta streams.
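
One way to make the pairing explicit, sketched against a reduced pairs_of_event_types (the PR's test presumably defines a fuller mapping):

pairs_of_event_types = {
    "response.reasoning_text.delta": "response.reasoning_text.done",
    "response.mcp_call_arguments.delta": "response.mcp_call_arguments.done",
    "response.output_text.delta": "response.output_text.done",
}
done_to_delta = {done: delta for delta, done in pairs_of_event_types.items()}

stack_of_event_types = []
async for event in stream_response:
    if event.type.endswith("delta"):
        # Collapse a run of identical deltas into a single stack entry.
        if not stack_of_event_types or stack_of_event_types[-1] != event.type:
            stack_of_event_types.append(event.type)
    elif event.type in done_to_delta:
        # Each tracked *.done closes the most recent *.delta run.
        assert stack_of_event_types, f"unmatched done event: {event.type}"
        assert stack_of_event_types.pop() == done_to_delta[event.type]

assert len(stack_of_event_types) == 0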

code_interpreter_events_seen = False

async for event in stream_response:
    if "code_interpreter" in event.type:

high

The assertion assert "code_interpreter" in event.type is incorrect. The event types for MCP tools are like response.mcp_call.in_progress, response.mcp_call_arguments.delta, etc., which do not contain "code_interpreter". This assertion will always be false for the new MCP event types.

Comment on lines +931 to +933
assert not code_interpreter_events_seen, (
    "Should not see code_interpreter events when using MCP type"
)

high

Given the previous assertion on line 891, this assertion assert not code_interpreter_events_seen seems contradictory or indicates a misunderstanding of the event types. If the intention is to ensure that legacy code_interpreter events are not seen when using the new mcp type, the check on line 891 should be removed or modified to specifically look for legacy event types, and code_interpreter_events_seen should only be set if a legacy event is encountered. As it stands, the test logic is confusing.
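
One way to restructure the check, as a sketch (event prefixes based on these review comments: legacy events carry code_interpreter_call, the new ones carry mcp_call):

code_interpreter_events_seen = False
mcp_events_seen = False

async for event in stream_response:
    if "code_interpreter_call" in event.type:  # legacy built-in tool events
        code_interpreter_events_seen = True
    if "mcp_call" in event.type:               # new MCP streaming events
        mcp_events_seen = True

assert mcp_events_seen, "Expected mcp_call events when using MCP type"
assert not code_interpreter_events_seen, (
    "Should not see code_interpreter events when using MCP type"
)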

Comment on lines 1055 to +1057
 for msg in recent_turn_msgs:
     assert isinstance(msg, OpenAIHarmonyMessage)
-    if msg.channel != "analysis":
-        prev_msgs.append(msg)
+    prev_msgs.append(msg)

high

The removal of the if msg.channel != "analysis": condition means that messages on the "analysis" channel are now included in prev_msgs for subsequent turns. Previously, these messages were filtered out. If "analysis" channel messages are internal reasoning or tool outputs that should not directly influence the model's next turn, this change could lead to unintended behavioral shifts in multi-turn conversations. Please confirm if this is the desired behavior and if including "analysis" messages in the conversational context is appropriate.

@chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines +1854 to +1858
if not sent_output_item_added:
    sent_output_item_added = True
    current_item_id = f"tool_{random_uuid()}"
    yield _increment_sequence_number_and_return(
        ResponseOutputItemAddedEvent(


P1: Function tools now emit code-interpreter stream events

Function tool calls now satisfy the broad commentary/analysis branch and fall through to the non-MCP path, which always yields code_interpreter_call delta/progress events even when the recipient namespace is functions.*. Because _is_mcp_tool_by_namespace returns False for functions.*, streaming a normal function call will generate an in-progress code interpreter item before the actual function-call events are emitted, causing clients to see the wrong tool type and duplicated tool call completions for every function invocation.
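
A minimal sketch of the dispatch this comment is describing; the helper signature below is a hypothetical stand-in for illustration, not the PR's actual code:

def streaming_event_family(recipient: str, mcp_namespaces: set[str]) -> str:
    # Hypothetical dispatcher: which family of streaming events should this
    # tool call emit, given the recipient namespace on the harmony message?
    if recipient.startswith("functions."):
        return "function_call"        # plain function tools, never code_interpreter_call
    namespace = recipient.split(".", 1)[0]
    if namespace in mcp_namespaces:
        return "mcp_call"             # new MCP streaming path
    return "code_interpreter_call"    # legacy built-in tool path

# Example: a functions.* recipient should never hit the legacy path.
assert streaming_event_family("functions.get_weather", {"python", "browser", "container"}) == "function_call"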


@mergify bot commented Dec 9, 2025

Hi @qandrew, the pre-commit checks have failed. Please run:

uv pip install pre-commit
pre-commit install
pre-commit run --all-files

Then, commit the changes and push to your branch.

For future commits, pre-commit will run automatically on changed files before each commit.

Tip

Is mypy or markdownlint failing?
mypy and markdownlint are run differently in CI. If the failure is related to either of these checks, please use the following commands to run them locally:
# For mypy (substitute "3.10" with the failing version if needed)
pre-commit run --hook-stage manual mypy-3.10
# For markdownlint
pre-commit run --hook-stage manual markdownlint

1 similar comment

Signed-off-by: Andrew Xia <axia@meta.com>

Labels

frontend, gpt-oss (Related to GPT-OSS models)

Projects

Status: To Triage
