
Python: [Bug]: Content._add_text_reasoning_content drops id during coalescing, breaking encrypted reasoning round-trip #4852

@lukasz-cledar

Description

What happened?
When using OpenAIResponsesClient in streaming mode with reasoning enabled, the id field (the rs_* identifier) is lost from text_reasoning Content objects after finalize_response coalesces them. The coalesced Content has id=None even though every input Content object carried the correct rs_* ID.

This breaks the reasoning.encrypted_content round-trip: without the id, the serializer (_prepare_content_for_openai) cannot include it in the reasoning input item, and the API cannot match the encrypted blob to cached reasoning tokens. The encrypted content becomes unusable.
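For context, the reasoning input item that _prepare_content_for_openai needs to emit looks roughly like the sketch below (field names follow the OpenAI Responses API; the values are placeholders). Without the id, the API has nothing to associate the encrypted blob with.

```python
# Approximate shape of the reasoning input item the serializer should emit.
# Field names come from the OpenAI Responses API; values are placeholders.
reasoning_item = {
    "type": "reasoning",
    "id": "rs_abc123",               # the rs_* identifier lost in coalescing
    "encrypted_content": "gAAAA...",  # opaque encrypted reasoning tokens
    "summary": [],
}

# With id=None, the item cannot be matched to server-side reasoning state.
assert reasoning_item["id"] is not None
```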

What did you expect to happen?
The id should be preserved through coalescing. The merged text_reasoning Content should retain the rs_* ID from the input Contents so that _prepare_content_for_openai (line ~1099) can include it in the serialized reasoning item alongside encrypted_content.

Steps to reproduce issue:

  1. Create an OpenAIResponsesClient and an Agent with InMemoryHistoryProvider
  2. Configure options with store=False, reasoning={"effort": "low", "summary": "auto"}, and include=["reasoning.encrypted_content"]
  3. Send a prompt to a reasoning model (e.g. gpt-5) using the streaming path (agent.run(message, stream=True))
  4. After the stream completes, call stream.get_final_response()
  5. Inspect text_reasoning Content objects in the response messages
  6. Observe that content.id is None, despite additional_properties["encrypted_content"] being correctly populated

Code Sample

import asyncio

from agent_framework import Agent, AgentSession, InMemoryHistoryProvider
from agent_framework.openai import OpenAIResponsesClient


async def main() -> None:
    client = OpenAIResponsesClient(api_key="...")
    agent = Agent(
        client=client,
        name="test",
        instructions="You are a helpful assistant.",
        default_options={
            "model": "gpt-5",
            "store": False,
            "reasoning": {"effort": "low", "summary": "auto"},
            "include": ["reasoning.encrypted_content"],
        },
        context_providers=[InMemoryHistoryProvider()],
    )
    session = AgentSession()

    # Streaming call
    stream = agent.run("What is 2+2?", session=session, stream=True)
    async for _ in stream:
        pass
    response = await stream.get_final_response()

    # Check reasoning contents after coalescing
    for msg in response.messages:
        for content in msg.contents:
            if content.type == "text_reasoning":
                print(f"id: {content.id!r}")
                print(f"text: {content.text!r}")
                print(f"encrypted_content: {(content.additional_properties or {}).get('encrypted_content', 'MISSING')!r}")
                # Expected: id="rs_..." (preserved from streaming events)
                # Actual:   id=None (lost during coalescing)
                # encrypted_content is correctly present


asyncio.run(main())

Error Messages / Stack Traces

No error or exception is raised. The id is silently dropped.

Package Versions

agent-framework-core: 1.0.0rc5, openai: 2.24.0

Python Version

3.13.2

Additional Context

No response
