POC for state inspection tools #10936

Draft
sjrl wants to merge 4 commits into main from
agent-state-access

Conversation

@sjrl
Contributor

@sjrl sjrl commented Mar 26, 2026

Related Issues

Proposed Changes:

Example script

"""
POC: Agent State Access - skills-as-state pattern

The agent has two skills stored as strings in state:
  - summarize_skill: a prompt telling the agent how to summarise text
  - translate_skill: a prompt telling the agent how to translate text

Rather than loading both into the system prompt upfront, the agent discovers available skills via ls_state, loads the one it needs via read_state, and writes its final output to a `final_answer` key via write_state.
"""

from haystack.components.agents import Agent
from haystack.components.agents.state import LsStateTool, ReadStateTool, WriteStateTool
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage

agent = Agent(
    chat_generator=OpenAIChatGenerator(model="gpt-5.4"),
    tools=[LsStateTool(), ReadStateTool(), WriteStateTool()],
    state_schema={"summarize_skill": {"type": str}, "translate_skill": {"type": str}, "final_answer": {"type": str}},
    system_prompt="""You are a helpful assistant.
Use ls_state to discover available state keys, read_state to read their values, and write_state to record your final
response in the `final_answer` key.

If you see a key that ends in `_skill`, it contains instructions for how to perform a specific task.
Use these instructions to guide your actions.""",
)

result = agent.run(
    messages=[
        ChatMessage.from_user(
            """Please summarise the following text:

Haystack is an open-source AI orchestration framework that you can use to build powerful, production-ready applications with Large Language Models (LLMs) for various use cases. Whether you’re creating autonomous agents, multimodal apps, or scalable RAG systems, Haystack provides the tools to move from idea to production easily.
Haystack is designed in a modular way, allowing you to combine the best technology from OpenAI, Google, Anthropic, and open-source projects like Hugging Face's Transformers.
The core foundation of Haystack consists of components and pipelines, along with Document Stores, Agents, Tools, and many integrations. Read more about Haystack concepts in the Haystack Concepts Overview.
Supported by an engaged community of developers, Haystack has grown into a comprehensive and user-friendly framework for LLM-based development.
"""  # noqa: E501
        )
    ],
    summarize_skill=(
        "To summarise text: identify the main topic, strip filler words, and return a single concise sentence."
    ),
    translate_skill=(
        "To translate text: preserve meaning and tone exactly, and return only the translated text without commentary."
    ),
    streaming_callback=print_streaming_chunk,
)

print("Final answer:", result["final_answer"])

Example output

[TOOL CALL]
Tool: ls_state
Arguments: {}

[TOOL RESULT]

  • summarize_skill (str)
  • translate_skill (str)
  • final_answer (str)
  • messages (list)

[TOOL CALL]
Tool: read_state
Arguments: {"key": "summarize_skill", "truncate": false}

[TOOL RESULT]
'To summarise text: identify the main topic, strip filler words, and return a single concise sentence.'

[TOOL CALL]
Tool: write_state
Arguments: {"key":"final_answer","value":"Haystack is an open-source, modular framework for building production-ready LLM applications—such as agents, multimodal apps, and RAG systems—using components, pipelines, tools, document stores, and integrations across leading AI providers and open-source technologies."}

[TOOL RESULT]
State key 'final_answer' updated successfully.

[ASSISTANT]
Haystack is an open-source, modular framework for building production-ready LLM applications—such as agents, multimodal apps, and RAG systems—using components, pipelines, tools, document stores, and integrations across leading AI providers and open-source technologies.

Final answer: Haystack is an open-source, modular framework for building production-ready LLM applications—such as agents, multimodal apps, and RAG systems—using components, pipelines, tools, document stores, and integrations across leading AI providers and open-source technologies.
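The round-trip in the trace above can be modelled with a plain dict. This is a toy sketch only: the real pre-built tools operate on the Agent's State object, and their exact signatures may differ.

```python
# Toy model of the ls_state / read_state / write_state trace above,
# using a plain dict instead of the Agent's State object.
state = {"summarize_skill": "To summarise text: ...", "final_answer": ""}

def ls_state() -> list:
    # Mirrors the "key (type)" listing shown in the tool result above.
    return [f"{key} ({type(value).__name__})" for key, value in state.items()]

def read_state(key: str, truncate: bool = False) -> str:
    value = repr(state[key])
    return value[:80] + "..." if truncate and len(value) > 80 else value

def write_state(key: str, value: str) -> str:
    state[key] = value
    return f"State key '{key}' updated successfully."

print(ls_state())  # → ['summarize_skill (str)', 'final_answer (str)']
print(write_state("final_answer", "Haystack is an open-source ..."))
```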

How did you test it?

Notes for the reviewer

Checklist

  • I have read the contributors guidelines and the code of conduct.
  • I have updated the related issue with new insights and changes.
  • I have added unit tests and updated the docstrings.
  • I've used one of the conventional commit types for my PR title: fix:, feat:, build:, chore:, ci:, docs:, style:, refactor:, perf:, test: and added ! in case the PR includes breaking changes.
  • I have documented my code.
  • I have added a release note file, following the contributors guidelines.
  • I have run pre-commit hooks and fixed any issue.

@vercel

vercel bot commented Mar 26, 2026

The latest updates on your projects.

1 Skipped Deployment:
  • haystack-docs — Ignored (Preview), updated Mar 26, 2026 1:04pm (UTC)


@github-actions github-actions bot added the topic:tests and type:documentation (Improvements on the docs) labels Mar 26, 2026
Comment on lines +424 to +428
# Inject the live State object for any parameter annotated as State
for param_name, param_type in ToolInvoker._get_func_params(tool).items():
if param_type is State:
final_args[param_name] = state
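A minimal, self-contained sketch of that injection pattern. The `State` class, `peek_keys` tool, and `inject_state` helper here are illustrative stand-ins, not the real Haystack types (the real ones live in `haystack.dataclasses` and `haystack.components.tools`):

```python
import inspect

class State:  # illustrative stand-in for haystack.dataclasses.State
    def __init__(self, data=None):
        self.data = dict(data or {})

def peek_keys(state: State, prefix: str = "") -> list:
    """Hypothetical tool that requests the live State via its annotation."""
    return [key for key in state.data if key.startswith(prefix)]

def inject_state(func, llm_args: dict, state: State) -> dict:
    # Mirror the snippet above: any parameter annotated as State gets the
    # live object injected; LLM-provided args pass through untouched.
    final_args = dict(llm_args)
    for name, param in inspect.signature(func).parameters.items():
        if param.annotation is State:
            final_args[name] = state
    return final_args

state = State({"summarize_skill": "To summarise...", "final_answer": ""})
args = inject_state(peek_keys, {"prefix": "summ"}, state)
print(peek_keys(**args))  # → ['summarize_skill']
```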

Contributor Author

@anakin87 the additional code to skip State in the function params isn't much, but I realize now that with the pre-built tools we don't even need it, since I directly create their parameter schemas.

Comment on lines +145 to +147
# Skip State-typed parameters — ToolInvoker injects them at runtime, hidden from the LLM
if param.annotation is State:
continue
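The filtering that this snippet performs can be sketched in isolation. Again, `State` and `write_note` below are hypothetical stand-ins used only to show which parameters survive into the LLM-visible schema:

```python
import inspect

class State:  # illustrative stand-in for haystack.dataclasses.State
    pass

def write_note(state: State, key: str, value: str) -> str:
    """Hypothetical tool: `state` is injected at runtime, so only
    `key` and `value` should appear in the schema shown to the LLM."""
    return f"wrote {key}"

def llm_visible_params(func) -> list:
    visible = []
    for name, param in inspect.signature(func).parameters.items():
        # Skip State-typed parameters — they are injected at invocation
        # time and must stay hidden from the model.
        if param.annotation is State:
            continue
        visible.append(name)
    return visible

print(llm_visible_params(write_note))  # → ['key', 'value']
```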

@sjrl sjrl requested a review from anakin87 March 26, 2026 13:06
@anakin87
Member

Thank you for working on this!

The POC looks nice and clean.

Would it be worth getting a second opinion?

Quick point: I'm not sure if we want to explicitly refer to skills here, unless we have broader support for them, like also reading from skills files, etc. WDYT?
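One purely hypothetical direction for broader skills support is loading skill prompts from files at run time. A sketch, assuming a directory of Markdown files and the `_skill` key convention from the example script (`load_skills` is not an existing Haystack helper):

```python
from pathlib import Path

def load_skills(skills_dir: str) -> dict:
    """Hypothetical helper: map each *.md file in `skills_dir` to a
    `<stem>_skill` key, suitable for passing as agent.run(**skills)."""
    return {
        f"{path.stem}_skill": path.read_text(encoding="utf-8")
        for path in sorted(Path(skills_dir).glob("*.md"))
    }

# Usage (hypothetical): agent.run(messages=[...], **load_skills("skills/"))
```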


Labels

topic:tests, type:documentation (Improvements on the docs)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

Consider sending whole State object to a Tool if it's in its function signature

2 participants