openai/openai-agents-python

Handoffs with gpt-5* model + store=False + remove_all_tools fail due to 404 error response


Describe the bug

Reasoning models like gpt-5 or gpt-5-nano cannot use the handoff feature when the store option is set to False.
At the handed-off agent I get the following error:

"Items are not persisted when store is set to false. Try again with store set to true, or remove this item from your input."

Agent definition:

    from agents import Agent, ModelSettings, WebSearchTool, handoff
    from agents.extensions import handoff_filters

    # image_agent is defined first so that triage_agent can reference it in handoffs.
    image_agent = Agent[RunContext](
        name="image_agent",
        instructions=image_gen_instructions,
        model=openai_model,
        model_settings=ModelSettings(
            tool_choice="auto",
            parallel_tool_calls=False,
            store=False,
        ),
        handoff_description=image_agent_handoff_prompt,
        tools=image_agent_tools,
    )

    triage_agent = Agent[RunContext](
        name="triage_agent",
        instructions=instructions,
        model="gpt-5",
        model_settings=ModelSettings(
            tool_choice="auto",
            parallel_tool_calls=False,
            store=False,
        ),
        handoffs=[
            handoff(image_agent, input_filter=handoff_filters.remove_all_tools),
        ],
        tools=[
            WebSearchTool(**web_search_tool_kwargs),
        ],
    )
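
The issue does not show how the agents are invoked; judging from the traceback below (it goes through stream_events), the run is streamed. A minimal sketch of such an invocation, with the user input and context object as placeholders:

    from agents import Runner

    async def run_triage(run_context: RunContext) -> None:
        result = Runner.run_streamed(
            triage_agent,
            "Generate an image of a cat",  # placeholder user input
            context=run_context,
        )
        async for event in result.stream_events():
            ...  # forward events to the caller; the 404 surfaces here after the handoff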

Error log:

Traceback (most recent call last):
  File "/xxx/src/common/openai/openai_stream_handler.py", line 105, in stream_events
    async for event in self.result.stream_events():
  File "/xxx/.venv/lib/python3.11/site-packages/agents/result.py", line 215, in stream_events
    raise self._stored_exception
  File "/xxx/.venv/lib/python3.11/site-packages/agents/run.py", line 840, in _start_streaming
    turn_result = await cls._run_single_turn_streamed(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/xxx/.venv/lib/python3.11/site-packages/agents/run.py", line 1009, in _run_single_turn_streamed
    async for event in model.stream_response(
  File "/xxx/.venv/lib/python3.11/site-packages/agents/models/openai_responses.py", line 163, in stream_response
    stream = await self._fetch_response(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/xxx/.venv/lib/python3.11/site-packages/agents/models/openai_responses.py", line 286, in _fetch_response
    return await self._client.responses.create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/xxx/.venv/lib/python3.11/site-packages/openai/resources/responses/responses.py", line 2259, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/xxx/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/xxx/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': "Item with id '...b4ad090a7db0de815c6d' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}

However, it works without the handoff:

    aa = Agent[RunContext](
        name="aa",
        instructions=aaa,
        model=openai_model,
        model_settings=ModelSettings(
            tool_choice="auto",
            parallel_tool_calls=False,
            store=False,
        ),
        # handoffs=[
        #    handoff(image_agent, input_filter=handoff_filters.remove_all_tools),
        # ],
        # tools=[
        #     WebSearchTool(**web_search_tool_kwargs)
        # ],
        tools=[image_generator, image_editor, WebSearchTool(**web_search_tool_kwargs)],
    )

Debug information

  • openai-agents==0.2.11
  • openai==1.106.1
  • python==3.11.10

Repro steps

  • Set store=False for both triage_agent and the handed-off image_agent

Expected behavior

No error at the second agent.
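
Possible workaround

Until this is fixed in the SDK, a custom input_filter that strips reasoning items from the handoff input might avoid the error, since the non-persisted rs_* id would never be replayed. This is an untested sketch; it assumes HandoffInputData is a dataclass exposing input_history, pre_handoff_items and new_items, and that reasoning items can be recognized by their "type" field:

    from dataclasses import replace

    from agents import HandoffInputData

    def _is_reasoning(item) -> bool:
        # Raw input items are dicts; run items wrap the payload in .raw_item.
        raw = getattr(item, "raw_item", item)
        item_type = raw.get("type") if isinstance(raw, dict) else getattr(raw, "type", None)
        return item_type == "reasoning"

    def remove_reasoning_items(data: HandoffInputData) -> HandoffInputData:
        history = data.input_history
        if not isinstance(history, str):
            history = tuple(i for i in history if not _is_reasoning(i))
        return replace(
            data,
            input_history=history,
            pre_handoff_items=tuple(i for i in data.pre_handoff_items if not _is_reasoning(i)),
            new_items=tuple(i for i in data.new_items if not _is_reasoning(i)),
        )

    # usage: handoff(image_agent, input_filter=remove_reasoning_items)

If this works, it could be chained with handoff_filters.remove_all_tools by applying one filter to the other's output.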

Thanks for reporting this issue. I confirmed the error occurs with the following code example:

import asyncio
import logging

from agents import Agent, ModelSettings, Runner, handoff

logging.basicConfig(level=logging.DEBUG)

async def main():
    agent2 = Agent(
        name="Assistant 2",
        model="gpt-5-mini",
        instructions="You always respond politely.",
        model_settings=ModelSettings(store=False),
    )
    agent = Agent(
        name="Assistant",
        model="gpt-5-mini",
        instructions="You always hand off to the assistant2 agent.",
        model_settings=ModelSettings(store=False),
        handoffs=[handoff(agent2)],
    )

    result = await Runner.run(agent, "Tell me something about San Francisco.")
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

The error:

DEBUG:openai.agents:Calling LLM gpt-5-mini with input:
[
  {
    "content": "Tell me something about San Francisco.",
    "role": "user"
  },
  {
    "id": "rs_68be61daf64c8194af324972f8d988800d76b88ed0ff4a76",
    "summary": [],
    "type": "reasoning",
    "content": []
  },
  {
    "arguments": "{}",
    "call_id": "call_oT368TCdyn1Q1u80Tsu6O0ik",
    "name": "transfer_to_assistant_2",
    "type": "function_call",
    "id": "fc_68be61dcec8c8194bd30213178d7d1940d76b88ed0ff4a76",
    "status": "completed"
  },
  {
    "call_id": "call_oT368TCdyn1Q1u80Tsu6O0ik",
    "output": "{\"assistant\": \"Assistant 2\"}",
    "type": "function_call_output"
  }
]
Tools:
[]
Stream: False
Tool choice: NOT_GIVEN
Response format: NOT_GIVEN
Previous response id: None
Conversation id: None


.....


ERROR:openai.agents:Error getting response: Error code: 404 - {'error': {'message': "Item with id 'rs_68be61daf64c8194af324972f8d988800d76b88ed0ff4a76' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}. (request_id: req_487f4a66c10a068057b8d24ce6a8d7ab)
DEBUG:openai.agents:Resetting current trace
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/Users/seratch/code/openai-agents-python/examples/test.py", line 29, in <module>
    asyncio.run(main())
  File "/Users/seratch/.pyenv/versions/3.11.8/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/Users/seratch/.pyenv/versions/3.11.8/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/.pyenv/versions/3.11.8/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/examples/test.py", line 24, in main
    result = await Runner.run(agent, "Tell me something about San Francisco.")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/run.py", line 267, in run
    return await runner.run(
           ^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/run.py", line 504, in run
    turn_result = await self._run_single_turn(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/run.py", line 1159, in _run_single_turn
    new_response = await cls._get_new_response(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/run.py", line 1398, in _get_new_response
    new_response = await model.get_response(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/models/openai_responses.py", line 83, in get_response
    response = await self._fetch_response(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/src/agents/models/openai_responses.py", line 286, in _fetch_response
    return await self._client.responses.create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/.venv/lib/python3.11/site-packages/openai/resources/responses/responses.py", line 2259, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1794, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/seratch/code/openai-agents-python/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1594, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': "Item with id 'rs_68be61daf64c8194af324972f8d988800d76b88ed0ff4a76' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}