
Messages sent while the final output text is streaming are injected into the UI and then ignored, rather than properly queued #4446

@will-bogusz

Description


What version of Codex is running?

0.42.0

Which model were you using?

gpt-5

What platform is your computer?

Darwin 24.5.0 arm64 arm

What steps can reproduce the bug?

  1. Query the model
  2. Wait until the final text response is being streamed (rather than mid file-read, reasoning, etc.)
  3. Attempt to send a message

What is the expected behavior?

Messages sent at any point before streaming completes should be queued and delivered after the current turn finishes.
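For illustration, the expected queueing behavior can be sketched roughly as follows. This is a hypothetical minimal model, not Codex's actual implementation; the `TurnManager` class and its method names are invented for this sketch:

```python
from collections import deque

class TurnManager:
    """Hypothetical sketch of the desired behavior: messages sent while a
    response is still streaming are buffered, then delivered in order once
    streaming completes, instead of being injected into the live output."""

    def __init__(self):
        self.streaming = False
        self.pending = deque()  # messages received mid-stream
        self.delivered = []     # messages actually submitted as turns

    def begin_stream(self):
        self.streaming = True

    def send(self, message):
        if self.streaming:
            # Mid-stream: queue the message rather than injecting it
            # into the streamed text.
            self.pending.append(message)
        else:
            self.delivered.append(message)

    def end_stream(self):
        self.streaming = False
        # Flush queued messages in arrival order after the turn ends.
        while self.pending:
            self.delivered.append(self.pending.popleft())
```

Under this model, a message sent mid-stream is neither rendered into the streamed output nor dropped; it simply becomes the next turn's input once `end_stream` runs.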

What do you see instead?

The message is injected into the middle of the streamed text in the UI, and is seemingly ignored once streaming completes.

Additional information

No response

Labels

bug — Something isn't working