@joshka-oai (Collaborator)


Background
Streaming assistant prose in tui2 was being rendered with viewport-width
wrapping during streaming, then stored in history cells as already split
`Line`s. Those width-derived breaks became indistinguishable from hard
newlines, so the transcript could not "un-split" on resize. This also
degraded copy/paste, since soft wraps looked like hard breaks.

What changed
- Introduce width-agnostic `MarkdownLogicalLine` output in
  `tui2/src/markdown_render.rs`, preserving markdown wrap semantics:
  initial/subsequent indents, per-line style, and a preformatted flag
  (a rough sketch of this shape follows the list).
- Update the streaming collector (`tui2/src/markdown_stream.rs`) to emit
  logical lines (newline-gated) and remove any captured viewport width.
- Update streaming orchestration (`tui2/src/streaming/*`) to queue and
  emit logical lines, producing `AgentMessageCell::new_logical(...)`.
- Make `AgentMessageCell` store logical lines and wrap at render time in
  `HistoryCell::transcript_lines_with_joiners(width)`, emitting joiners
  so copy/paste can join soft-wrap continuations correctly.
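
A minimal sketch of that idea, with plain `String`s standing in for the
styled `Line`s used by the real code; the names (`LogicalLine`,
`VisualRow`, `wrap_at`) and exact fields are illustrative assumptions,
not the actual tui2 types:

```rust
/// Width-agnostic logical line: the text plus the metadata needed to
/// re-wrap it at any viewport width later (assumed shape).
#[derive(Clone, Debug)]
struct LogicalLine {
    text: String,
    initial_indent: String,    // prefix for the first visual row, e.g. "- "
    subsequent_indent: String, // prefix for soft-wrap continuation rows
    preformatted: bool,        // code blocks are never re-wrapped
}

/// A visual row produced at render time. `soft_continuation` marks rows
/// created by width wrapping, so copy/paste can join them back together.
#[derive(Debug)]
struct VisualRow {
    text: String,
    soft_continuation: bool,
}

/// Greedy word wrap of one logical line at `width` columns.
fn wrap_at(line: &LogicalLine, width: usize) -> Vec<VisualRow> {
    if line.preformatted {
        // Preformatted content keeps its original layout.
        return vec![VisualRow {
            text: format!("{}{}", line.initial_indent, line.text),
            soft_continuation: false,
        }];
    }
    let mut rows: Vec<VisualRow> = Vec::new();
    let mut current = line.initial_indent.clone();
    let mut has_words = false;
    for word in line.text.split_whitespace() {
        if has_words && current.len() + 1 + word.len() > width {
            // Row is full: flush it and start a continuation row.
            rows.push(VisualRow {
                soft_continuation: !rows.is_empty(),
                text: std::mem::replace(&mut current, line.subsequent_indent.clone()),
            });
            has_words = false;
        }
        if has_words {
            current.push(' ');
        }
        current.push_str(word);
        has_words = true;
    }
    rows.push(VisualRow {
        soft_continuation: !rows.is_empty(),
        text: current,
    });
    rows
}
```

Because nothing width-dependent is stored, a resize just re-runs the
wrapping at the new width, and copy/paste can rebuild the logical text by
gluing soft-continuation rows onto their predecessors instead of emitting
a newline.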

Overlay deferral
When an overlay is active, defer *cells* (not rendered `Vec<Line>`) and
render them at overlay close time. This avoids baking width-derived wraps
based on a stale width.
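
The deferral itself can be sketched the same way; `Transcript`,
`on_cell_finished`, and `on_overlay_closed` below are invented names, not
the real tui2 API. The point is that whole cells are queued while the
overlay is open and only wrapped against the live width once it closes:

```rust
/// Assumed, simplified stand-in for a finished history cell.
struct Cell {
    logical_lines: Vec<String>,
}

struct Transcript {
    deferred: Vec<Cell>, // whole cells, never pre-wrapped rows
    overlay_open: bool,
}

impl Transcript {
    fn on_cell_finished(&mut self, cell: Cell) {
        if self.overlay_open {
            // Defer the cell itself; wrapping now would bake in a width
            // that may be stale by the time the overlay closes.
            self.deferred.push(cell);
        } else {
            let width = self.current_width();
            self.render_cell(&cell, width);
        }
    }

    fn on_overlay_closed(&mut self) {
        self.overlay_open = false;
        let width = self.current_width();
        for cell in std::mem::take(&mut self.deferred) {
            // Wrap against the width that is actually current at close time.
            self.render_cell(&cell, width);
        }
    }

    fn current_width(&self) -> usize {
        80 // placeholder: the real value comes from the viewport
    }

    fn render_cell(&self, _cell: &Cell, _width: usize) {
        // Width-dependent wrapping would happen here.
    }
}
```

Deferring cells rather than rendered rows keeps the resize/reflow
behaviour intact even for output finished while an overlay covered the
viewport.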

Tests + docs
- Add resize/reflow regression tests + snapshots for streamed agent
  output (a toy version of the reflow property is sketched below).
- Expand module/API docs for the new logical-line streaming pipeline and
  clarify joiner semantics.
- Align scrollback-related docs/comments with current tui2 behavior
  (main draw loop does not flush queued "history lines" to the terminal).
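
The essential property the resize/reflow tests pin down is that wrapping
is reversible: for any width, joining the soft-wrapped rows must recover
the original logical text. A self-contained toy version of that check,
where `greedy_wrap` is an illustrative stand-in rather than the tui2
wrapping code:

```rust
/// Illustrative greedy word wrap; stands in for render-time wrapping.
fn greedy_wrap(text: &str, width: usize) -> Vec<String> {
    let mut rows: Vec<String> = Vec::new();
    let mut current = String::new();
    for word in text.split_whitespace() {
        if !current.is_empty() && current.len() + 1 + word.len() > width {
            rows.push(std::mem::take(&mut current));
        }
        if !current.is_empty() {
            current.push(' ');
        }
        current.push_str(word);
    }
    rows.push(current);
    rows
}

#[test]
fn reflow_is_width_agnostic() {
    let logical = "Streaming prose is stored unwrapped and only wrapped at render time.";
    for width in [24, 40, 80] {
        // Joining the soft-wrapped rows must reproduce the logical text
        // exactly, regardless of the width used to wrap it.
        let rejoined = greedy_wrap(logical, width).join(" ");
        assert_eq!(rejoined, logical);
    }
}
```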

More details
See `codex-rs/tui2/docs/streaming_wrapping_design.md` for the full
problem statement and solution approach, and
`codex-rs/tui2/docs/tui_viewport_and_history.md` for viewport vs printed
output behavior.
@joshka-oai (Collaborator, Author)

Screen.Recording.2026-01-05.at.2.10.48.PM.mov

joshka-oai merged commit c92dbea into main on Jan 6, 2026 (26 checks passed).
joshka-oai deleted the joshka/tui2-streaming-wrap-reflow branch on Jan 6, 2026 at 02:38.
github-actions bot locked and limited the conversation to collaborators on Jan 6, 2026.