Conversation

@jif-oai
Collaborator

@jif-oai jif-oai commented Dec 8, 2025

No description provided.

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

@jif-oai
Collaborator Author

jif-oai commented Dec 9, 2025

@codex review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.


Comment on lines +66 to +70
if cfg!(target_os = "windows") {
    let mut parser = vt100::Parser::new(40, 120, 1_024);
    parser.process(raw.as_bytes());
    parser.screen_mut().set_scrollback(usize::MAX);
    return parser.screen().contents();
Contributor


P1 Badge Windows normalization drops output beyond 1,024 lines

On Windows, unified-exec output is now passed through a fresh vt100::Parser with a scrollback of only 1,024 lines (Parser::new(40, 120, 1_024)), and the scrollback is raised to usize::MAX only after processing. Any command that emits more than 1,024 lines therefore has its earlier lines discarded before token truncation and before the output is sent back to the user, whereas other platforms still preserve everything up to UNIFIED_EXEC_OUTPUT_MAX_BYTES/max_output_tokens. This is a regression for long-running commands on Windows: their output is silently chopped to the last ~1,024 lines even when higher limits are configured.

Useful? React with 👍 / 👎.

@jif-oai jif-oai closed this Dec 12, 2025
