
Conversation

@oinoom (Contributor) commented Sep 19, 2025

When running the refactor tool to remove literal suffixes on, for example, vec![10u32, 20u32, 30u32], the tool would crash: the collapse pipeline code in CollectMacros expects structural equivalence between the unexpanded and expanded ASTs, but rewriting the macro contents introduces a LazyTokenStream in the expansion.

This change treats that scenario as expected, but still panics for the other, non-token-stream causes of structural mismatch.
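The fix described above could be sketched roughly as follows. The function name and the Option<&str> stand-ins are hypothetical, for illustration only; the real code operates on rustc's AST and LazyTokenStream types, not strings.

```rust
// Hypothetical sketch of the tolerant comparison (illustrative names only).
fn check_token_streams(unexpanded: Option<&str>, expanded: Option<&str>) {
    match (unexpanded, expanded) {
        // Rewriting macro contents can materialize a token stream on one
        // side only; treat that as expected rather than panicking.
        (None, Some(_)) | (Some(_), None) => {}
        // Any other structural mismatch is still treated as a bug.
        (Some(a), Some(b)) if a != b => {
            panic!("unexpected structural mismatch in ASTs")
        }
        _ => {}
    }
}

fn main() {
    // Editing `vec![10u32, 20u32, 30u32]` yields a stream on one side only;
    // with the fix this no longer panics.
    check_token_streams(None, Some("vec ! [10 , 20 , 30]"));
}
```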

@ahomescu ahomescu force-pushed the ahomescu/fix_reorganize_definitions branch from 3b6b204 to 0f281b4 Compare September 19, 2025 02:54
@ahomescu (Contributor)

This fixes the issue? Was a macro invocation also the problem in the transpiled code?

If the problem is LazyTokenStream, the problem might be that we're missing an attribute for the matcher for that node in https://github.com/immunant/c2rust/blob/ahomescu/restore_refactor/c2rust-refactor/gen/ast.txt#L590.

// Regression input for the token-stream mismatch caused by editing vec! literals.

fn main() {
    let ints = vec![10u32, 20u32, 30u32];
}
Contributor
Are we adding, or deleting a token stream here?

Contributor Author

My impression is that, because we're editing the tokens, the cached token stream is invalidated and rustc generates a new one. That mismatch between a None token stream and a Some(TokenStream) is the structural mismatch that previously caused a panic and that this change now tolerates.

> This fixes the issue?

Yes.

> If the problem is LazyTokenStream, the problem might be that we're missing an attribute for the matcher for that node

I'm not sure how to answer that. I think the crash happened before any token-stream comparison took place, i.e. before it recursed into Some(TokenStream). Are you saying some attribute could result in comparing two Some(_) values rather than (None, Some), so that the crash wouldn't have happened?

Contributor

> Are you saying some attribute could result in comparing two Some(_) values rather than (None, Some), so that the crash wouldn't have happened?

It shouldn't be doing the comparison in the first place; can you figure out where it's happening? We have LazyTokenStream set to ignore:

#[equiv_mode=ignore] #[match=ignore] #[rewrite_ignore]
flag LazyTokenStream;

so the two streams should always compare as equal.
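For illustration, here is roughly what an ignore flag like that means for an equivalence check. The struct and field names below are hypothetical, not c2rust's actual generated code; the point is only that an ignored field never participates in the comparison.

```rust
// Hypothetical model of a node whose token-stream field is marked
// #[equiv_mode=ignore] in ast.txt.
struct Node {
    kind: String,
    // Stands in for a LazyTokenStream; ignored by equivalence checks.
    lazy_tokens: Option<String>,
}

// Generated-style equivalence: ignored fields never enter the comparison,
// so a None-vs-Some difference in `lazy_tokens` cannot cause a mismatch.
fn equiv(a: &Node, b: &Node) -> bool {
    a.kind == b.kind
}

fn main() {
    let a = Node { kind: "ExprVec".into(), lazy_tokens: None };
    let b = Node { kind: "ExprVec".into(), lazy_tokens: Some("10u32 , 20u32".into()) };
    // The two streams differ, but the nodes still compare as equivalent.
    assert!(equiv(&a, &b));
}
```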

@oinoom oinoom requested a review from ahomescu September 19, 2025 14:01