
✨[Feature] Supporting Attention masks when used in VLAs #3880

@narendasan

Description


Is your feature request related to a problem? Please describe.

When adding support for the GROOT N1 model, we were not able to handle the attention mask and had to fall back to using the position ids instead.

Describe the solution you'd like

Allow users to pass attention masks as inputs to their models.
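
A rough sketch of the desired usage is below. The module, shapes, and `ir="dynamo"` setting are illustrative assumptions (not the GROOT N1 model); the point is that the attention mask would be passed as an ordinary compile-time input rather than reworking the model around position ids:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch_tensorrt


class MaskedSelfAttention(nn.Module):
    """Toy self-attention block whose forward takes an explicit attention mask."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.num_heads = num_heads
        self.qkv = nn.Linear(dim, 3 * dim)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, attn_mask: torch.Tensor) -> torch.Tensor:
        b, s, d = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split heads: (batch, heads, seq, head_dim)
        q, k, v = (t.view(b, s, self.num_heads, d // self.num_heads).transpose(1, 2)
                   for t in (q, k, v))
        # attn_mask is the tensor this feature request asks Torch-TensorRT to handle.
        out = F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
        out = out.transpose(1, 2).reshape(b, s, d)
        return self.proj(out)


model = MaskedSelfAttention(dim=512).eval().cuda()
x = torch.randn(1, 128, 512, device="cuda")
# Boolean mask broadcastable over (batch, heads, seq, seq); True = attend.
mask = torch.ones(1, 1, 128, 128, dtype=torch.bool, device="cuda")

# Desired: the mask is just another input to compilation.
trt_model = torch_tensorrt.compile(model, ir="dynamo", inputs=[x, mask])
out = trt_model(x, mask)
```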

Describe alternatives you've considered

Additional context
