Just throwing the idea around in case somebody wants to pick it up; I've wanted to do this for some time but can't find the time. There's a new position embedding type called RoPE (rotary position embedding) that yields improvements across different NLP tasks, and now also in ASR. It shouldn't be too difficult to add, and there is a reference implementation here: https://github.com/lucidrains/rotary-embedding-torch (a rough sketch of the idea is included below the links).
Original paper: https://arxiv.org/abs/2104.09864
Application in ASR: https://arxiv.org/abs/2107.05907
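For anyone picking this up, here is a minimal self-contained sketch of what RoPE does, written from the formulation in the original paper rather than from this repo's code. The function name `apply_rope` and the tensor layout (batch, heads, seq_len, head_dim) are my own assumptions for illustration, not anything from this codebase:

```python
import torch

def apply_rope(x, base=10000):
    """Apply rotary position embedding (RoPE) to queries or keys.

    x: tensor of shape (batch, heads, seq_len, head_dim); head_dim must be even.
    Follows the interleaved-pair formulation of Su et al. (arXiv:2104.09864).
    """
    b, h, n, d = x.shape
    assert d % 2 == 0, "head_dim must be even for RoPE"
    # One rotation frequency per 2-D pair of channels: theta_i = base^(-2i/d).
    inv_freq = 1.0 / (base ** (torch.arange(0, d, 2, device=x.device, dtype=torch.float32) / d))
    # Rotation angle for each (position m, pair i): m * theta_i.
    angles = torch.arange(n, device=x.device, dtype=torch.float32)[:, None] * inv_freq[None, :]
    cos = angles.cos()[None, None, :, :]  # broadcast over batch and heads
    sin = angles.sin()[None, None, :, :]
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    # Rotate each 2-D channel pair by its position-dependent angle.
    out = torch.empty_like(x)
    out[..., 0::2] = x_even * cos - x_odd * sin
    out[..., 1::2] = x_even * sin + x_odd * cos
    return out

# Usage in attention: rotate q and k (not v) before the dot product, so the
# q.k score depends only on the relative position between the two tokens.
q = torch.randn(2, 8, 100, 64)
k = torch.randn(2, 8, 100, 64)
q, k = apply_rope(q), apply_rope(k)
scores = q @ k.transpose(-2, -1) / 64 ** 0.5
```

The linked library wraps the same idea behind a module; if I remember its README correctly, usage is roughly `RotaryEmbedding(dim=head_dim)` followed by `rotate_queries_or_keys(q)` on the query/key tensors, but check the repo for the current API.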