Due to the nature of windowed attention, our model at 224x224 only supports input sizes that are multiples of 224, e.g., 224/448/672/896... You'd have to manually pad the input to satisfy this requirement. Please see our Colab demos for how to run inference at higher resolutions.
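As a minimal sketch of the padding step described above (the function name and use of zero padding with NumPy are my assumptions, not the repo's actual helper):

```python
import math
import numpy as np

def pad_to_multiple(image, multiple=224):
    # Hypothetical helper: zero-pad an H x W x C image on the bottom/right
    # so that H and W become multiples of `multiple`
    # (e.g. 300 x 500 -> 448 x 672 for multiple=224).
    h, w = image.shape[:2]
    target_h = math.ceil(h / multiple) * multiple
    target_w = math.ceil(w / multiple) * multiple
    return np.pad(
        image,
        ((0, target_h - h), (0, target_w - w), (0, 0)),
        mode="constant",
    )

padded = pad_to_multiple(np.zeros((300, 500, 3)))
print(padded.shape)  # (448, 672, 3)
```

Depending on the model, you may need to crop the output back to the original resolution afterwards, and a reflection or edge pad may introduce fewer border artifacts than zeros.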
Hi, good job! Could you tell me how to adapt your code for other input sizes, such as 128x128, 192x192, or 256x256? Thanks.