Implementation of Rotary Embeddings, from the RoFormer paper, in PyTorch (a minimal sketch of the rotation follows the list below)
PET-NeuS: Positional Encoding Tri-Planes for Neural Surfaces (CVPR 2023)
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
[CVPR 2021] Adversarial Generation of Continuous Images
[CVPR 2023] Official PyTorch implementation of "Dynamic Focus-aware Positional Queries for Semantic Segmentation"
Learnable Fourier Features for Multi-Dimensional Spatial Positional Encoding
Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Trading Positional Complexity vs Deepness in Coordinate Networks
ViViT model for medical video classification, enhancing 3D organ image analysis with transformer-based architectures
"Found in the Middle: How Language Models Use Long Contexts Better via Plug-and-Play Positional Encoding" Zhenyu Zhang, Runjin Chen, Shiwei Liu, Zhewei Yao, Olatunji Ruwase, Beidi Chen, Xiaoxia Wu, Zhangyang Wang.
Multiresolution Graph Transformers and Wavelet Positional Encoding for Learning Long-Range and Hierarchical Structures
Unofficial PyTorch implementation of the paper "Learnable Fourier Features for Multi-Dimensional Spatial Positional Encoding", NeurIPS 2021
Implementation of Rotary Embeddings, from the RoFormer paper, in TensorFlow
Robust Point Cloud Processing through Positional Embedding
PyTorch implementation of "Attention Is All You Need" by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin (the paper's sinusoidal encoding is sketched after this list)
Algebraic Positional Encodings
From-scratch implementation of the Transformer as presented in the paper "Attention Is All You Need"
Official code for NeurIPS 2023 paper "Laplacian Canonization: A Minimalist Approach to Sign and Basis Invariant Spectral Embedding".
Benchmarking Positional Encodings for GNNs and Graph Transformers
Code for "The Locality and Symmetry of Positional Encodings" EMNLP Findings
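For readers new to the topic, here is a minimal sketch of the rotary-embedding technique referenced by the first entry above. It illustrates the RoFormer rotation under the usual conventions (pairwise channel rotation, base 10000); it is not the API of any repository listed here, and the function name apply_rotary_embedding is chosen purely for clarity.

```python
import torch

def apply_rotary_embedding(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    # x: (..., seq_len, dim) with dim even. Each channel pair (x[2i], x[2i+1])
    # at position pos is rotated by the angle pos * theta_i, where
    # theta_i = base ** (-2i / dim), as in the RoFormer paper.
    *_, seq_len, dim = x.shape
    assert dim % 2 == 0, "feature dimension must be even"
    theta = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)    # (dim/2,)
    angles = torch.outer(torch.arange(seq_len, dtype=torch.float32), theta)  # (seq_len, dim/2)
    cos, sin = angles.cos(), angles.sin()
    x_even, x_odd = x[..., 0::2], x[..., 1::2]                               # (..., seq_len, dim/2)
    rotated = torch.stack(
        (x_even * cos - x_odd * sin,   # new even channels
         x_even * sin + x_odd * cos),  # new odd channels
        dim=-1,
    )
    return rotated.reshape(x.shape)
```

In attention, this rotation is typically applied to the query and key tensors before their dot product, so the product depends only on the relative position between tokens.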
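Similarly, the fixed sinusoidal encoding introduced in "Attention Is All You Need" (the paper behind two of the entries above) fits in a few lines. This is again a minimal illustration with an illustrative function name, assuming an even d_model.

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    # Returns a (seq_len, d_model) table with
    #   PE(pos, 2i)   = sin(pos / 10000**(2i / d_model))
    #   PE(pos, 2i+1) = cos(pos / 10000**(2i / d_model))
    # as defined in "Attention Is All You Need"; assumes d_model is even.
    position = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)       # (seq_len, 1)
    inv_freq = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float32)
        * (-math.log(10000.0) / d_model)
    )                                                                        # (d_model/2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * inv_freq)
    pe[:, 1::2] = torch.cos(position * inv_freq)
    return pe
```

Unlike rotary embeddings, this table is simply added to the token embeddings before the first attention layer.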