Generative Pre-trained Transformer in PyTorch
🚀🚀🚀 A collection of awesome public projects from the YOLO object detection series.
Training-free Post-training Efficient Sub-quadratic Complexity Attention. Implemented with OpenAI Triton.
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
A compilation of the best multi-agent papers
Julia Implementation of Transformer models
The official PyTorch implementation of the paper "SAITS: Self-Attention-based Imputation for Time Series". A fast and state-of-the-art (SOTA) deep-learning neural network model for efficient time-series imputation (impute multivariate incomplete time series containing NaN missing data/values with machine learning). https://arxiv.org/abs/2202.08516
🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
My attempts in the world of ML/DL.
Explainable Neural Subgraph Matching with Graph Learnable Multi-hop Attention Networks
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for further understanding papers. ⭐⭐⭐
[CVPR 2024] Official implementation of the paper "Salience DETR: Enhancing Detection Transformer with Hierarchical Salience Filtering Refinement"
[CVPRW2024 FGVC11 (Best paper award)] Official pytorch implementation of the paper: "ConceptHash: Interpretable Fine-Grained Hashing via Concept Discovery"
Transformer-based chatbot with PyTorch
Zeta implementation of "Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers"
A browser extension that redirects BBC News visits to BBC Weather
Attention temporal convolutional network for EEG-based motor imagery classification
[TGRS 2024] The official repository for the journal article "FactoFormer: Factorized Hyperspectral Transformers with Self-Supervised Pre-Training", accepted to IEEE Transactions on Geoscience and Remote Sensing, December 2023.
A from-scratch implementation of the original Transformer
Scenic: A Jax Library for Computer Vision Research and Beyond
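Nearly all of the projects above build on the same scaled dot-product attention operation from the original Transformer paper. A minimal NumPy sketch of that core computation (the function name and toy shapes are illustrative, not taken from any repository listed here):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)     # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax: each row sums to 1
    return weights @ V                               # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one output vector per query
```

Each output row is a convex combination of the value vectors, with mixing weights determined by query-key similarity; the division by sqrt(d_k) keeps the logits in a range where the softmax is not saturated.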