PhoGPT: Generative Pre-training for Vietnamese (2023). Updated Nov 12, 2024; Python.
An autoregressive language model like ChatGPT.
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support Transformer-model education and to reverse-engineer GPT models from scratch.
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by-step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.
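The core mechanism such from-scratch walkthroughs build up to is causal (masked) self-attention, where each token attends only to itself and earlier positions. Below is a minimal, dependency-free sketch of single-head scaled dot-product attention with a causal mask; the function names and the plain-list representation of vectors are illustrative choices, not code from the lecture or any of these repositories.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def causal_attention(q, k, v):
    """Scaled dot-product attention with a causal mask:
    position t may only attend to positions <= t."""
    d = len(q[0])  # head dimension, used for the 1/sqrt(d) scaling
    out = []
    for t in range(len(q)):
        # Scores only against past and current positions (the causal mask).
        scores = [sum(qi * ki for qi, ki in zip(q[t], k[s])) / math.sqrt(d)
                  for s in range(t + 1)]
        w = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w[s] * v[s][j] for s in range(t + 1))
                    for j in range(len(v[0]))])
    return out
```

Because of the mask, the first token can only attend to itself, so its output is exactly its own value vector; later tokens mix in earlier values with softmax weights.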
Simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
An Industrial Project about NLP in Finance Application
An implementation of a GPT-2 variant.
ToyGPT, inspired by Andrej Karpathy's GPT-from-scratch lecture, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, intended to teach the basics of creating an LLM from scratch.
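The bigram idea that such toy models start from can be sketched without any ML library: count how often each character follows each other character, then sample the next character from that conditional distribution. This is a hedged illustration in plain Python (the names `train_bigram` and `generate` are my own); a ToyGPT-style model would instead learn logits by gradient descent and add attention on top.

```python
import random
from collections import Counter, defaultdict

def train_bigram(text):
    """For each character, count how often each next character follows it."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Autoregressively sample characters from the bigram distribution."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = counts.get(out[-1])
        if not followers:  # no observed continuation: stop early
            break
        chars, weights = zip(*followers.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

corpus = "to be or not to be that is the question"
model = train_bigram(corpus)
sample = generate(model, "t", 20)
```

The generation loop is the same autoregressive pattern a full GPT uses; swapping the count table for a trained transformer that conditions on the whole prefix, not just the last character, is essentially the rest of the exercise.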
Repository for all things Natural Language Processing
PyTorch implementation of GPT from scratch
(GPT-1) | Generative Pre-trained Transformer - 1
I built a GPT model from scratch to generate text
A generatively pretrained transformer that generates Shakespeare-esque quotes.
Repository for personal experiments