# generative-pre-trained-transformer

Here are 15 public repositories matching this topic...

A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken for tokenization, intended to support Transformer-model education and the reverse engineering of GPT models from scratch. A minimal tiktoken sketch follows this entry.

  • Updated Jun 19, 2023
  • Python
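To make the tokenizer this entry mentions concrete, here is a minimal sketch of a tiktoken encode/decode round trip; the "gpt2" encoding name and the sample string are illustrative choices, not taken from the repository.

```python
# A minimal tiktoken round trip (encoding and sample text are illustrative).
import tiktoken

enc = tiktoken.get_encoding("gpt2")         # GPT-2's byte-pair-encoding vocabulary
tokens = enc.encode("Hello, transformer!")  # text -> list of integer token ids
text = enc.decode(tokens)                   # token ids -> original text

print(tokens)
assert text == "Hello, transformer!"
```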

Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by-step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding. A minimal attention-head sketch in that spirit follows this entry.

  • Updated Dec 1, 2024
  • Jupyter Notebook
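In the spirit of the lecture the entry above follows, here is a minimal sketch of a single causal self-attention head in PyTorch. All dimensions are illustrative, and this is a standalone sketch, not an excerpt from the repository.

```python
# One causal self-attention head (dimensions are illustrative).
import torch
import torch.nn.functional as F

torch.manual_seed(1337)
B, T, C = 4, 8, 32                  # batch size, context length, embedding channels
head_size = 16
x = torch.randn(B, T, C)            # stand-in for token embeddings

key = torch.nn.Linear(C, head_size, bias=False)
query = torch.nn.Linear(C, head_size, bias=False)
value = torch.nn.Linear(C, head_size, bias=False)

k, q, v = key(x), query(x), value(x)             # each (B, T, head_size)
wei = q @ k.transpose(-2, -1) * head_size**-0.5  # scaled dot-product scores (B, T, T)
tril = torch.tril(torch.ones(T, T))
wei = wei.masked_fill(tril == 0, float("-inf"))  # causal mask: no attending to future tokens
wei = F.softmax(wei, dim=-1)                     # attention weights
out = wei @ v                                    # (B, T, head_size)
```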

ToyGPT, inspired by Andrej Karpathy’s GPT from scratch, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model extended with attention, meant to teach the basics of creating an LLM from scratch. A minimal bigram-model sketch follows this entry.

  • Updated Nov 28, 2024
  • Jupyter Notebook
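To make "bigram language model" concrete, here is a minimal PyTorch sketch of the kind of model such projects start from, where each token's next-token logits are read directly from an embedding table. The class name and vocabulary size are illustrative assumptions, not taken from ToyGPT.

```python
# A minimal bigram language model (names and vocab_size are illustrative).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # Row i of the table holds the next-token logits when the current token is i.
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding(idx)  # (B, T, vocab_size)
        if targets is None:
            return logits, None
        B, T, V = logits.shape
        loss = F.cross_entropy(logits.view(B * T, V), targets.view(B * T))
        return logits, loss

model = BigramLanguageModel(vocab_size=65)  # e.g. a tiny character-level vocabulary
idx = torch.zeros((1, 1), dtype=torch.long)
logits, _ = model(idx)                      # next-token logits for the start token
```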
