kimathi-phil/TinyGPT

TinyGPT

A lightweight AI text generator based on distilgpt2. You can generate text using a pre-trained model and fine-tune it on your own dataset.


Installation

Clone the Repository

git clone https://github.com/kimathi-phil/TinyGPT.git
cd TinyGPT

Create and Activate a Virtual Environment

python3 -m venv ai_env
source ai_env/bin/activate       # macOS/Linux
.\ai_env\Scripts\Activate.ps1    # Windows PowerShell

Install Dependencies

pip install -r requirements.txt

Usage

Generating Text

Run TinyGPT with a prompt:

python src/tinygpt.py

Modify the prompt inside src/tinygpt.py to experiment with different outputs.
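The script is built around a Hugging Face transformers text-generation pipeline. Below is a minimal sketch of what src/tinygpt.py could look like; the names PROMPT and generate, and the generation settings, are illustrative assumptions, not necessarily the repo's actual code:

```python
# Sketch of a distilgpt2 generation script (assumed layout, not the
# repo's verbatim source). Edit PROMPT to experiment with outputs.
from transformers import pipeline

PROMPT = "Once upon a time"  # change this line to try different prompts

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    # Build a text-generation pipeline on distilgpt2; the weights are
    # downloaded from the Hugging Face Hub on first use.
    generator = pipeline("text-generation", model="distilgpt2")
    result = generator(prompt,
                       max_new_tokens=max_new_tokens,
                       num_return_sequences=1)
    # The pipeline returns a list of dicts with a "generated_text" key.
    return result[0]["generated_text"]

if __name__ == "__main__":
    print(generate(PROMPT))
```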

Fine-Tuning the Model

If you want to fine-tune the model on your own dataset (data/data.txt):

python src/fine_tune.py

This trains the model and saves the result to models/fine_tuned_model/.
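A typical causal-LM fine-tuning script tokenizes data/data.txt, chunks the token ids into fixed-length blocks, and trains with the Hugging Face Trainer. The sketch below follows that pattern; the function names, block size, and hyperparameters are assumptions, not the repo's actual src/fine_tune.py:

```python
# Hedged sketch of a distilgpt2 fine-tuning script; the repo's actual
# src/fine_tune.py may differ. Hyperparameters here are illustrative.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

def chunk_ids(ids, block_size):
    """Split a flat list of token ids into fixed-size training blocks,
    dropping the ragged tail."""
    return [ids[i:i + block_size]
            for i in range(0, len(ids) - block_size + 1, block_size)]

def fine_tune(data_path="data/data.txt",
              out_dir="models/fine_tuned_model",
              block_size=128):
    tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
    model = AutoModelForCausalLM.from_pretrained("distilgpt2")

    with open(data_path, encoding="utf-8") as f:
        ids = tokenizer(f.read())["input_ids"]
    # Trainer accepts any sequence of feature dicts as a dataset.
    dataset = [{"input_ids": block} for block in chunk_ids(ids, block_size)]

    # mlm=False -> standard causal-LM objective (labels = shifted inputs).
    collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)
    args = TrainingArguments(output_dir=out_dir,
                             num_train_epochs=1,
                             per_device_train_batch_size=2)
    Trainer(model=model, args=args, data_collator=collator,
            train_dataset=dataset).train()
    model.save_pretrained(out_dir)
    tokenizer.save_pretrained(out_dir)

if __name__ == "__main__":
    fine_tune()
```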

Running Fine-Tuned Model

After fine-tuning, update tinygpt.py to load the fine-tuned model:

# change the model path to the fine-tuned version
# (relative to the directory you run the script from, i.e. the repo root)
generator = pipeline("text-generation", model="models/fine_tuned_model")

Then, run:

python src/tinygpt.py

Notes

  • The data/data.txt file should contain training examples in plain text format.
  • Model weights are not included in the repository; they are downloaded automatically from the Hugging Face Hub on first run.
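To make the first bullet concrete: data/data.txt is raw UTF-8 text with no special markup, for example (hypothetical contents):

```text
The quick brown fox jumps over the lazy dog.
Each line or paragraph is just ordinary prose from your corpus.
The whole file is tokenized as one stream during fine-tuning.
```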
