# Training an LLM

This repository provides resources and guidelines for training a Large Language Model (LLM) based on the Phi-2, Phi-3, and Llama 2 model families.

## Prerequisites

Before starting the training process, ensure the following packages are installed:

```shell
pip3 install -i https://pypi.org/simple/ bitsandbytes
pip3 install peft trl datasets
pip3 install git+https://github.com/huggingface/transformers
```

## Dataset

Training uses Databricks' databricks-dolly-15k dataset, roughly 15,000 human-generated instruction-following records:

https://huggingface.co/datasets/databricks/databricks-dolly-15k/blob/main/databricks-dolly-15k.jsonl
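Each JSONL record has `instruction`, `context`, `response`, and `category` fields. Before training you typically render each record into a single prompt string; the section-header template below is one common convention, not something prescribed by the dataset:

```python
import json

def format_record(record: dict) -> str:
    """Render one Dolly record as an instruction-following prompt.

    The ### section markers are an assumed template; adapt them to
    whatever prompt format your base model expects.
    """
    context = record.get("context", "").strip()
    parts = [f"### Instruction:\n{record['instruction']}"]
    if context:  # context is empty for many records; skip it then
        parts.append(f"### Context:\n{context}")
    parts.append(f"### Response:\n{record['response']}")
    return "\n\n".join(parts)

# Example record in the dataset's JSONL shape.
line = json.dumps({
    "instruction": "What is the capital of France?",
    "context": "",
    "response": "Paris.",
    "category": "open_qa",
})
prompt = format_record(json.loads(line))
```

A function like this can be passed to the trainer as a formatting step, or used to preprocess the JSONL into plain-text prompts ahead of time.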