# Grok-1

This repository contains JAX example code for loading and running the **Grok-1** open-weights model developed by xAI. Grok-1 is designed to handle a wide range of natural language processing tasks. This document guides you through the model's setup, usage, and specifications.

## Table of Contents

- [Overview](#overview)
- [Getting Started](#getting-started)
- [Model Specifications](#model-specifications)
- [Downloading Weights](#downloading-weights)
- [Usage](#usage)
- [License](#license)

## Overview

Grok-1 is a large-scale AI model built around a Mixture of Experts (MoE) architecture with a very large parameter count. It serves both as a powerful tool for NLP applications and as an opportunity for developers and researchers to study a state-of-the-art open-weights model.
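
To make the MoE idea concrete, here is a minimal, hypothetical top-k routing sketch in JAX. All names (`moe_layer`, `gate`, `experts`) and shapes are illustrative and are not taken from this repository's implementation; like the example code here, it favors a simple dense dispatch over efficient custom kernels.

```python
# Hypothetical MoE routing sketch -- not this repository's implementation.
import jax
import jax.numpy as jnp

def moe_layer(params, x, top_k=2):
    """Route each token to its top-k experts and mix their outputs.

    params["gate"]:    (d_model, n_experts) router weights
    params["experts"]: (n_experts, d_model, d_model) one linear map per expert
    x:                 (n_tokens, d_model) token activations
    """
    logits = x @ params["gate"]                    # (tokens, experts)
    weights, idx = jax.lax.top_k(logits, top_k)    # best experts per token
    weights = jax.nn.softmax(weights, axis=-1)     # normalize router scores

    # Dense dispatch: every expert processes every token (simple, inefficient),
    # then take_along_axis keeps only each token's top-k expert outputs.
    all_out = jnp.einsum("td,edf->etf", x, params["experts"])
    picked = jnp.take_along_axis(all_out.transpose(1, 0, 2),  # (tokens, experts, d)
                                 idx[..., None], axis=1)      # (tokens, top_k, d)
    return jnp.sum(weights[..., None] * picked, axis=1)       # (tokens, d)

key = jax.random.PRNGKey(0)
d_model, n_experts, n_tokens = 16, 8, 4  # toy sizes, not Grok-1's
params = {
    "gate": jax.random.normal(key, (d_model, n_experts)),
    "experts": jax.random.normal(key, (n_experts, d_model, d_model)),
}
x = jax.random.normal(key, (n_tokens, d_model))
print(moe_layer(params, x).shape)  # (4, 16)
```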

## Getting Started

To set up and run Grok-1, follow these steps:

1. **Clone the repository:**
```shell
git clone https://github.com/xai-org/grok-1.git
cd grok-1
```

2. **Install required dependencies:**
```shell
pip install -r requirements.txt
```

3. **Download the model weights:**
Ensure that you download the checkpoint and place the `ckpt-0` directory in `checkpoints` (see [Downloading Weights](#downloading-weights)).

## Model Specifications

Grok-1 is currently designed with the following specifications:

- **Parameters:** 314B
- **Architecture:** Mixture of 8 Experts (MoE)
- **Experts Utilization:** 2 experts used per token
- **Layers:** 64
- **Attention Heads:** 48 for queries, 8 for keys/values
- **Embedding Size:** 6,144
- **Tokenization:** SentencePiece tokenizer with 131,072 tokens
- **Maximum Sequence Length (context):** 8,192 tokens
- **Additional Features:**
  - Rotary embeddings (RoPE); see the sketch after this list
  - Supports activation sharding and 8-bit quantization
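
The rotary embeddings above can be illustrated with a short, self-contained JAX sketch; `apply_rope` and its toy shapes are hypothetical and are not this repository's model code.

```python
# Hypothetical RoPE sketch -- illustrative only.
import jax.numpy as jnp

def apply_rope(x, base=10000.0):
    """Rotate feature pairs by position-dependent angles.

    x: (seq_len, n_heads, head_dim) queries or keys; head_dim must be even.
    """
    seq_len, _, head_dim = x.shape
    half = head_dim // 2
    freqs = base ** (-jnp.arange(half) / half)               # one frequency per pair
    angles = jnp.arange(seq_len)[:, None] * freqs[None, :]   # (seq, half)
    cos = jnp.cos(angles)[:, None, :]                        # broadcast over heads
    sin = jnp.sin(angles)[:, None, :]
    x1, x2 = x[..., :half], x[..., half:]
    # A standard 2D rotation of each (x1, x2) pair encodes absolute position;
    # dot products between rotated vectors then depend only on relative offsets.
    return jnp.concatenate([x1 * cos - x2 * sin,
                            x1 * sin + x2 * cos], axis=-1)

q = jnp.ones((8, 48, 128))  # toy sequence; 48 heads matches the query-head count above
print(apply_rope(q).shape)  # (8, 48, 128)
```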

## Downloading Weights

You can download the weights using two methods:

1. **Using a Torrent Client:**
Download the weights using the following magnet link:
```
magnet:?xt=urn:btih:5f96d43576e3d386c9ba65b883210a393b68210e&tr=https%3A%2F%2Facademictorrents.com%2Fannounce.php&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce
```

2. **Directly from Hugging Face Hub:**
Clone the repository and use the following commands:
```shell
git clone https://github.com/xai-org/grok-1.git && cd grok-1
pip install huggingface_hub[hf_transfer]
huggingface-cli download xai-org/grok-1 --repo-type model --include ckpt-0/* --local-dir checkpoints --local-dir-use-symlinks False
```
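
If you would rather script the Hugging Face download than use the CLI, the same fetch can be written in a few lines of Python with the `huggingface_hub` API (a sketch, assuming `huggingface_hub` is installed as shown in method 2):

```python
# Python alternative to the huggingface-cli command above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns="ckpt-0/*",  # fetch only the checkpoint directory
    local_dir="checkpoints",    # the layout run.py expects
)
```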

## Usage

To test the code, run the following command:
```shell
python run.py
```
This script loads the checkpoint and samples from the model on a test input.

**Note:** Due to the large size of the model (314B parameters), a machine with sufficient GPU memory is required to test the model with the example code. The current implementation of the MoE layer may not be fully optimized; it was chosen to facilitate correctness validation without the need for custom kernels.
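
As a rough, back-of-the-envelope estimate (our own arithmetic, not an official requirement), holding the weights alone takes about one byte per parameter at 8-bit precision and two at bf16, before counting activations or the KV cache:

```python
# Illustrative weight-memory arithmetic for a 314B-parameter model.
n_params = 314e9
for dtype, bytes_per_param in [("int8", 1), ("bf16", 2), ("fp32", 4)]:
    print(f"{dtype}: ~{n_params * bytes_per_param / 1e9:,.0f} GB")
# int8: ~314 GB   bf16: ~628 GB   fp32: ~1,256 GB
```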

## License

The code and associated Grok-1 weights in this release are licensed under the Apache 2.0 license. This license only applies to the source files in this repository and the model weights of Grok-1.