LocalScore is an open-source benchmarking tool and public database for measuring how fast Large Language Models (LLMs) run on your specific hardware.
Check out localscore.ai to explore benchmark results.
LocalScore is a Mozilla Builders project.
LocalScore helps answer questions like:
- Can my computer run an 8 billion parameter LLM?
- Which GPU should I buy for my local AI setup?
- How does my current hardware stack up against others?
It measures three key performance metrics:
- Prompt Processing Speed: How quickly your system processes input text (tokens per second)
- Generation Speed: How fast your system generates new text (tokens per second)
- Time to First Token: The latency before the response begins to appear (milliseconds)
These metrics are combined into a single LocalScore, making it easy to compare different hardware configurations. A score of 1,000 is excellent, 250 is passable, and below 100 will likely mean a poor user experience or significant tradeoffs.
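To make the combination concrete, here is a minimal TypeScript sketch of how three such metrics could be folded into one number, assuming a simple geometric mean with TTFT inverted so that lower latency scores higher. This is purely illustrative; the actual LocalScore formula is defined in this repository's source and may differ.

```typescript
// Illustrative only: assumes a geometric mean with TTFT inverted.
// The actual LocalScore formula lives in this repo and may differ.
interface Metrics {
  promptTps: number;     // prompt processing speed (tokens/s)
  generationTps: number; // generation speed (tokens/s)
  ttftMs: number;        // time to first token (ms)
}

function illustrativeScore(m: Metrics): number {
  const ttftScore = 1000 / m.ttftMs; // invert latency: faster => higher
  return Math.cbrt(m.promptTps * m.generationTps * ttftScore);
}

// Example: 500 t/s prompt, 25 t/s generation, 200 ms to first token.
console.log(illustrativeScore({ promptTps: 500, generationTps: 25, ttftMs: 200 }));
```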
LocalScore leverages Llamafile to ensure portability and acceleration across different systems. It supports:
- CPUs (various architectures)
- NVIDIA GPUs
- AMD GPUs
- Apple Silicon (M series)
LocalScore currently supports single-GPU setups, which represent the most practical approach for most users running LLMs locally.
LocalScore maintains a public database of benchmark results. Currently, submissions are accepted from:
- The official LocalScore CLI client
We welcome contributions from other clients in the future. If you're developing a client that will submit to the LocalScore database, please ensure it conforms to the submission specification defined in src/pages/api/results, and reach out to [email protected] to have your client included.
The submission API expects properly formatted benchmark data including hardware details, model information, and performance metrics. Reviewing the existing implementation will provide the best guidance on the expected format.
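For illustration only, a submission payload might be shaped roughly like the sketch below. Every field name here is hypothetical; the authoritative schema is whatever the implementation in src/pages/api/results actually validates.

```typescript
// Hypothetical payload shape -- the authoritative schema is the
// implementation in src/pages/api/results, which will differ.
interface BenchmarkSubmission {
  hardware: {
    cpu: string;    // e.g. "AMD Ryzen 9 7950X"
    gpu?: string;   // omitted for CPU-only runs
    ramGb: number;
  };
  model: {
    name: string;         // e.g. "Llama 3.1 8B"
    quantization: string; // e.g. "Q4_K_M"
  };
  results: {
    promptTps: number;
    generationTps: number;
    ttftMs: number;
  };
}
```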
This is a Next.js Pages Router application. It uses SQLite (via libSQL) for the database and Drizzle ORM for database interactions. The repo ships with an example SQLite database which can be used for development and testing.
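To give a feel for the stack, here is a small Drizzle + libSQL sketch that queries a local SQLite file. The table, columns, and file path are hypothetical, for illustration only; the repo's real schema and database setup live in its own source.

```typescript
import { createClient } from "@libsql/client";
import { drizzle } from "drizzle-orm/libsql";
import { sqliteTable, text, real, integer } from "drizzle-orm/sqlite-core";

// Hypothetical table -- the repo defines its own schema.
const results = sqliteTable("results", {
  id: integer("id").primaryKey(),
  model: text("model"),
  generationTps: real("generation_tps"),
});

// Point libSQL at a local SQLite file (path illustrative).
const client = createClient({ url: "file:./local.db" });
const db = drizzle(client);

// Fetch a handful of rows to confirm the connection works.
const rows = await db.select().from(results).limit(10);
console.log(rows);
```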
Prerequisites:

- Bun / Node.js

To run the app locally:

- Install Bun (or the Node.js runtime of your choice) if you haven't already:

  ```bash
  curl -fsSL https://bun.sh/install | bash
  ```

  After installation, you may need to add Bun to your PATH; follow the instructions displayed after the install completes.

- Clone the repository:

  ```bash
  git clone [email protected]:cjpais/LocalScore.git
  cd LocalScore
  ```

- Install dependencies:

  ```bash
  bun install
  ```

- Start the development server:

  ```bash
  bun dev
  ```

- Open your browser and navigate to http://localhost:3000
An example SQLite database is included in the repository, so there's no need to set up a database for local development.
However, if you wish to use a remote (libSQL) database, configure the following .env variables. Turso is currently used in production, but other libSQL remote databases can be used as well:

```bash
TURSO_DATABASE_URL=
TURSO_AUTH_TOKEN=
```
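Wiring those variables in typically looks like the generic drizzle-orm/libsql setup below, with the client switched from a local file to the remote URL. This is a sketch of the usual pattern, not necessarily the repo's exact database module; the fallback file path is an assumption.

```typescript
import { createClient } from "@libsql/client";
import { drizzle } from "drizzle-orm/libsql";

// Use the remote libSQL (e.g. Turso) database when configured,
// otherwise fall back to a local SQLite file (path illustrative).
const client = createClient({
  url: process.env.TURSO_DATABASE_URL ?? "file:./local.db",
  authToken: process.env.TURSO_AUTH_TOKEN,
});

export const db = drizzle(client);
```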
Contributions are welcome! Here's how you can help:
- Fork the repository
- Clone your fork:

  ```bash
  git clone [email protected]:YOUR_USERNAME/LocalScore.git
  ```

- Add the upstream repository:

  ```bash
  git remote add upstream [email protected]:cjpais/LocalScore.git
  ```

- Create a new branch for your feature:

  ```bash
  git checkout -b feature/your-feature-name
  ```

- Make a pull request when you're ready
- Code Style: Follow the existing code style and use TypeScript
- Documentation: Update documentation for any changes you make
- Pull Requests: Keep PRs focused on a single feature or bug fix
- Issues: Check existing issues before opening a new one
Here are some features we're considering for the future:
- API Endpoints: Add public API endpoints for querying the database. If you have ideas for what you would build and what you would want/need, please let us know.
- Multi-GPU Support: Add support for multi-GPU setups.
- Upstreaming to llama.cpp: If the changes are welcome, we would love to upstream the LocalScore CLI client to llama.cpp.
If you have any ideas for features or improvements, please open an issue or submit a pull request.
We would love to hear your feedback! Please open an Issue/Discussion or reach out to [email protected] with any suggestions, questions, or comments.
LocalScore was created with support from Mozilla Builders as a resource for the AI community. It builds upon the excellent work of llama.cpp and Llamafile.
This project is licensed under the Apache 2.0 License.