
Ollama GUI

A modern web interface for chatting with your local LLMs through Ollama


✨ Features

  • 🖥️ Clean, modern interface for interacting with Ollama models
  • 💾 Local chat history using IndexedDB
  • 📝 Full Markdown support in messages
  • 🌙 Dark mode support
  • 🚀 Fast and responsive
  • 🔒 Privacy-focused: All processing happens locally

🚀 Quick Start

Prerequisites (only needed for local development)

  1. Install Ollama
  2. Install Node.js (v16+) and Yarn

Local Development

```shell
# Start the Ollama server with your preferred model
ollama pull mistral  # or any other model
ollama serve

# Clone and run the GUI
git clone https://github.com/HelgeSverre/ollama-gui.git
cd ollama-gui
yarn install
yarn dev
```

Using the Hosted Version

To use the hosted version, run Ollama with:

```shell
OLLAMA_ORIGINS=https://ollama-gui.vercel.app ollama serve
```
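`OLLAMA_ORIGINS` is a comma-separated allow-list of origins for the Ollama server's CORS checks, so you can permit the hosted GUI and a local dev server at the same time. A minimal sketch (the `localhost:5173` origin is just an example, Vite's default dev port):

```shell
# OLLAMA_ORIGINS takes a comma-separated list of allowed origins
export OLLAMA_ORIGINS="https://ollama-gui.vercel.app,http://localhost:5173"

# Then start the server as usual; it now accepts requests from both origins:
#   ollama serve
```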

Docker Deployment

No need to install anything other than Docker.

If you have a GPU, uncomment the following lines in compose.yml:

```yaml
# deploy:
#   resources:
#     reservations:
#       devices:
#         - driver: nvidia
#           count: all
#           capabilities: [gpu]
```
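For orientation, a compose file for this setup typically has the shape sketched below. This is a hedged illustration, not the repository's actual compose.yml: the image tag, build context, and service layout are assumptions based on the defaults this README mentions (a container named `ollama`, the GUI on port 8080, models under `./ollama_data`).

```yaml
services:
  ollama:
    image: ollama/ollama              # official Ollama image
    container_name: ollama
    volumes:
      - ./ollama_data:/root/.ollama   # model storage on the host
    # deploy:                         # uncomment for NVIDIA GPU support
    #   resources:
    #     reservations:
    #       devices:
    #         - driver: nvidia
    #           count: all
    #           capabilities: [gpu]

  ollama-gui:
    build: .                          # assumption: GUI built from this repo
    ports:
      - "8080:8080"
    depends_on:
      - ollama
```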

Run

```shell
docker compose up -d

# Access at http://localhost:8080
```

Stop

```shell
docker compose down
```

Download more models

```shell
# Enter the ollama container
docker exec -it ollama bash

# Inside the container
ollama pull <model_name>

# Example
ollama pull deepseek-r1:7b
```

Restart the containers with `docker compose restart`.

Models are downloaded into the ./ollama_data folder in the repository. You can change this location in compose.yml.
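Changing the storage location is a one-line edit to the `ollama` service's volume mapping. The host path below is purely an example; `/root/.ollama` is where the official Ollama image keeps its models:

```yaml
services:
  ollama:
    volumes:
      - /mnt/big-disk/ollama:/root/.ollama   # example host path
```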

🛣️ Roadmap

  • Chat history with IndexedDB
  • Markdown message formatting
  • Code cleanup and organization
  • Model library browser and installer
  • Mobile-responsive design
  • File uploads with OCR support

🛠️ Tech Stack

📄 License

Released under the MIT License.
