
Yua Desktop - AI Chat Assistant

Yua Logo

An Electron-based AI chat assistant with local LLM support


Features

Implemented

  • Chat with local LLMs via Ollama
  • Model switching (Yua / Yua Pro)
  • Session management
  • Dark theme UI
  • Streaming responses
  • Message history with SQLite

Not Yet Implemented (UI placeholders)

  • Video generation
  • Agent functionality
  • File/Image upload

These features appear in the UI but are not yet functional. Contributions are welcome; feel free to implement one and submit a PR.

Requirements

  • Node.js 18.x+
  • Ollama running locally (http://localhost:11434)

Ollama Setup

# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Start Ollama
ollama serve

# Pull models
ollama pull gpt-oss:20b-cloud   # Yua (lightweight)
ollama pull gpt-oss:120b-cloud  # Yua Pro (powerful)
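
Before launching the app, you can confirm that Ollama is reachable and the models are installed by querying Ollama's /api/tags endpoint. A minimal TypeScript sketch (assumes Node.js 18+, which provides a global fetch; the script itself is illustrative and not part of this repository):

// check-ollama.ts (illustrative helper, not part of this repository)
const base = process.env.OLLAMA_URL ?? "http://localhost:11434";

async function checkOllama(): Promise<void> {
  // GET /api/tags lists the models available to the local Ollama instance
  const res = await fetch(`${base}/api/tags`);
  if (!res.ok) throw new Error(`Ollama not reachable at ${base} (HTTP ${res.status})`);
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log("Installed models:", models.map((m) => m.name).join(", "));
}

checkOllama().catch((err) => {
  console.error(err);
  process.exit(1);
});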

Quick Start

# Clone
git clone https://github.com/YUALAB/yua-desktop.git
cd yua-desktop

# Install dependencies
npm install

# Start development
npm run dev

Build

# Build for current platform
npm run build

# Platform-specific builds
npm run dist:win    # Windows
npm run dist:mac    # macOS
npm run dist:linux  # Linux

Tech Stack

Category      Technology
Framework     Electron 28
UI            React 18 + TypeScript
Styling       Tailwind CSS
State         Zustand
Animation     Framer Motion
Database      SQLite (better-sqlite3)
Validation    Zod
Build         Vite
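
As a rough illustration of how Zustand handles state in a setup like this, a chat store could look like the sketch below. The store name, message shape, and actions are assumptions for illustration, not the repository's actual code:

// renderer/store/chatStore.ts (hypothetical shape, for illustration only)
import { create } from "zustand";

interface Message {
  role: "user" | "assistant";
  content: string;
}

interface ChatState {
  messages: Message[];
  isStreaming: boolean;
  addMessage: (msg: Message) => void;
  appendToLast: (chunk: string) => void; // used while a response streams in
  setStreaming: (on: boolean) => void;
}

export const useChatStore = create<ChatState>((set) => ({
  messages: [],
  isStreaming: false,
  addMessage: (msg) => set((s) => ({ messages: [...s.messages, msg] })),
  appendToLast: (chunk) =>
    set((s) => ({
      messages: s.messages.map((m, i) =>
        i === s.messages.length - 1 ? { ...m, content: m.content + chunk } : m
      ),
    })),
  setStreaming: (on) => set({ isStreaming: on }),
}));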

Architecture

src/
├── main/           # Electron main process
│   ├── managers/   # DI container & managers
│   ├── features/   # Feature modules
│   └── handlers/   # IPC handlers
├── renderer/       # React application
│   ├── components/ # UI components
│   ├── store/      # Zustand stores
│   ├── services/   # API services
│   └── business/   # Business logic
├── domain/         # Domain models & services
├── infrastructure/ # Database implementation
└── preload/        # Electron preload scripts
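
The split between main/handlers and preload/ follows Electron's standard contextBridge pattern: the main process registers IPC handlers, and the preload script exposes a narrow, typed API to the renderer. A minimal sketch (channel name, API shape, and payloads are assumptions, not the repository's actual contracts):

// main/handlers/chatHandler.ts (hypothetical wiring)
import { ipcMain } from "electron";

ipcMain.handle("chat:send", async (_event, prompt: string) => {
  // In the real app this would delegate to a feature module / manager.
  return `echo: ${prompt}`;
});

// preload/index.ts (hypothetical bridge)
import { contextBridge, ipcRenderer } from "electron";

contextBridge.exposeInMainWorld("yua", {
  sendChat: (prompt: string): Promise<string> => ipcRenderer.invoke("chat:send", prompt),
});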

Configuration

The app connects to Ollama at http://localhost:11434 by default.

To use a different endpoint, set the environment variable:

OLLAMA_URL=http://your-ollama-server:11434 npm run dev
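
For reference, here is a rough sketch of how a service could stream a reply from Ollama's /api/chat endpoint using that base URL. The endpoint and its newline-delimited JSON response format are standard Ollama; the function itself, and where the URL is resolved (main process vs. renderer via Vite env handling), are assumptions rather than the repository's actual service code:

// Illustrative streaming client for Ollama's /api/chat (not the repository's actual code)
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

export async function* streamChat(model: string, prompt: string): AsyncGenerator<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    body: JSON.stringify({
      model, // e.g. "gpt-oss:20b-cloud"
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.ok || !res.body) throw new Error(`Ollama request failed (HTTP ${res.status})`);

  // Ollama streams one JSON object per line; accumulate partial lines between chunks.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const data = JSON.parse(line);
      if (data.message?.content) yield data.message.content as string;
      if (data.done) return;
    }
  }
}

A consumer would iterate the generator and append each chunk to the current assistant message, for example via a store action like the appendToLast sketch shown earlier.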

Contributing

Contributions are welcome! This project has a clean architecture that makes it easy to extend.

Areas for Contribution

  • Implement file upload functionality
  • Add agent/tool capabilities
  • Video generation integration
  • Additional model providers
  • Internationalization (i18n)

Development

# Fork and clone
git clone https://github.com/YOUR_USERNAME/yua-desktop.git

# Create feature branch
git checkout -b feature/your-feature

# Run tests
npm test

# Submit PR

License

MIT License - see LICENSE for details.

About

Developed by YUA LAB (AQUA LLC)
