Feature Request: Language / Localization support (UI + LLM response language) #1063

@Luna2026-a11y

Description
Feature Request: Localization & Language Configuration

Summary

Currently, Vane does not provide any option to configure the language used by the UI or the language in which the LLM generates its responses. For users who primarily speak languages other than English (e.g. French, German, Spanish), this creates a friction point since responses are always generated in English regardless of the query language.

Proposed Solution

Add a language preference setting in the settings panel that would:

  1. Set the LLM system prompt language — inject a language instruction into the system prompt so the model always responds in the configured language (e.g. "Always respond in French.")
  2. Localize the UI — translate interface elements (Discover page labels, search mode names, placeholders, etc.) using an i18n framework (e.g. next-intl or react-i18next)
  3. Localize the Discover feed — allow filtering/sourcing Discover content in the preferred language
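For point 1, the prompt-side change could be small. A minimal sketch in TypeScript, assuming a user-configurable language preference (all names here are hypothetical, not existing Vane code):

```typescript
// Hypothetical sketch: prepend a language instruction to the existing
// system prompt based on the user's configured language preference.
const LANGUAGE_INSTRUCTIONS: Record<string, string> = {
  en: "", // default: no extra instruction
  fr: "Always respond in French.",
  de: "Always respond in German.",
  es: "Always respond in Spanish.",
};

function buildSystemPrompt(basePrompt: string, lang: string): string {
  const instruction = LANGUAGE_INSTRUCTIONS[lang] ?? "";
  // Append the instruction only when a non-default language is configured.
  return instruction ? `${basePrompt}\n\n${instruction}` : basePrompt;
}
```

With `lang` read from the settings panel, every chat request would then use `buildSystemPrompt(defaultSystemPrompt, lang)` instead of the raw prompt.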

Why it matters

Vane positions itself as a privacy-first, self-hosted alternative to Perplexica/Perplexity. Many self-hosters are non-English speakers. Without language support, the tool feels incomplete for a significant portion of its potential userbase.

Workaround (current)

As a workaround, it is possible to create a custom Ollama model with a French system prompt, either via a Modelfile or via the `from` and `system` fields of the Ollama create API. While functional, this is not user-friendly and does not address UI or Discover localization.
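For reference, the Modelfile for this workaround is only two lines (model name `llama3` is just an example; substitute whatever base model you use):

```
FROM llama3
SYSTEM "Réponds toujours en français."
```

Running `ollama create vane-fr -f Modelfile` then produces a model that can be selected in Vane and answers in French, but the UI itself stays in English.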

Environment

  • Vane version: slim-latest (1.12.1)
  • Deployment: Docker, self-hosted
  • LLM provider: Ollama (local)

Thanks for the great project! 🙏
