
Fully localized suggestion #107

Open
AINXTGENStudio opened this issue Jun 9, 2024 · 2 comments
@AINXTGENStudio

I believe this project has huge potential if it offered an option to run fully locally and privately. For example, it could add LM Studio integration to support any downloaded LLM or VLM, such as Phi-3 Vision, and integrate xVASynth or XTTS for local TTS.

@onuratakan onuratakan self-assigned this Jun 14, 2024
@onuratakan
Member

Hi, thank you so much <3. Actually, there is an Ollama integration for now, but yeah, LM Studio could be a good addition as well.

@AINXTGENStudio
Author

Thanks for the consideration. Most of these UIs are actually built on llama.cpp, just like Ollama. LM Studio's UI, though, is very user friendly and can pull any GGUF model and its quant variants from a Hugging Face URL. LM Studio also has a local server feature that your project could connect to; by default it serves at http://localhost:1234/v1.
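
For illustration, here's a minimal sketch of connecting to that local endpoint with the OpenAI-compatible Python client (the model name and API key below are placeholders; LM Studio serves whichever model is currently loaded and ignores the key):

```python
# Minimal sketch: point an OpenAI-compatible client at LM Studio's local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's default local server address
    api_key="lm-studio",  # placeholder; the server ignores it, but the client requires one
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio uses whichever model is loaded
    messages=[{"role": "user", "content": "Hello from a fully local setup!"}],
)
print(response.choices[0].message.content)
```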
