
[Suggestion] - Add "streaming" flag as a setting parameter #63

Open
gtrainar opened this issue Dec 5, 2024 · 1 comment
gtrainar commented Dec 5, 2024

I'm using the macOS app, and it works like a charm with LM Studio. Since it usually takes several seconds for the text to appear, I'd like to see the progress as it is generated. LM Studio offers the option to stream the content, and it would be a plus if this option could be activated/deactivated in the Settings menu.

@Aryamirsepasi (Collaborator) commented:

> I'm using the macOS app, and it works like a charm with LM Studio. Since it usually takes several seconds for the text to appear, I'd like to see the progress as it is generated. LM Studio offers the option to stream the content, and it would be a plus if this option could be activated/deactivated in the Settings menu.

Hi, thank you for your suggestion! I’m glad to hear you’re enjoying the app. 😊 That’s a great idea, and I’ll look into it. For now, the next version will feature a loading animation while waiting for a response. However, streaming the text would definitely provide a more seamless experience.
The challenge is that the app’s logic currently relies on sending the input to the AI and pasting the final response. To implement streaming, I’ll need to figure out how to integrate it smoothly. 🤔
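For context on what the integration would involve: LM Studio's local server speaks the OpenAI-compatible chat completions API, which streams responses as server-sent events ("data: {...}" lines carrying incremental deltas, terminated by "data: [DONE]"). The sketch below (Python for illustration only, since the app itself is a native macOS app; the parsing logic would translate to Swift) shows how the final-paste model changes: instead of one complete string, the client accumulates small content deltas as they arrive.

```python
import json

def iter_deltas(sse_lines):
    """Parse OpenAI-style SSE lines ("data: {...}") and yield text deltas.

    Each streamed chunk carries a partial response under
    choices[0].delta.content; the stream ends with "data: [DONE]".
    """
    for line in sse_lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip keep-alives and blank lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"].get("content")
        if delta:
            yield delta

# Instead of pasting one final string, the app would append each
# delta to the UI as it arrives. Sample stream for illustration:
sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
text = "".join(iter_deltas(sample))  # → "Hello"
```

The request side is unchanged except for passing `"stream": true` in the JSON body; the main app-side work is swapping a single paste for incremental appends.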

Thanks again for your feedback!
