
How do I set the local model to chat instead of text generation? #118

Answered by Vali-98
tp1415926535 asked this question in Q&A

At the moment, local generations only use text completion.

I would recommend looking at your model's stop tokens to properly stop the model from generating further.
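Because raw text completion has no built-in notion of turns, the model will keep generating past the end of its reply unless the output is cut at the model's stop tokens. A minimal sketch of that trimming step, assuming a ChatML-style stop string; the function name and example tokens are illustrative, not the app's actual code:

```python
def apply_stop_tokens(completion: str, stop_tokens: list[str]) -> str:
    """Trim a raw text-completion at the first occurrence of any stop token.

    Illustrative helper: the real frontend may stream tokens and stop
    generation early, but the trimming logic is the same idea.
    """
    cut = len(completion)
    for tok in stop_tokens:
        idx = completion.find(tok)
        if idx != -1:
            cut = min(cut, idx)
    return completion[:cut]


# Example: a ChatML-tuned model emits "<|im_end|>" at the end of its turn,
# then keeps rambling; trimming at the stop token recovers just the reply.
raw = "Hello! How can I help?<|im_end|>\n<|im_start|>user"
print(apply_stop_tokens(raw, ["<|im_end|>"]))  # → Hello! How can I help?
```

Which stop strings to use depends on the model's prompt format (for example `<|im_end|>` for ChatML models or `</s>` for many Llama-style models), so check the model card for the format it was trained on.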

Replies: 1 comment 1 reply

Answer selected by tp1415926535