Update llama-run to include temperature option #10899
base: master
Conversation
Force-pushed from ca259bd to cd61ea0.
Tbh I'm not sure what the long-term plan for llama-run is. My thought is that if we add --temp now, I'm pretty sure someone will also add other sampling params like top-k, top-p, DRY, etc. in the near future, to the point that it defeats the initial goal of llama-run, which is "just run".
Force-pushed from cd61ea0 to ef5d16f.
I had a use case for --temp. If somebody has a use case for extra arguments, I don't have an immediate issue with merging them. Yes, I'd hope it would stay less complex than llama-cli, but personally I have no issue with people adding extra args if they need them.
This commit updates the `examples/run/README.md` file to include a new option for setting the temperature and updates the `run.cpp` file to parse this option. Signed-off-by: Eric Curtin <[email protected]>
Force-pushed from ef5d16f to d0c0945.
But of course, simple use cases should continue to work.
This should be an easy review @slaren @ggerganov