Hi everyone,

I have a question regarding the `temperature` parameter in the Hugging Face Inference API, particularly in the context of chat models. According to the documentation, the default value for `temperature` is 1. However, I noticed that some models ship a different default, such as 0.6, specified in their `generation_config.json` file.
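For context, this is roughly how I inspected a model's own defaults (a minimal sketch; the model ID is just an example I had at hand, and the printed value is whatever that model's author set):

```python
# Minimal sketch of checking a model's own generation defaults.
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="Qwen/Qwen2.5-7B-Instruct",  # example model; swap in your own
    filename="generation_config.json",
)
with open(path) as f:
    gen_config = json.load(f)

# Prints whatever the model author set (e.g. 0.6 or 0.7), which may differ
# from the documented global default of 1.
print(gen_config.get("temperature"))
```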
Here are my questions:
1. When using the Inference API, if I don't explicitly set the `temperature` parameter, does the API use the model's default value from its `generation_config.json`? Or does it fall back to the global default of 1 mentioned in the docs?
2. If I don't pass any additional parameters (like `max_length`, `top_p`, etc.), does the API automatically use all the defaults specified in the model's `generation_config.json` file? Or are there other fallback defaults on the API side? (The sketch after these questions shows how I currently pin the parameters explicitly.)
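Right now I work around the uncertainty by pinning every sampling parameter explicitly, along these lines (a minimal sketch; the model ID and values are just examples):

```python
# Minimal sketch of setting sampling parameters explicitly so that neither
# an API-side default nor the model's generation_config.json has to apply.
from huggingface_hub import InferenceClient

client = InferenceClient(model="Qwen/Qwen2.5-7B-Instruct")  # example model

response = client.chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    temperature=0.6,  # set explicitly instead of relying on any default
    top_p=0.9,
    max_tokens=256,
)
print(response.choices[0].message.content)
```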
Thank you in advance for your help!