setup max tokens in HuggingFaceInferenceAPI #14243
ragesh2000 started this conversation in General
I am using HuggingFaceInferenceAPI as the LLM for my service, but I don't see a parameter to control the max new tokens in HuggingFaceInferenceAPI. Is there such a parameter?
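For what it's worth, a minimal sketch of how the cap could be set. The Hugging Face text-generation endpoint itself names this limit `max_new_tokens`; some llama-index versions expose a `num_output` field and a `generate_kwargs`-style passthrough on `HuggingFaceInferenceAPI`, but those names are assumptions to verify against your installed version, not a confirmed API:

```python
# Hedged sketch, not an authoritative llama-index reference.
# The HF text-generation endpoint caps output length with `max_new_tokens`;
# llama-index is assumed to forward generation kwargs to that endpoint.

generation_kwargs = {
    "max_new_tokens": 256,  # hard cap on newly generated tokens
    "temperature": 0.7,
}

# Assumed usage (requires llama-index and an HF token; parameter names
# `num_output` and `generate_kwargs` are assumptions -- check your version):
# from llama_index.llms.huggingface import HuggingFaceInferenceAPI
# llm = HuggingFaceInferenceAPI(
#     model_name="HuggingFaceH4/zephyr-7b-beta",
#     num_output=256,                     # llama-index-side cap (assumption)
#     generate_kwargs=generation_kwargs,  # forwarded to the endpoint (assumption)
# )

print(generation_kwargs["max_new_tokens"])
```

If no constructor parameter exists in your version, the underlying `huggingface_hub.InferenceClient.text_generation` call does accept `max_new_tokens` directly, so that is the value to get plumbed through.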