Groq Client #3003
Conversation
Great :)
Sorry, found a bug when testing logging. Minor, but will fix if someone wants to log this (or Together.AI).
Fixed, thank you, good to go :)
lgtm. thanks @marklysze
@marklysze if this is merged before the blogpost on non-OpenAI models, please also add the Groq Client in the blogpost.
Good call, will do
Thanks @qingyun-wu! @Hk669, I've updated the blog #2965. |
This is a Groq Client that provides a way to use the (rather fast) Groq cloud inference service, which offers a selection of LLMs, such as Meta's Llama 3 and Mistral AI's Mixtral.
This client supports text generation and function/tool calling (when using certain models, as noted by Groq).
Streaming is supported, including with function/tool calling.
Token costs are also implemented.
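As a rough sketch of how a client like this would typically be configured in AutoGen, the snippet below builds an LLM config pointing at a Groq-hosted model. The `"api_type": "groq"` key, the model name, and the `build_llm_config` helper are illustrative assumptions following the pattern of AutoGen's other non-OpenAI clients; check the merged docs for the exact keys.

```python
import os

# Hypothetical configuration sketch for the Groq client described in this PR.
# The "api_type" value and model name are assumptions, not confirmed API.
config_list = [
    {
        "model": "llama3-70b-8192",                     # a Groq-hosted model
        "api_key": os.environ.get("GROQ_API_KEY", ""),  # read key from env
        "api_type": "groq",                             # route to the Groq client
    }
]

def build_llm_config(config_list, temperature=0.5):
    """Assemble the llm_config dict an agent would typically consume."""
    return {"config_list": config_list, "temperature": temperature}

llm_config = build_llm_config(config_list)
print(llm_config["config_list"][0]["api_type"])  # groq
```

An agent would then be constructed with `llm_config=llm_config`, and streaming or tool calling enabled the same way as with the OpenAI client.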