Hey @ratkins,

- `.completion` -> unifies to OpenAI's `/chat/completions` API
- `.text_completion` -> unifies to OpenAI's `/completions` API
- `.responses` -> unifies to OpenAI's `/responses` API

Does this help? Where in the docs / readme would this have been helpful to have?
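A minimal sketch of that mapping and the two call shapes, assuming the `litellm` package and a configured provider key for the live calls (the model names here are illustrative, not recommendations):

```python
import os

# Mapping described above: LiteLLM method -> OpenAI-style endpoint it unifies to.
ENDPOINTS = {
    "completion": "/chat/completions",    # chat messages in, chat message out
    "text_completion": "/completions",    # raw prompt string in, raw text out
    "responses": "/responses",            # OpenAI Responses API shape
}

# The live calls below need a real API key, so they only run when one is set.
if os.environ.get("OPENAI_API_KEY"):
    import litellm

    # Chat-style: pass a list of role/content messages.
    chat = litellm.completion(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Say hello."}],
    )
    print(chat.choices[0].message.content)

    # Legacy text-style: pass a single prompt string.
    text = litellm.text_completion(
        model="gpt-3.5-turbo-instruct",
        prompt="Say hello.",
    )
    print(text.choices[0].text)
```

In short: reach for `completion()` for chat-tuned models (the common case), and `text_completion()` only when you want the older prompt-in/text-out interface.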
A cursory look at the GitHub front page leaves me with the impression that LiteLLM provides a unified interface to multiple (hosted and local) LLMs. Deeper investigation into the docs seems to imply there are different LiteLLM APIs for different model providers. I am confused.
(What doesn't help is that I can't find rendered versions of the Python SDK API docs. What types does `completion()` return? When should I use `completion()` and when should I use `text_completion()`? Both are questions I can't find answers to.)