Gemini models via Openrouter not supported #5621
Comments
@ekzhu do you know if OpenRouter presents all models as OpenAI-compatible, or is Gemini different?
@jackgerrits - OpenRouter's claim to fame is that they provide a unified API: all models can be accessed via an OpenAI-compatible API schema. https://openrouter.ai/docs/quickstart See also https://openrouter.ai/openai/o1/api They do say "very similar", but there is a reason the AI community is doubling down on OpenRouter and LiteLLM: we want a single interface for all AI models so that integrations stay model-agnostic. Hope this helps. If you find that Gemini's response is not OpenAI-compatible, post the logs here and I will file a bug with OpenRouter. However, it seems to me this isn't even about the API response format; there is some other mapping problem in AutoGen itself :) Copy-pasting the exact error from this ticket again so you can search for it in your code base. Waiting for this fix! Let's keep building!! Hyped that MS is putting some of the best minds in the world on this, so I'm doubling down on AutoGen over LangChain, CrewAI, and smolagents... lfg!
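To illustrate the "unified API" claim above: with an OpenAI-compatible schema, the same chat-completions request body works for every provider and only the model string changes. A minimal sketch — `build_payload` is a hypothetical helper of mine, not part of any library; it just mirrors the OpenAI chat-completions payload shape that OpenRouter documents:

```python
# Sketch: the same OpenAI-style payload, with only the model id swapped.
# build_payload is a hypothetical helper, not part of any library.

def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completions request body."""
    return {
        "model": model,  # e.g. "openai/gpt-4o-2024-11-20" or "google/gemini-2.0-flash-001"
        "messages": [{"role": "user", "content": prompt}],
    }

gpt = build_payload("openai/gpt-4o-2024-11-20", "Hello")
gemini = build_payload("google/gemini-2.0-flash-001", "Hello")

# Everything except the model id is identical -- that is the point of a unified API.
assert {k: v for k, v in gpt.items() if k != "model"} == \
       {k: v for k, v in gemini.items() if k != "model"}
```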
Yeah, I think this warning is probably okay, but I could be wrong here. As for the error in the issue: I don't have access to OpenRouter at the moment, so I will wait to see what @ekzhu thinks.
I am getting the same error from OpenRouter when using Claude models, but it works with OpenAI models. See my response in #5583. At this point I don't know what the cause is.
@ekzhu I am hoping that you are changing the family in your code when trying with Claude in the code snippet you pasted here - #5583, because Claude works just fine. This is the config that works:
Ignore the "api_type" key. Let me know if the above doesn't work for you for Claude models either. Like I said, Claude works, Gemini doesn't. So we need to be on the same page in terms of reproducibility of this issue, else it will die a slow death and so would my project :D Awaiting your response on this...
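The poster's actual config snippet did not survive the page scrape. For readers following along, here is a hedged sketch of what an OpenRouter client config for AutoGen 0.4 typically looks like; every value below is my illustrative assumption, not the poster's missing snippet, and the dict would be fed to `autogen-ext`'s `OpenAIChatCompletionClient`:

```python
# Sketch of an OpenRouter client config for AutoGen's OpenAIChatCompletionClient.
# All values are illustrative assumptions, not the poster's original snippet.
claude_config = {
    "model": "anthropic/claude-3.5-sonnet",      # OpenRouter model id
    "base_url": "https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    "api_key": "sk-or-...",                      # placeholder key
    "model_info": {                              # AutoGen needs this for non-OpenAI models
        "vision": True,
        "function_calling": True,
        "json_output": True,
        "family": "claude-3.5-sonnet",           # must match an autogen_core ModelFamily value
    },
}
```

The `model_info` block is the part this thread keeps circling back to: for non-OpenAI models AutoGen cannot infer capabilities from the model name, so the family string has to be supplied explicitly.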
It's more about the model name than the model family. Have you tried just calling OpenRouter directly using the
I had the same error and inspected the
|
I looked a bit more. I think the problem is:
These were the messages sent to the LLM when I had the error:
|
What happened?
The following code snippet works.
The above code also works well when we change
"model": "anthropic/claude-3.5-sonnet" -> "model": "openai/gpt-4o-2024-11-20"
and
"family": "claude-3.5-sonnet" -> "family": "gpt-4o"
However, when I change the model to Gemini Flash from here - https://openrouter.ai/google/gemini-2.0-flash-001
i.e. "model": "google/gemini-2.0-flash-001"
and
"family": "gemini-2.0-flash" (picked up from https://microsoft.github.io/autogen/stable//reference/python/autogen_core.models.html#autogen_core.models.ModelInfo), the code fails as follows. I tried with family "unknown" as well.
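The only delta between the working and failing runs described above is the model/family pair; a small sketch makes the repro matrix explicit (the config shape is illustrative, mirroring the keys quoted above):

```python
# Repro matrix from the report above: same config shape, only "model"/"family" differ.
working      = {"model": "anthropic/claude-3.5-sonnet",   "family": "claude-3.5-sonnet"}
also_working = {"model": "openai/gpt-4o-2024-11-20",      "family": "gpt-4o"}
failing      = {"model": "google/gemini-2.0-flash-001",   "family": "gemini-2.0-flash"}

# Identical keys in every case, so the failure is not a config-shape problem;
# it points at how AutoGen handles the Gemini family specifically.
assert working.keys() == also_working.keys() == failing.keys()
```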
Which packages was the bug in?
Python AgentChat (autogen-agentchat>=0.4.0)
AutoGen library version.
Python 0.4.7
Other library version.
No response
Model used
gpt-4o, Claude 3.5 Sonnet, Gemini 2.0 Flash
Model provider
OpenRouter
Other model provider
No response
Python version
3.12
.NET version
None
Operating system
MacOS