What happened?
I was testing the example in the link @ekzhu provided in the issue. Instead of testing the example as-is, I ran it with `model_client_stream=True` and got the error below.
"""
autogen/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1786, in validate_response_format raise TypeError( TypeError: You tried to pass a BaseModelclass tochat.completions.create(); You must use beta.chat.completions.parse() instead
"""
What did you expect to happen?
It should support `model_client_stream=True`.
How can we reproduce it (as minimally and precisely as possible)?
```python
from pydantic import BaseModel
from typing import Literal

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_core import CancellationToken
from autogen_ext.models.openai import OpenAIChatCompletionClient

# The response format for the agent as a Pydantic base model.
class AgentResponse(BaseModel):
    thoughts: str
    response: Literal["happy", "sad", "neutral"]

model_client = OpenAIChatCompletionClient(
    model="gemini-2.0-flash-exp",
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",
    api_key="REDACTED",  # API key redacted
    model_info={
        "vision": True,
        "function_calling": True,
        "json_output": True,
        "family": "unknown",
    },
    response_format=AgentResponse,
)

agent = AssistantAgent(
    name="Agent",
    model_client=model_client,
    tools=tools,  # `tools` is defined elsewhere in the original report
    # system_message="You are an assistant with some tools that can be used to answer some questions",
    system_message="Categorize the input as happy, sad, or neutral following the JSON format.",
    model_client_stream=True,
)

async for message in agent.on_messages_stream(
    [TextMessage(content="I am happy.", source="user")], CancellationToken()
):
    print(f"{message}", end=" ")
```
AutoGen version
0.4.7
Which package was this bug in
AgentChat
Model used
No response
Python version
No response
Operating system
No response
Any additional info you think would be helpful for fixing this bug
No response
ekzhu changed the title from "TypeError: You tried to pass a `BaseModel` class to `chat.completions.create()`; You must use `beta.chat.completions.parse()` instead" to "OpenAIChatCompletionClient's create_stream is not compatible with structured output" on Feb 19, 2025.
`create_stream` should be updated to use `parse()` instead of `create()` when there is a `BaseModel` class in the `response_format` field. See the implementation of `create` for reference:
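A minimal sketch of the routing check such a fix could use, assuming the client receives the same `response_format` value the user configured above. The helper name `needs_parse` is hypothetical and not part of AutoGen or the OpenAI SDK; it only illustrates how to distinguish a Pydantic model class from the plain-dict response formats that `create()` already accepts:

```python
from pydantic import BaseModel

def needs_parse(response_format: object) -> bool:
    # A Pydantic model class must be routed to beta.chat.completions.parse();
    # plain dicts like {"type": "json_object"} are fine for create().
    return isinstance(response_format, type) and issubclass(response_format, BaseModel)

class AgentResponse(BaseModel):
    thoughts: str
    response: str

print(needs_parse(AgentResponse))            # True: route to parse()
print(needs_parse({"type": "json_object"}))  # False: create() is fine
```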