Streaming the agent team's response #5625
You can find the docs describing token streaming here: https://microsoft.github.io/autogen/stable/user-guide/agentchat-user-guide/tutorial/agents.html#streaming-tokens Essentially, you need to write logic that will consume the stream.

Please share the sample code for this.

Please see the link I sent:

```python
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_core import CancellationToken
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4o")

streaming_assistant = AssistantAgent(
    name="assistant",
    model_client=model_client,
    system_message="You are a helpful assistant.",
    model_client_stream=True,  # Enable streaming tokens.
)

# Use an async function and asyncio.run() in a script.
async for message in streaming_assistant.on_messages_stream(  # type: ignore
    [TextMessage(content="Name two cities in South America", source="user")],
    cancellation_token=CancellationToken(),
):
    print(message)
```
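The pattern being described, consuming an async stream and accumulating its chunks, can be driven from a regular script with `asyncio.run()`. Here is a minimal, self-contained sketch of that consume-the-stream loop; `fake_stream` is a hypothetical stand-in for `on_messages_stream` (yielding plain strings instead of streaming-chunk events) so the sketch runs without a model client:

```python
import asyncio


async def fake_stream():
    # Hypothetical stand-in for an agent's token stream: in real code each
    # item would be a streaming chunk event, not a plain string.
    for chunk in ["Bue", "nos ", "Aires, ", "Lima"]:
        yield chunk


async def consume():
    # Accumulate streamed chunks into the full response text.
    parts = []
    async for chunk in fake_stream():
        parts.append(chunk)
    return "".join(parts)


if __name__ == "__main__":
    print(asyncio.run(consume()))  # → Buenos Aires, Lima
```

The same loop shape applies to the real stream: replace `fake_stream()` with the agent's stream call and decide per message type whether to print, forward, or accumulate it.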
Use Streams, not an HTTP Trigger

From the code I see above, it seems you are trying to stream output over an HTTP connection in an Azure Function. A standard HTTP trigger returns a single buffered response and is not suitable for streaming. I found this documentation on supporting HTTP streams in Azure Functions for Python. Please read it. It seems your code should look like this:

```python
import azure.functions as func
from azurefunctions.extensions.http.fastapi import Request, StreamingResponse

from autogen_agentchat.base import TaskResult
from autogen_core import CancellationToken

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)


async def stream_messages(result_stream):
    """Convert agent messages to SSE format."""
    async for msg in result_stream:
        if isinstance(msg, TaskResult):
            continue
        # Format the message as SSE data.
        yield f"data: {msg.model_dump()}\n\n"


@app.route(route="chat", methods=[func.HttpMethod.POST])
async def chat_stream(req: Request) -> StreamingResponse:
    """Endpoint to stream chat responses."""
    task = req.query_params.get('task') or (await req.json()).get('task')
    # get_selector_group_chat_team() is your own helper that builds the team.
    agent_team = await get_selector_group_chat_team()
    result_stream = agent_team.run_stream(
        task=task, cancellation_token=CancellationToken()
    )
    return StreamingResponse(
        stream_messages(result_stream),
        media_type="text/event-stream",
    )
```

Please see our examples on streaming to a front end. We have several examples that can help build an understanding of using streams in a frontend app.
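For reference, the server-sent events (SSE) framing produced above is just a `data:` line per event followed by a blank line. A small self-contained sketch, using plain dicts as hypothetical stand-ins for the autogen message objects, shows exactly what the browser receives on the wire:

```python
import asyncio
import json


async def message_source():
    # Hypothetical stand-in for agent_team.run_stream(): yields plain dicts
    # instead of autogen message objects.
    for payload in [
        {"source": "assistant", "content": "Lima"},
        {"source": "assistant", "content": "Quito"},
    ]:
        yield payload


async def to_sse(stream):
    # Each SSE event is a "data:" line terminated by a blank line.
    async for msg in stream:
        yield f"data: {json.dumps(msg)}\n\n"


async def collect():
    return [event async for event in to_sse(message_source())]


if __name__ == "__main__":
    for event in asyncio.run(collect()):
        print(event, end="")
```

On the client side, `EventSource` in the browser (or any SSE-aware fetch helper) parses these `data:` frames back into individual messages as they arrive.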
What feature would you like to be added?
How do I stream the run_stream output to the front end? I am using this group chat agent in my Azure Functions app, and I need to send the streaming output from the function app.
I have tried the 'azurefunctions-extensions-http-fastapi' method, but it is failing due to version conflicts. Please help me overcome this issue.
Why is this needed?
For production deployment