AI Tools Streaming Responses via Python #175
-
Looking for the exact same solution. My clients' requirements are such that I have to use Flask at the backend. While I can do all the streaming responses for the regular chatbot, having the Vercel AI SDK makes it really easy to handle the different stages in a RAG application. For example, you can change the UI when the model is invoking a tool.
-
You need to implement the data stream protocol (https://sdk.vercel.ai/docs/ai-sdk-ui/stream-protocol#data-stream-protocol) from the Python side. For tool call streaming specifically, you first need to send a tool call streaming start part and then tool call delta parts (inside a step).
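As a rough illustration of that flow, here is a minimal Flask sketch that emits data-stream parts for a single tool call followed by a text step. The part prefixes (`f:`, `b:`, `c:`, `9:`, `a:`, `0:`, `e:`, `d:`), field names, and the `x-vercel-ai-data-stream` header value are my reading of the linked protocol docs and should be verified against the current version; the tool name, arguments, and route are made up for the example.

```python
import json
from flask import Flask, Response, stream_with_context

app = Flask(__name__)

def part(code: str, payload) -> str:
    # Each data-stream part is "<code>:<json>\n"
    return f"{code}:{json.dumps(payload)}\n"

@app.post("/api/chat")
def chat():
    def generate():
        # Step 1: start a step, stream the tool call, then close the step
        yield part("f", {"messageId": "msg-1"})
        yield part("b", {"toolCallId": "call-1", "toolName": "get_weather"})
        for delta in ['{"city": ', '"Berlin"}']:
            yield part("c", {"toolCallId": "call-1", "argsTextDelta": delta})
        yield part("9", {"toolCallId": "call-1", "toolName": "get_weather",
                         "args": {"city": "Berlin"}})
        yield part("a", {"toolCallId": "call-1", "result": {"temp_c": 21}})
        yield part("e", {"finishReason": "tool-calls",
                         "usage": {"promptTokens": 0, "completionTokens": 0},
                         "isContinued": True})
        # Step 2: stream text and finish the message
        yield part("f", {"messageId": "msg-1"})
        yield part("0", "It is 21 degrees in Berlin.")  # text part: 0:"..."
        yield part("e", {"finishReason": "stop",
                         "usage": {"promptTokens": 0, "completionTokens": 0},
                         "isContinued": False})
        yield part("d", {"finishReason": "stop",
                         "usage": {"promptTokens": 0, "completionTokens": 0}})

    return Response(
        stream_with_context(generate()),
        mimetype="text/plain",
        headers={"x-vercel-ai-data-stream": "v1"},
    )
```

On the frontend, useChat should then render the tool call and text parts as they arrive, which is what lets you switch the UI while a tool is being invoked.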
-
Is it possible to make a Python Flask app operate the same way a Next.js app does with Edge Functions?
We have a query that goes out to the OpenAI API and is designed to stream responses back as they are generated; however, that solution obviously did not work as a Serverless Function, and from what I can gather, Edge Functions have no way of interacting with a Flask app the same way. The new SDK doesn't seem to offer a solution to this either, unless I'm missing something very obvious.
Any ideas are welcome, thank you.
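For reference, plain token streaming from Flask (outside Edge/Serverless Functions, e.g. on a long-running server) can be done with a generator response. This is a minimal sketch assuming the official `openai` Python client (v1+); the route name and model are placeholders.

```python
from flask import Flask, Response, stream_with_context
from openai import OpenAI

app = Flask(__name__)
client = OpenAI()  # reads OPENAI_API_KEY from the environment

@app.post("/api/completion")
def completion():
    def generate():
        stream = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model
            messages=[{"role": "user", "content": "Hello"}],
            stream=True,
        )
        for chunk in stream:
            delta = chunk.choices[0].delta.content
            if delta:
                yield delta  # forward tokens to the client as they arrive

    # A chunked response keeps the connection open while tokens stream out
    return Response(stream_with_context(generate()), mimetype="text/plain")
```

The missing piece for the AI SDK frontend is then encoding those chunks in the data stream protocol, as discussed in the reply above.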