I found that `OpenAIChatCompletionsModel` can only stream out text, but cannot stream out tool_call deltas. I think streaming is an important feature for applications that need instant responses.
There is a comment in the code saying "Because we don't know the name of the function until the end of the stream, we'll save everything and yield events at the end", but I don't understand it. Could you clarify why we cannot just yield `ResponseFunctionCallArgumentsDeltaEvent` (`openai.types.responses.response_function_call_arguments_delta_event`) with the ChatCompletions API?
For example, if your function is called `get_weather`, the first delta might contain `get_` and the second `weather`. So until we receive the last delta, we don't know the full name of the function.
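To make this concrete, here is a minimal sketch of the accumulation problem. The delta payloads below are hypothetical (simplified dicts, not real Chat Completions chunks), and `accumulate_tool_call` is an illustrative helper, not part of the library:

```python
def accumulate_tool_call(deltas):
    """Accumulate streamed tool_call deltas into one complete call.

    Each delta may carry a fragment of the function name and/or a
    fragment of the JSON arguments string; neither is complete until
    the stream ends.
    """
    name_parts = []
    args_parts = []
    for delta in deltas:
        if delta.get("name"):
            name_parts.append(delta["name"])
        if delta.get("arguments"):
            args_parts.append(delta["arguments"])
    return {"name": "".join(name_parts), "arguments": "".join(args_parts)}


# Hypothetical stream: the function name is split across two deltas,
# so any event yielded after the first delta would carry only "get_".
deltas = [
    {"name": "get_"},
    {"name": "weather"},
    {"arguments": '{"city": '},
    {"arguments": '"Paris"}'},
]

call = accumulate_tool_call(deltas)
print(call)  # → {'name': 'get_weather', 'arguments': '{"city": "Paris"}'}
```

This is why the implementation buffers everything: an argument-delta event emitted mid-stream could not be attached to a known function name yet.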