Fix AttributeError in streaming response cleanup #4236
base: main
Conversation
mattf
left a comment
@r-bit-rry a chain of `hasattr` like this suggests we've done something wrong in the design. have we, or can we just call `close`?
It really comes down to what we want to support. Since this was never strictly typed, I'm assuming there are other objects that can be generated by the `sse_generator`. And on a more serious note @mattf, it's our decision whether we want to enforce certain typings and act on them, or let this pattern "catch all".
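For reference, the pattern under discussion looks roughly like this (a minimal sketch of a `hasattr` chain, not the exact code in this PR):

```python
# Minimal sketch of the hasattr-chain cleanup being debated; the function name
# and structure are illustrative, not the actual code in this PR.
async def close_event_stream(event_gen):
    if hasattr(event_gen, "aclose"):
        # async generators / AsyncIterators expose aclose()
        await event_gen.aclose()
    elif hasattr(event_gen, "close"):
        # openai.AsyncStream exposes an async close() instead
        await event_gen.close()
```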
mattf
left a comment
as proposed, the `hasattr` chain will cover up an api contract bug somewhere in the system.
an `AsyncStream` is making it to a place where only `AsyncIterator`s should be.
i did a little sleuthing and i think there is a bug in at least `_maybe_overwrite_id`. there are multiple provider impls, so there may be others.
will you find the places where the api contract is being violated and patch them?
also, will you create a regression test that at least tests the openai mixin provider?
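A regression test along the lines requested might look roughly like the sketch below; `FakeAsyncStream`, `to_async_iterator`, and the assertions are hypothetical stand-ins rather than the project's actual OpenAI-mixin code, and the test assumes pytest-asyncio is installed:

```python
import pytest


class FakeAsyncStream:
    """Mimics openai.AsyncStream: async-iterable, async close(), no aclose()."""

    def __init__(self, chunks):
        self._chunks = list(chunks)
        self.closed = False

    async def __aiter__(self):
        for chunk in self._chunks:
            yield chunk

    async def close(self):
        self.closed = True


async def to_async_iterator(stream):
    # hypothetical stand-in for the provider-side fix: re-yield chunks so the
    # caller gets a real async generator (which has aclose()), and close the
    # underlying stream when iteration stops
    try:
        async for chunk in stream:
            yield chunk
    finally:
        await stream.close()


@pytest.mark.asyncio
async def test_wrapped_stream_exposes_aclose():
    source = FakeAsyncStream(["a", "b"])
    wrapped = to_async_iterator(source)
    assert await wrapped.__anext__() == "a"
    # cancelling mid-stream must go through aclose() without AttributeError
    await wrapped.aclose()
    assert source.closed
```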
@mattf sure thing, I'll start working on those
@mattf As I see it, we're facing two options to avoid the `hasattr` chain when treating the returned stream (see the sketch after the list below). Option 1: wrap the stream and re-yield each chunk; this is explicit and simple but carries a small overhead per chunk. Option 2: an adapter pattern, with direct delegation, no re-yielding, and more explicit intent. Regarding the locations of violations where we will need patching, these are the places I was able to spot:
- returned `AsyncStream` (has `close()`) instead of `AsyncIterator` (has `aclose()`)
- returned raw `client.chat.completions.create()` response
- returned raw `client.completions.create()` response
- returned raw `litellm.acompletion()` result
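A rough sketch of the two options (illustrative names and shapes, not the project's actual code):

```python
# Option 1: wrap the stream in an async generator and re-yield each chunk.
# Explicit and simple, but adds a small per-chunk indirection.
async def stream_as_async_iterator(stream):
    async for chunk in stream:
        yield chunk


# Option 2: an adapter that delegates iteration directly (no re-yielding) and
# maps the AsyncStream's async close() onto the aclose() the caller expects.
class AsyncStreamAdapter:
    def __init__(self, stream):
        self._stream = stream

    def __aiter__(self):
        return self._stream.__aiter__()

    async def aclose(self):
        await self._stream.close()
```

Either way, the SSE layer only ever sees an object that provides `aclose()`.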
@r-bit-rry great finds! it looks like we're violating the api contract and using …
This PR fixes issue #3185. The code calls `await event_gen.aclose()`, but OpenAI's `AsyncStream` doesn't have an `aclose()` method; it has `close()` (which is async). When clients cancel streaming requests, the server tries to clean up with:
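Roughly the following (a sketch of the cleanup path described here; the surrounding generator and cancellation handling are assumed, only the `aclose()` call is taken from this description):

```python
import asyncio

# Assumed shape of the server-side cleanup: on client cancellation, try to
# close whatever async iterable was being streamed.
async def sse_generator(event_gen):
    try:
        async for chunk in event_gen:
            yield chunk
    except asyncio.CancelledError:
        # raises AttributeError when event_gen is an openai.AsyncStream
        await event_gen.aclose()
        raise
```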
But `AsyncStream` has never had a public `aclose()` method. The error message literally tells us:
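(roughly; this is the standard CPython message for a missing attribute, and the exact traceback in issue #3185 may differ)

```
AttributeError: 'AsyncStream' object has no attribute 'aclose'. Did you mean: 'close'?
```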
Verification
`reproduce_issue_3185.sh` can be used to verify the fix.