StreamData response Lags #1963
Comments
@lgrammel any pointers?
We're facing this issue too. It happens on all models, even without LangChain. When we reverted the ai package from v3 to v2, the streaming worked fine with no lag.
@amadk do you have a minimal reproduction that you could share?
OK, it seems to be working fine with the minimal reproduction: just plain Next.js (pages router) and the AI SDK, no other libraries. We're going to try upgrading our UI library and refactoring the components on the chat page to see if it helps. It was just strange because AI SDK v2 worked fine with our current setup. Will investigate a bit more and send updates.
Any update on this? It's happening on our end as well; we can't seem to get it to work!
Hi, sorry for the super late response. The problem started occurring for us when we used react-syntax-highlighter with the AI SDK. I think the problem is not that the AI SDK is laggy (it's actually super fast); the problem is that it causes the components to re-render so fast, especially with fast models like Claude Sonnet, that the page begins to freeze and becomes laggy. So we added a throttle to fix it:
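The snippet the commenter posted is not preserved in this excerpt. Below is a minimal sketch of the idea, assuming a hand-rolled leading-edge throttle (in practice you might use lodash's `throttle`, or the `experimental_throttle` option that later versions of `useChat` expose); the names here are illustrative, not from the thread:

```typescript
// Leading-edge throttle: invoke `fn` at most once per `intervalMs`.
// Calls arriving inside the window are dropped, which is fine for render
// triggers where only the latest state matters on the next pass anyway.
function throttle<T extends unknown[]>(
  fn: (...args: T) => void,
  intervalMs: number
): (...args: T) => void {
  let last = 0;
  return (...args: T) => {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn(...args);
    }
  };
}

// Simulate a stream producing tokens far faster than the UI should re-render.
let renders = 0;
const scheduleRender = throttle(() => { renders += 1; }, 100);
for (let i = 0; i < 1000; i++) scheduleRender(); // token burst
// Only the first call in the 100 ms window fires, so renders === 1.
```

The dropped calls are harmless here because each render reads the full current message state; throttling only bounds how often React reconciles the (expensive) syntax-highlighted tree.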
@amadk Great insight, thanks!
Description
Hi! I'm experiencing an issue with the stream data response. Mainly there are two issues:

1. The stream data response lags.
2. The `sources` data: the same data is repeated multiple times as new entries in the returned array.

When I changed `chatMode` to `text` and the endpoint was not adding any streamData, it was working perfectly fine without any issues.

Versions:
"ai": "^3.1.30",
"@ai-sdk/openai": "^0.0.9",
"next": "14.1.4",
"@langchain/community": "^0.0.53",
"@langchain/core": "^0.1.61",
"@langchain/openai": "^0.0.28",
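As a client-side workaround for the second issue, the accumulated stream-data array can be de-duplicated before rendering. A minimal sketch; the `SourceEntry` shape and the `dedupeSources` helper are hypothetical illustrations, not part of the AI SDK:

```typescript
// Hypothetical shape of one streamed data entry carrying a source reference.
interface SourceEntry {
  url: string;
}

// Drop repeated entries (same url), keeping the first occurrence in order.
function dedupeSources(entries: SourceEntry[]): SourceEntry[] {
  const seen = new Set<string>();
  return entries.filter((entry) => {
    if (seen.has(entry.url)) return false;
    seen.add(entry.url);
    return true;
  });
}

dedupeSources([{ url: "a" }, { url: "a" }, { url: "b" }]);
// → [{ url: "a" }, { url: "b" }]
```

This masks the symptom rather than fixing the duplicate emission on the server, but it keeps the rendered source list stable while the underlying bug is investigated.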
Code example
Here's part of the client-side component:
Update
When stream data is removed from the response like below, it still lags. So the problem might be related to `LangchainAdapter` or `stream-data` in the `useChat` hook.
Additional context
No response