[bug] parsing issues in frontend #166
Comments
We have reproduced this; it typically happens when the LLM returns a phone number.
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this issue will be closed.
I was also facing this issue: responses were getting split into bullets. While trying to solve it, I found that disabling the "Stream chat completion responses" option somehow fixes it. I hope this helps.
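That workaround is consistent with a parser running on each streamed chunk in isolation. A minimal sketch of the failure mode, assuming a regex-based citation parser (the names and the `[file.pdf]` citation format here are illustrative, not taken from the sample's actual code):

```python
import re

# Hypothetical citation marker format: [source_document.pdf]
CITATION = re.compile(r"\[([^\]]+)\]")

def parse_chunks_naively(chunks):
    """Parse each streamed chunk on its own; a marker split across
    two chunks never matches and is silently lost."""
    found = []
    for chunk in chunks:
        found.extend(CITATION.findall(chunk))
    return found

def parse_accumulated(chunks):
    """Accumulate the full text first, then parse once at the end."""
    return CITATION.findall("".join(chunks))

# A citation token arriving split across streamed chunks, alongside a
# phone number whose hyphens can also confuse chunk-boundary heuristics:
chunks = ["Call us at 1-800-", "555-0100 [contact", "_info.pdf]."]
print(parse_chunks_naively(chunks))  # the split citation is missed
print(parse_accumulated(chunks))     # ['contact_info.pdf']
```

Disabling streaming effectively forces the accumulated-buffer path, which would explain why the option hides the bug.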
Thank you for the context. We're evaluating moving that responsibility to the backend. When we first delivered this sample that wasn't possible, but we always planned to migrate the parser for performance reasons. I will reprioritize.
Any update on this issue?
Sometimes the parser fails to parse certain answers. For example, for the question "How to contact a representative?" I get this:
This also occurs quite often when citing sources after the backend is switched to the Python sample: