
Conversation

@lvliang-intel
Collaborator

Description

Fix LLM special token issue.

There is a special token in the LLM streaming response that needs to be ignored before the chunk is forwarded to the client:

data: b'<|im_end|>'

data: [DONE]
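
For context, below is a minimal sketch of the kind of filtering this fix applies, assuming the service wraps the model output in an async generator before emitting SSE chunks. The function name, the SPECIAL_TOKENS set, and the shape of the stream are illustrative assumptions, not the exact code changed in this PR.

```python
# Minimal sketch, not the exact code in this PR: skip special tokens such as
# "<|im_end|>" before forwarding LLM output as SSE "data:" chunks.
# stream_generator, SPECIAL_TOKENS, and llm_stream are illustrative names.

SPECIAL_TOKENS = {"<|im_end|>"}

async def stream_generator(llm_stream):
    """Yield SSE data lines from an async LLM token stream, dropping special tokens."""
    async for chunk in llm_stream:
        text = chunk.decode("utf-8") if isinstance(chunk, bytes) else str(chunk)
        if text.strip() in SPECIAL_TOKENS:
            # End-of-turn markers are model control tokens, not user-visible text.
            continue
        yield f"data: {text}\n\n"
    yield "data: [DONE]\n\n"
```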

Issues

n/a

Type of change

List the type of change like below. Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds new functionality)
  • Breaking change (fix or feature that would break existing design and interface)
  • Others (enhancement, documentation, validation, etc.)

Dependencies

None

Tests

Local tests and CI.

Signed-off-by: lvliang-intel <[email protected]>
@chensuyue chensuyue added this to the v1.1 milestone Nov 13, 2024
@lvliang-intel lvliang-intel merged commit 517a5b0 into opea-project:main Nov 14, 2024
madison-evans pushed a commit to SAPD-Intel/GenAIComps that referenced this pull request May 12, 2025
* Fix LLM special token issue

Signed-off-by: lvliang-intel <[email protected]>

* update code

Signed-off-by: lvliang-intel <[email protected]>

* update logic

Signed-off-by: lvliang-intel <[email protected]>

* update vllm llm

Signed-off-by: lvliang-intel <[email protected]>

---------

Signed-off-by: lvliang-intel <[email protected]>
Co-authored-by: ZePan110 <[email protected]>