
Fix Baichuan2-7B-Chat #1987

Merged
merged 2 commits into from
Dec 8, 2023

Conversation

firebook
Contributor

@firebook firebook commented Dec 8, 2023

This PR fixes the Baichuan2-7B-Chat model by using "ROPE" instead of "ALIBI" when loading the Baichuan2-7B-Chat model.

related issues: #1561 #1092 #1085

before fix:

curl http://127.0.0.1:8080/v1/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Baichuan2-7B-Chat",
    "prompt": "<reserved_106>How are you?<reserved_107>",
    "temperature": 0,
    "max_tokens": 300,
    "stream": false
}'
{
"id":"cmpl-a6ae13d369714779a2e65d8220e00413",
"object":"text_completion","created":3634794,"model":"Baichuan2-7B-Chat",
"choices":[{"index":0,"text":"As a language model, I am a language agent","logprobs":null,"finish_reason":"stop"}],
"usage":{"prompt_tokens":6,"total_tokens":17,"completion_tokens":11}
}

after fix:

curl http://127.0.0.1:8080/v1/completions -H "Content-Type: application/json" -d '{
    "model": "Baichuan2-7B-Chat",
    "prompt": "<reserved_106>How are you?<reserved_107>",
    "temperature": 0,
    "max_tokens": 300,
    "stream": false
}'
{
"id":"cmpl-cadc09ae25294917a9e2d1d986475957",
"object":"text_completion","created":3635047,"model":"Baichuan2-7B-Chat",
"choices":[{"index":0,"text":"I'm fine, thank you! How about you?","logprobs":null,"finish_reason":"stop"}],
"usage":{"prompt_tokens":6,"total_tokens":19,"completion_tokens":13}}
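Based on the diff context below, the fix dispatches on the checkpoint's hidden size, since the 7B and 13B Baichuan checkpoints share one architecture name but use different position-embedding schemes. A minimal sketch of that selection logic (a hypothetical standalone helper, not the actual vLLM source; the `hidden_size` values assume 4096 for the 7B models and 5120 for the 13B models):

```python
def choose_position_embedding(hidden_size: int) -> str:
    """Pick the position-embedding scheme for a Baichuan checkpoint.

    Assumption: Baichuan2-7B (hidden_size 4096) uses rotary embeddings
    (ROPE), while Baichuan-13B / Baichuan2-13B (hidden_size 5120) use ALiBi.
    """
    if hidden_size == 4096:  # Baichuan2-7B
        return "ROPE"
    return "ALIBI"           # Baichuan-13B, Baichuan2-13B


print(choose_position_embedding(4096))  # 7B checkpoint
print(choose_position_embedding(5120))  # 13B checkpoint
```

Before this change, the 7B chat model was routed to the ALiBi path, which is why its completions degraded as shown above.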

@simon-mo
Collaborator

simon-mo commented Dec 8, 2023

Thank you for the fix!

@simon-mo simon-mo merged commit 2b98101 into vllm-project:main Dec 8, 2023
2 checks passed
Comment on lines +369 to +370
class BaichuanForCausalLM(BaiChuanBaseForCausalLM
): # baichuan 13b, baichuan2 13b, baichuan2 7b
Collaborator


Can we make this into two separate lines? This doesn't look good stylistically 😢

Collaborator


I think this is how yapf wanted it, but once we switch to black formatting this will be waaaaaaay better

Collaborator


you can stamp this PR: #1688 to make it happen!

hongxiayang pushed a commit to hongxiayang/vllm that referenced this pull request Feb 13, 2024