StaticLLMPipeline: Enable chat test #1117

Open

TolyaTalamanov wants to merge 16 commits into master

Conversation

TolyaTalamanov (Collaborator)

No description provided.

github-actions bot added the category: sampling (Sampling / Decoding algorithms) label on Oct 31, 2024
ilya-lavrenov added this to the 2024.5 milestone on Nov 1, 2024
ilya-lavrenov added the category: LLM (LLM pipeline: stateful, static) label and removed the category: sampling label on Nov 1, 2024
@pytest.mark.precommit
@pytest.mark.nightly
def test_chat_generation(model_descr):
TolyaTalamanov (Collaborator, author)

Thanks!
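
The body of the enabled test is elided in this excerpt. As a minimal sketch only, assuming openvino_genai.LLMPipeline's start_chat()/finish_chat() chat API and the read_model()/get_chat_models_list() helpers discussed later in this review, it might look like:

    def test_chat_generation(model_descr):
        # Hypothetical body: read_model() is assumed to convert the model with
        # Optimum and return a tuple whose last element is the pipeline.
        pipe = read_model(get_chat_models_list()[0])[4]
        pipe.start_chat()
        for question in questions:
            answer = pipe.generate(question, max_new_tokens=50)
            assert isinstance(answer, str)
        pipe.finish_chat()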

github-actions bot added the category: sampling (Sampling / Decoding algorithms) label and removed the category: LLM label on Nov 1, 2024
TolyaTalamanov (Collaborator, author) commented Nov 1, 2024

Any idea why this happened?

RuntimeError: Exception from src\inference\src\cpp\core.cpp:85:
Check 'util::directory_exists(path) || util::file_exists(path)' failed at src\frontends\common\src\frontend.cpp:117:
FrontEnd API failed with GeneralFailure:
ir: Could not open the file: "Qwen2-0.5B-Instruct\openvino_tokenizer.xml"

Wovchena (Collaborator)

I don't have a solution to ir: Could not open the file: "Qwen2-0.5B-Instruct\openvino_tokenizer.xml"
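
The missing openvino_tokenizer.xml is normally produced when the model is exported for OpenVINO. As a hedged sketch, not part of this PR, the tokenizer IR could be regenerated with the openvino_tokenizers package (assuming its convert_tokenizer() API):

    import openvino as ov
    from openvino_tokenizers import convert_tokenizer
    from transformers import AutoTokenizer

    # Convert the HF tokenizer to OpenVINO IR next to the model files.
    hf_tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-0.5B-Instruct")
    ov_tokenizer, ov_detokenizer = convert_tokenizer(hf_tokenizer, with_detokenizer=True)
    ov.save_model(ov_tokenizer, "Qwen2-0.5B-Instruct/openvino_tokenizer.xml")
    ov.save_model(ov_detokenizer, "Qwen2-0.5B-Instruct/openvino_detokenizer.xml")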

questions = [
'1+1=',
'What is the previous answer?',
'Why is the Sun yellow?',
'What was my first question?'
]

-model_path = get_chat_models_lists()[0][1]
+model_path = get_chat_models_list()[0][1]
ilya-lavrenov (Contributor) commented Nov 19, 2024

It's not a model path, it's a model_id. E.g., from the CI error we can see:

model_path = WindowsPath('Qwen2-0.5B-Instruct'), device = 'CPU'

which means the model is not even converted by Optimum.

In other places, it's used like:

    pipe = read_model(get_models_list()[0])[4]

where read_model converts the model and creates a pipe on top of it.

So the question is: have you run the tests locally? Do they even magically pass?
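
Following this suggestion, the fix would presumably convert the model before constructing the pipeline. A sketch, assuming read_model() returns a tuple whose fifth element is the ready pipeline, as in the example quoted above:

    # Sketch of the suggested fix: convert the model via read_model() instead
    # of passing a bare model_id as if it were a local path.
    pipe = read_model(get_chat_models_list()[0])[4]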

ilya-lavrenov added this to the 2025.0 milestone on Nov 19, 2024
ilya-lavrenov removed the category: sampling label on Nov 19, 2024