[llm_load_test] Make more generic for testing on other platforms #672
base: main
Conversation
[APPROVALNOTIFIER] This PR is NOT APPROVED. This pull-request has been approved by:
Needs approval from an approver in each of these files:
Approvers can indicate their approval by writing /approve in a comment. The full list of commands accepted by this bot can be found here.
Force-pushed from 6cb5f64 to 78bae64 (Compare).
…command and disabling the timeout on macOS (the command is not provided by the OS ...)
…onfig.yaml: pass the model_id to the openai_plugin config and preserve the historical behavior (for kserve/vllm)
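For context on the macOS commit: the timeout binary (from GNU coreutils) is not shipped with stock macOS, so a run that unconditionally wraps commands with it fails there. Below is a minimal sketch of one way to guard the call; the invocation and names are hypothetical, not the PR's actual code:

```python
# Sketch only: wrap the benchmark command in `timeout` when the binary exists.
# The load-test invocation below is illustrative, not taken from this PR.
import shutil

def build_command(duration_s: int) -> list[str]:
    cmd = ["python", "load_test.py", "-c", "config.yaml"]  # hypothetical invocation
    if shutil.which("timeout"):  # absent on stock macOS, so skip the wrapper there
        cmd = ["timeout", str(duration_s), *cmd]
    return cmd

print(build_command(300))
```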
Force-pushed from 78bae64 to 5b390a2 (Compare).
Hello @mcharanrm @dagrayvid, could we get this PR tested to confirm that it doesn't break anything in the KServe testing? It would be nice if we could have it merged ASAP.
```diff
@@ -22,7 +22,7 @@ plugin_options:
 {% if llm_load_test_run_plugin == 'openai_plugin' %}
 host: "{{ llm_load_test_run_interface }}://{{ llm_load_test_run_host}}:{{ llm_load_test_run_port}}"
 endpoint: "{{ llm_load_test_run_endpoint }}"
-model_name: "/mnt/models/"
+model_name: "{{ llm_load_test_run_model_id }}"
```
I think you can move this up out of the if statement.
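A rough sketch of that suggestion, assuming the template around the hunk looks as shown above; the alternate plugin name and model id are made up for illustration:

```python
# Sketch: hoisting model_name above the openai_plugin-only block makes it
# available to every plugin, not just openai_plugin.
from jinja2 import Template

hoisted = Template(
    'model_name: "{{ llm_load_test_run_model_id }}"\n'
    "{% if llm_load_test_run_plugin == 'openai_plugin' %}"
    'endpoint: "{{ llm_load_test_run_endpoint }}"\n'
    "{% endif %}"
)

# model_name still renders when a different (hypothetical) plugin is selected
print(hoisted.render(llm_load_test_run_plugin="tgis_grpc_plugin",
                     llm_load_test_run_model_id="flan-t5-small"))
```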
Relocating the llm-load-test visualization to a more logical place.
(I want to use the llm-load-test project outside of the kserve project.)
/cc @mcharanrm