Issues: mlcommons/inference
mixtral: accuracy check output contains np.float64(...) which doesn't suit metric regex
#1763, opened Jul 2, 2024 by viraatc
Running Mixtral is producing the below warning - I guess the evaluation of the accuracy logs is now completed
#1757, opened Jun 26, 2024 by arjunsuresh
Mixtral mlperf log file is having a new line which fails when loaded with the mlperf logger
#1756, opened Jun 26, 2024 by arjunsuresh
Automated command for llama2-70b: Cannot take a larger sample than population
#1755, opened Jun 26, 2024 by philross
Running Automated command for llama2-70b without downloading the model
#1747, opened Jun 25, 2024 by philross
CM running failed when cloning from https://github.com/GATEOverflow/inference_results_v4.0.git
#1746, opened Jun 25, 2024 by Bob123Yang
Get error message "unrecognized arguments: rocm" when running mlperf inference on ubuntu with rocm
#1729, opened Jun 12, 2024 by jerryzhaoc
Terminal disappeared after rnnt inference running without any error (log attached)
#1712, opened May 25, 2024 by Bob123Yang
CM error: no scripts were found with above tags and variations
#1709, opened May 23, 2024 by sunpian1
Are there examples of single-node multi-GPU and multi-node multi-GPU?
#1708, opened May 23, 2024 by llxyw
Submission checker forces to have “gptj-99” and “gpt-99.9” in the code dir
#1675, opened Apr 9, 2024 by szutenberg