Some model scripts, such as https://github.com/onnx/turnkeyml/blob/main/models/transformers/llama2_7b.py, allow the user to pass custom model weights by path. Currently those paths are completely arbitrary, so a user could, for example, accidentally pass LLaMA-2-13b weights to llama2_7b.py, which would give unintended results.
Proposed solution: require some kind of string matching in the model path. For example, all LLaMA-2-7b checkpoints could be required to have "LLaMA-2-7b" or "7b" in the --model_path.
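A minimal sketch of what the proposed check could look like. The helper name and its exact behavior are hypothetical (this is not an existing turnkeyml function); it only assumes the "7b"-substring convention described above:

```python
def validate_model_path(model_path: str, expected_size: str = "7b") -> str:
    """Reject checkpoint paths that don't mention the expected model size.

    Hypothetical helper sketching the proposed string matching: llama2_7b.py
    would call this with expected_size="7b", llama2_13b.py with "13b", etc.
    """
    if expected_size not in model_path.lower():
        raise ValueError(
            f"--model_path {model_path!r} does not contain {expected_size!r}; "
            "refusing to load weights that may belong to a different model size"
        )
    return model_path


# Accepted: path names the expected size
validate_model_path("meta-llama/Llama-2-7b-hf", "7b")

# Rejected: 13b weights passed to a 7b script raise ValueError
```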
Bonus solution: allow --model_path to be generic with respect to the specific model size. For example, "meta-llama/Llama-2-*b-hf" could load the 7b weights in llama2_7b.py, the 13b weights in llama2_13b.py, etc.
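The bonus solution could be sketched as a simple wildcard substitution, where each model script fills in its own size. Again, the function below is an illustrative assumption, not existing turnkeyml code; only the "meta-llama/Llama-2-*b-hf" pattern comes from the issue:

```python
def resolve_generic_path(generic_path: str, size: str) -> str:
    """Expand a size-generic checkpoint path like 'meta-llama/Llama-2-*b-hf'.

    Hypothetical sketch: llama2_7b.py would call this with size="7",
    llama2_13b.py with size="13", and so on, so one --model_path value
    works across all sizes of the model family.
    """
    return generic_path.replace("*b", f"{size}b")
```

With this, passing `--model_path "meta-llama/Llama-2-*b-hf"` would resolve to the 7b checkpoint in llama2_7b.py and the 13b checkpoint in llama2_13b.py.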
cc @danielholanda