Platform (include the target platform as well if cross-compiling):
WSL, Ubuntu 22.04
GitHub Version:
commit 2b899c1 (HEAD -> master, origin/master, origin/HEAD)
If you downloaded the source as a ZIP, provide the download date or, better, the git revision stored in the archive's comment field (run 7z l PATH/TO/ZIP and search for Comment in the output; it looks like Comment = bc80b11110cd440aacdabbf59658d630527a7f2b). If you cloned with git, provide the commit id from the first line of git log.
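The archive comment can also be read without 7z; here is a minimal sketch using Python's standard zipfile module (the archive path is hypothetical):

```python
import zipfile

def zip_git_revision(path):
    """Return the git revision stored in the ZIP archive comment, if any."""
    with zipfile.ZipFile(path) as zf:
        # The comment is stored as raw bytes; MNN source ZIPs carry the
        # commit hash there, e.g. bc80b11110cd440aacdabbf59658d630527a7f2b.
        return zf.comment.decode("utf-8", errors="replace").strip()
```
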
Compiling Method:
cd MNN/pymnn/pip_package
python3 build_deps.py llm
python3 setup.py install --deps llm --prefix=/mnt/d/workspace/MNN/install/python/
Build Log:
Problem:
I followed the documentation below to test the export tool:
https://mnn-docs.readthedocs.io/en/latest/transformers/llm.html
My commands were:
git clone https://www.modelscope.cn/qwen/Qwen2-0.5B-Instruct.git
The model is stored at /mnt/d/workspace/llm-models/Qwen2-0.5B-Instruct.
cd ./transformers/llm/export
python3 llmexport.py --path /mnt/d/workspace/llm-models/Qwen2-0.5B-Instruct/ --export mnn
Error:
💥 Failed load pretrained model
Traceback (most recent call last):
File "/mnt/d/workspace/MNN/transformers/llm/export/llmexport.py", line 2291, in load_pretrained
self.model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype='auto', trust_remote_code=True).eval()
File "/home/fg/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
File "/home/fg/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4010, in from_pretrained
with safe_open(resolved_archive_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/mnt/d/workspace/MNN/transformers/llm/export/llmexport.py", line 44, in wrapper
result = func(*args, **kwargs)
File "/mnt/d/workspace/MNN/transformers/llm/export/llmexport.py", line 2306, in load_model
self.load_pretrained(model_path)
File "/mnt/d/workspace/MNN/transformers/llm/export/llmexport.py", line 2293, in load_pretrained
self.model = AutoModel.from_pretrained(model_path, torch_dtype='auto', trust_remote_code=True).eval()
File "/home/fg/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
return model_class.from_pretrained(
File "/home/fg/.local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4010, in from_pretrained
with safe_open(resolved_archive_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooLarge
Additional note: the WSL Ubuntu instance is allocated 32 GB of RAM.
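For context, a HeaderTooLarge error from safetensors usually means the first 8 bytes of the .safetensors file do not hold a sane header length, for example when the repository was cloned without git-lfs installed and the weight file is only a small pointer stub. A minimal diagnostic sketch (file path hypothetical):

```python
import struct

def looks_like_lfs_pointer(path):
    """Heuristic: a git-lfs pointer is a tiny text file starting with a version line."""
    with open(path, "rb") as f:
        return f.read(64).startswith(b"version https://git-lfs")

def safetensors_header_len(path):
    """Return the JSON-header length declared in the first 8 bytes (little-endian u64)."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<Q", f.read(8))
    return n
```

If looks_like_lfs_pointer returns True, the actual weights were never downloaded; re-fetching them with git lfs pull inside the model directory should fix the export.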