[Bug]: llm merge_lora_params does not save the merged weights after merging #8575
Comments
Hmm, the "Killed" error could still be an environment issue on your side; someone else may have been using the machine at the same time.
I found the problem. With the model_name_or_path field included, the merged weights are not saved correctly and only a JSON file is written; after removing it, the merge works normally. QLoRA shouldn't be affected by this, right? It seems this field is what causes the problem.
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.
Software environment
Duplicate issues
Error description
Steps to reproduce & code
```shell
python merge_lora_params.py \
    --model_name_or_path FlagAlpha/Llama2-Chinese-7b-Chat \
    --lora_path /home/aistudio/data/checkpoints/llama_lora_ckpts/checkpoint-286 \
    --merge_lora_model_path /home/aistudio/data/llama_lora_merge \
    --device "gpu" \
    --low_gpu_mem True
```
The script seems to hang at the loading stage, and after a while the process is killed. (I suspect it ran out of memory, though that seems unlikely on an AI Studio dev machine with 32 GB and a V100.)
It is not a problem with the LoRA weights themselves, since they load and run inference fine in dynamic-graph mode.
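For context, merging LoRA parameters amounts to folding the low-rank update into the base weight so the merged model needs no adapter at inference time. Below is a minimal NumPy sketch of that math; the function name, shapes, and scaling convention are illustrative assumptions, not PaddleNLP's actual `merge_lora_params.py` implementation:

```python
import numpy as np

def merge_lora_weight(W, A, B, alpha, r):
    """Fold a LoRA low-rank update into the base weight.

    W: base weight, shape (out, in)
    A: LoRA down-projection, shape (r, in)
    B: LoRA up-projection, shape (out, r)
    alpha / r: the standard LoRA scaling factor

    Illustrative sketch only, not PaddleNLP's actual code.
    """
    scale = alpha / r
    return W + scale * (B @ A)

# Tiny check: after merging, the base layer alone reproduces the
# output of applying the base weight and the LoRA branch separately.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
A = rng.standard_normal((2, 3))
B = rng.standard_normal((4, 2))
x = rng.standard_normal(3)

merged = merge_lora_weight(W, A, B, alpha=4, r=2)
separate = W @ x + (4 / 2) * (B @ (A @ x))
assert np.allclose(merged @ x, separate)
```

This is also why a correct merge must actually write the updated weight tensors to `merge_lora_model_path`; saving only a JSON config, as reported above, leaves the merged model unusable.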