Issues: huggingface/peft
#2302 · Bug in get_peft_model_state_dict when using VB-LoRA · opened Dec 31, 2024 by KaiyangLi1992
#2301 · How to pass an attention_mask that has one more dimension than input_ids · opened Dec 31, 2024 by Chinesehou97
#2296 · load_adapter error "Target module is not supported" when using Qwen2-VL · opened Dec 24, 2024 by bigmouthbabyguo-530
#2295 · PEFT model doesn't update parameters after the LoRA config is changed · opened Dec 23, 2024 by d-kleine
#2292 · Cannot import name 'EncoderDecoderCache' from 'transformers' · opened Dec 21, 2024 by Huang-jia-xuan
#2289 · Inconsistent Parameter Mismatches After Merging PEFT and Base Models · opened Dec 19, 2024 by enhulu-ms
#2283 · TypeError when running inference with different LoRA adapters in the same batch · opened Dec 15, 2024 by yuxiang-guo
#2281 · Incompatibility of X-LoRA and MistralForSequenceClassification · opened Dec 13, 2024 by cyx96
#2270 · Different results when predicting with multiple LoRA adapters in a loop vs. using only one LoRA · opened Dec 10, 2024 by beyondguo
#2266 · Can't use prompt tuning on multiple GPUs with DeepSpeed and Qwen2.5-14B-Instruct · opened Dec 9, 2024 by dongshou
#2264 · Guidance needed on two-stage fine-tuning with LoRA (SFT and DPO) for model adaptation · opened Dec 6, 2024 by none0663
#2262 · Could you provide example code for AdaLoRA fine-tuning of a decoder-only model? · opened Dec 5, 2024 by SpeeeedLee
#2260 · Is it possible to support Transformer Engine when using LoRA in Megatron? · opened Dec 5, 2024 by liulong11
#2252 · Adapter name conflict with tuner prefix leads to unclear warning during model loading · opened Dec 3, 2024 by pzdkn
#2241 · Request to add a LoRA implementation for Conv1d rather than transformers.utils.Conv1d [contributions-welcome] · opened Nov 28, 2024 by HelloWorldLTY
#2208 · TypeError: LoraConfig.__init__() got an unexpected keyword argument 'exclude_modules' · opened Nov 9, 2024 by imrankh46
#2206 · Incorrect modules_to_save overlap across multiple LoRA adapters · opened Nov 8, 2024 by saeid93
#2200 · RuntimeError: element 0 of tensors.. OpenCLIP model · opened Nov 5, 2024 by EngEmmanuel