Low-Rank Adaptation (LoRA) implementation

This is an implementation of Low-Rank Adaptation (LoRA), a parameter-efficient fine-tuning (PEFT) method.

Paper references:

- LoRA
- Effectiveness of Language Model Fine-Tuning
- Intrinsic Dimensionality Paper
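
The core idea of LoRA is to freeze the pretrained weight W0 and learn only a low-rank update ΔW = B·A, with rank r much smaller than the layer dimensions, so the effective weight becomes W0 + (α/r)·B·A. Below is a minimal PyTorch sketch of that idea; the class name `LoRALinear` and the default values of `r` and `alpha` are illustrative assumptions and are not necessarily the code or hyperparameters used in this repository.

```python
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Illustrative LoRA wrapper around a frozen linear projection."""

    def __init__(self, in_features: int, out_features: int, r: int = 8, alpha: float = 16.0):
        super().__init__()
        # Frozen pretrained projection W0: its weights receive no gradient updates.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)

        # Trainable low-rank factors: A (r x in) starts small and random, B (out x r) starts
        # at zero, so the adapter is a no-op at initialization and only the update B @ A is learned.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = x W0^T + scaling * x A^T B^T, i.e. x (W0 + scaling * B A)^T
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


if __name__ == "__main__":
    layer = LoRALinear(in_features=768, out_features=768, r=8)
    x = torch.randn(4, 768)
    print(layer(x).shape)  # torch.Size([4, 768])

    # Only the low-rank factors are trainable: 2 * r * 768 = 12288 parameters
    # versus 768 * 768 = 589824 frozen parameters in the base projection.
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    print(trainable)
```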
