LoRA

Oct 24, 2024·
Jiyuan (Jay) Liu
· 1 min read

LoRA From Scratch: Finetuning Large Models for Cheaper (GitHub)

The idea of LoRA is to replace a big 2D matrix with the product of two small 2D matrices. It is used to fine-tune already-trained large models, so the big matrix being replaced is the delta matrix representing the difference between the original and the fine-tuned weights: instead of learning a full update ΔW of shape d×k, LoRA learns B (d×r) and A (r×k) with a small rank r, so that ΔW ≈ BA and the effective weight is W + BA with far fewer trainable parameters.
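
Below is a minimal sketch of this idea in PyTorch. The class name `LoRALinear` and the hyperparameters `r` and `alpha` are illustrative choices, not part of any particular library; the point is simply that the pretrained weight is frozen and only the two small matrices are trained.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer and adds a trainable low-rank update.

    Effective weight: W + (alpha / r) * (B @ A), where
    A is (r x in_features) and B is (out_features x r), so B @ A has
    the same shape as W but only r * (in_features + out_features)
    trainable parameters.
    """
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights

        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # B = 0 so training starts from ΔW = 0
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Original output plus the low-rank delta: x @ (B A)^T, scaled.
        return self.base(x) + self.scaling * (x @ self.A.T @ self.B.T)


# Usage: wrap a pretrained layer, then train only A and B.
layer = nn.Linear(768, 768)
lora_layer = LoRALinear(layer, r=8, alpha=16.0)
out = lora_layer(torch.randn(2, 768))
print(out.shape)  # torch.Size([2, 768])
```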