Code: https://github.com/priyammaz/PyTorch-Adventures/tree/main/PyTorch%20Tools/LoRA
LoRA has become a crucial mechanism for finetuning extremely large models. If you want to use LoRA in practice, look no further than PEFT from Hugging Face 🤗, but today we will build a minimal version that lets us apply LoRA to any model!
We will specifically build LoRA wrappers for Linear, Convolutional, and Embedding layers, which should give a good idea of how LoRA works! We will then apply them to finetune a RoBERTa model and a ViT model on classification tasks!
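The core idea is small: freeze the pretrained weight and learn a low-rank update ΔW = B·A on top of it. A minimal sketch of the Linear case (names like `LoRALinear`, `rank`, and `alpha` are illustrative, not the exact code from the video):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Minimal LoRA wrapper around a frozen nn.Linear (illustrative sketch)."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights

        # Low-rank factors: A projects down to `rank`, B projects back up.
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        # B starts at zero, so at initialization the wrapper matches the base layer.
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # Frozen path plus the scaled low-rank update
        return self.base(x) + (x @ self.lora_A.T @ self.lora_B.T) * self.scale
```

Only `lora_A` and `lora_B` receive gradients, which is where the parameter savings come from; the Conv2d and Embedding versions follow the same pattern with layer-appropriate factor shapes.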
Timestamps:
00:00:00 Introduction
00:01:02 What is LoRA?
00:18:40 LoRA Base Layer
00:24:30 LoRALinear
00:45:00 LoRAEmbedding
00:59:30 LoRAConv2d
01:24:00 Converting a Model to a LoRA Model
02:01:40 Saving w/ Weight Merging
02:14:50 Testing LoRA
02:17:12 Where's the Memory Savings??
02:18:40 Gradient Checkpointing
02:25:10 Results
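The "Saving w/ Weight Merging" step above relies on the fact that the low-rank update can be folded back into the frozen weight, so inference needs no extra matmuls. A sketch, assuming the standard LoRA update ΔW = scale · B·A (the helper name is hypothetical):

```python
import torch
import torch.nn as nn

@torch.no_grad()
def merge_lora(base: nn.Linear, lora_A: torch.Tensor,
               lora_B: torch.Tensor, scale: float) -> nn.Linear:
    # Fold the low-rank update into the base weight in place:
    # W <- W + scale * B @ A, so base(x) now includes the LoRA contribution.
    base.weight += scale * (lora_B @ lora_A)
    return base
```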
Socials!
X https://twitter.com/data_adventurer
Instagram https://www.instagram.com/nixielights/
Linkedin https://www.linkedin.com/in/priyammaz/
🚀 Github: https://github.com/priyammaz
🌐 Website: https://www.priyammazumdar.com/