
Relative Position Bias (+ PyTorch Implementation)

Soroush Mehraban · 4,944 views · 2 years ago

In this video, I explain why position embeddings are required in vision transformers, what the limitations of absolute position embeddings are, and how relative position bias improves on them.

Table of Contents:
00:00 Permutation Equivariance
01:12 Absolute Position Embedding
02:42 Limitation of absolute positions
03:56 Relative Position Bias intuition
07:57 Relative Position Bias in theory
12:53 PyTorch Implementation
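As a companion to the chapters above, here is a minimal sketch of relative position bias in PyTorch, in the style popularized by the Swin Transformer: a learnable table holds one bias per attention head for every possible relative offset inside a window, and a precomputed index maps each pair of positions to its table entry. This is an illustrative sketch under those assumptions, not necessarily the exact code shown in the video.

```python
import torch
import torch.nn as nn


class RelativePositionBias(nn.Module):
    """Learnable relative position bias for a window of size (H, W)."""

    def __init__(self, window_size, num_heads):
        super().__init__()
        H, W = window_size
        # One learnable bias per head for each possible relative offset:
        # row offsets span [-(H-1), H-1], column offsets span [-(W-1), W-1].
        self.bias_table = nn.Parameter(
            torch.zeros((2 * H - 1) * (2 * W - 1), num_heads))

        # Precompute, for every pair of window positions, the index into
        # the bias table that corresponds to their relative offset.
        coords = torch.stack(torch.meshgrid(
            torch.arange(H), torch.arange(W), indexing="ij"))  # (2, H, W)
        coords = coords.flatten(1)                             # (2, N) with N = H*W
        rel = coords[:, :, None] - coords[:, None, :]          # (2, N, N)
        rel = rel.permute(1, 2, 0).contiguous()                # (N, N, 2)
        rel[:, :, 0] += H - 1          # shift row offsets to start at 0
        rel[:, :, 1] += W - 1          # shift column offsets to start at 0
        rel[:, :, 0] *= 2 * W - 1      # flatten (row, col) offset into one index
        self.register_buffer("index", rel.sum(-1))             # (N, N)

    def forward(self):
        # Look up the bias for every position pair and reshape to
        # (num_heads, N, N), ready to be added to the attention logits.
        N = self.index.shape[0]
        bias = self.bias_table[self.index.view(-1)]            # (N*N, num_heads)
        return bias.view(N, N, -1).permute(2, 0, 1).contiguous()
```

Because the bias depends only on relative offsets, the same learned parameters apply regardless of where a patch sits in the window, which is what lets it generalize better than absolute position embeddings. For a 7×7 window with 4 heads, `RelativePositionBias((7, 7), 4)()` returns a `(4, 49, 49)` tensor to add to the pre-softmax attention scores.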

Icon made by Freepik from flaticon.com
