Graph Attention Network (GAT) from scratch. Forward pass using PyTorch. Part 01. GCN to 1-head GAT.

ajegorovs

This is part 1 of a two-part series on Graph Attention Networks. (Suggestion: I talk slowly, so consider watching at 1.5x speed.) In this video we discuss:
1) General ideas about graphs and why they are difficult to use in neural networks.
2) The meaning of the convolution operation for images and for graphs.
3) Small linear algebra 'hacks' that help interpret the mathematical operations in GNNs.
4) The main ideas of Graph Convolutional Networks (GCN), and how to build an aggregation function from the adjacency matrix.
5) How to improve the GCN by introducing an attention mechanism (see the sketch below).

Links:
OG paper: https://arxiv.org/pdf/1710.10903.pdf
Notebooks: github.com/ajegorovs/aj_python_tool_lib/tree/main/data_processing/neural_networks/Graph_neural_networks
PyTorch implementation by @labmlai: nn.labml.ai/graphs/gat/index.html

The latter was used as the base case. It is probably not the best approach, since we use an adjacency matrix, which is very sparse; it could be improved by switching to sparse matrices and using torch.sparse.mm() for the matrix multiplication.

Part 02: https://www.youtube.com/watch?v=QiyB4T-rr_0
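Here is a minimal sketch of the ideas above, not the notebook code itself: the toy graph, shapes, and variable names (N, H, W, a) are illustrative assumptions. It contrasts GCN aggregation (fixed 1/degree weights taken from the adjacency matrix) with 1-head GAT aggregation (learned attention weights), and ends with the sparse torch.sparse.mm() variant suggested above.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

N, F_in, F_out = 4, 3, 2                 # nodes, input features, output features (toy sizes)
A = torch.tensor([[1., 1., 0., 0.],      # adjacency matrix with self-loops (A + I)
                  [1., 1., 1., 0.],
                  [0., 1., 1., 1.],
                  [0., 0., 1., 1.]])
H = torch.randn(N, F_in)                 # node feature matrix
W = torch.randn(F_in, F_out)             # shared linear transform

# --- GCN-style aggregation: fixed weights derived from the adjacency matrix ---
deg = A.sum(dim=1, keepdim=True)         # node degrees
H_gcn = (A / deg) @ (H @ W)              # mean over each node's neighborhood

# --- 1-head GAT: learned attention scores replace the fixed 1/deg weights -----
a = torch.randn(2 * F_out, 1)            # attention vector from the GAT paper
G = H @ W                                # transformed features, shape (N, F_out)
# e_ij = LeakyReLU(a^T [g_i || g_j]) = LeakyReLU(a1^T g_i + a2^T g_j),
# computed for all pairs at once via broadcasting
e = F.leaky_relu(G @ a[:F_out] + (G @ a[F_out:]).T, negative_slope=0.2)
e = e.masked_fill(A == 0, float('-inf')) # attend only over actual edges
alpha = torch.softmax(e, dim=1)          # attention coefficients, rows sum to 1
H_gat = alpha @ G                        # attention-weighted aggregation

# --- sparse variant of the GCN step, as suggested in the description ----------
A_sp = (A / deg).to_sparse()
H_sparse = torch.sparse.mm(A_sp, H @ W)  # same result via sparse matmul
assert torch.allclose(H_gcn, H_sparse, atol=1e-6)
```

The point of the contrast: both models aggregate neighbor features with a weighted sum, but the GCN weights are fixed by the graph structure, while GAT learns feature-dependent weights via a softmax over each node's neighbors.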
