Table of Contents
0:00 Recap
0:40 A simple example (more in the practicals)
3:44 PyTorch tensors: the requires_grad field
6:44 The PyTorch backward function
9:05 The chain rule on our example
16:00 Linear regression
18:00 Gradient descent with NumPy...
27:30 ... with PyTorch tensors
31:30 Using autograd
34:35 Using a neural network (linear layer)
39:50 Using a PyTorch optimizer
44:00 Algorithm: how automatic differentiation works
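
The sketches below mirror the main steps of the video; all data, values, and hyperparameters in them are illustrative assumptions, not taken from the video or the notebook. First, a minimal example of the `requires_grad` field and the `backward` function (3:44 and 6:44), with the chain rule worked out in the comments (9:05); the function z = 3x² + 1 is an arbitrary choice:

```python
import torch

# A tensor created with requires_grad=True is tracked by autograd.
x = torch.tensor(2.0, requires_grad=True)

# Build a small computational graph: y = x**2, then z = 3*y + 1.
y = x ** 2
z = 3 * y + 1

# backward() applies the chain rule from z back to x:
# dz/dx = dz/dy * dy/dx = 3 * 2x = 12 at x = 2.
z.backward()
print(x.grad)  # tensor(12.)
```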
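
For the gradient descent with NumPy segment (18:00), a sketch of linear regression where the gradients of the mean squared error are derived by hand; the synthetic data y ≈ 2x + 1 and the learning rate are assumptions for this sketch:

```python
import numpy as np

# Synthetic data (an assumption for this sketch): y = 2*x + 1 plus noise.
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 1))
y = 2 * x + 1 + 0.1 * rng.normal(size=(100, 1))

w, b = 0.0, 0.0   # parameters of the model y_hat = w*x + b
lr = 0.1          # learning rate

for _ in range(200):
    y_hat = w * x + b                     # forward pass
    # Gradients of the MSE loss, derived by hand (no autograd here).
    grad_w = 2 * ((y_hat - y) * x).mean()
    grad_b = 2 * (y_hat - y).mean()
    w -= lr * grad_w                      # gradient descent step
    b -= lr * grad_b

print(w, b)  # should approach 2 and 1
```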
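
The same loop with PyTorch tensors and autograd (27:30 and 31:30): `backward` replaces the hand-derived gradients, and the update is done outside the graph:

```python
import torch

x = torch.randn(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1

for _ in range(200):
    loss = ((w * x + b - y) ** 2).mean()  # mean squared error
    loss.backward()                       # autograd fills w.grad and b.grad
    with torch.no_grad():                 # update without tracking gradients
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()                    # gradients accumulate, so reset them
        b.grad.zero_()
```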
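
And with a linear layer and a PyTorch optimizer (34:35 and 39:50), which together take over the parameters, the update, and the gradient reset:

```python
import torch

x = torch.randn(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)

model = torch.nn.Linear(1, 1)   # holds the weight and the bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = torch.nn.MSELoss()

for _ in range(200):
    optimizer.zero_grad()           # reset accumulated gradients
    loss = criterion(model(x), y)   # forward pass and loss
    loss.backward()                 # backward pass
    optimizer.step()                # parameter update
```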
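
Finally, for the algorithm segment (44:00), a toy reverse-mode automatic differentiation in a few lines; this is a from-scratch sketch of the idea, not PyTorch's actual implementation:

```python
class Var:
    """A scalar value that records how it was computed."""
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents  # pairs (parent Var, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value,
                   parents=((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=((self, other.value), (other, self.value)))

    def backward(self, grad=1.0):
        # Chain rule: accumulate the upstream gradient times the local one.
        self.grad += grad
        for parent, local in self.parents:
            parent.backward(grad * local)

x = Var(2.0)
z = x * x * Var(3.0) + Var(1.0)  # z = 3*x**2 + 1, as in the first sketch
z.backward()
print(x.grad)  # 12.0
```

A real implementation traverses the graph once in reverse topological order rather than recursing along every path as this toy does.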
Notebook used in the video for the linear regression; you can also open it in Colab.
Bonus: JAX implementation of the linear regression notebook (in Colab).
Backprop slide (used for the practicals below).
Practicals in Colab: Coding backprop.
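
For the JAX bonus, a sketch of how the same regression typically looks with `jax.grad`; this mirrors the PyTorch sketches above and is an assumption about the flavor of the notebook, not its contents:

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (100, 1))
y = 2 * x + 1   # synthetic data, as in the sketches above

def loss_fn(params, x, y):
    w, b = params
    return jnp.mean((w * x + b - y) ** 2)

grad_fn = jax.grad(loss_fn)  # gradient with respect to params

params = (0.0, 0.0)
lr = 0.1
for _ in range(200):
    grads = grad_fn(params, x, y)
    params = tuple(p - lr * g for p, g in zip(params, grads))

print(params)  # should approach (2.0, 1.0)
```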