Module 2b - Automatic differentiation

Table of Contents

Automatic differentiation


0:00 Recap
0:40 A simple example (more in the practicals)
3:44 PyTorch tensor: requires_grad field
6:44 PyTorch backward function
9:05 The chain rule on our example
16:00 Linear regression
18:00 Gradient descent with numpy...
27:30 ... with PyTorch tensors
31:30 Using autograd
34:35 Using a neural network (linear layer)
39:50 Using a PyTorch optimizer
44:00 Algorithm: how automatic differentiation works
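As a minimal sketch of the requires_grad / backward steps listed above (the function and the values are illustrative assumptions, not necessarily the ones used in the video):

```python
import torch

# Leaf tensors: requires_grad=True tells autograd to track them.
x = torch.tensor(2.0, requires_grad=True)
y = torch.tensor(3.0, requires_grad=True)

# A small computation graph: z = x*y + x^2
z = x * y + x ** 2

# backward() applies the chain rule through the graph and
# accumulates the result in the .grad field of the leaf tensors.
z.backward()

print(x.grad)  # dz/dx = y + 2x = 7
print(y.grad)  # dz/dy = x = 2
```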
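For the linear regression / gradient descent part, a sketch with numpy only, where the gradients of the mean squared error are written by hand (the synthetic data and hyperparameters below are arbitrary assumptions):

```python
import numpy as np

# Synthetic data for y ≈ 2x + 1.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=100)

w, b, lr = 0.0, 0.0, 0.1
for _ in range(200):
    y_hat = w * x + b
    # Gradients of mean((y_hat - y)^2), derived by hand.
    grad_w = 2.0 * np.mean((y_hat - y) * x)
    grad_b = 2.0 * np.mean(y_hat - y)
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should be close to 2 and 1
```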
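The same loop with PyTorch tensors and autograd: the gradients are no longer derived by hand, they are computed by backward() (again a sketch with assumed synthetic data and hyperparameters):

```python
import torch

torch.manual_seed(0)
x = torch.randn(100)
y = 2.0 * x + 1.0 + 0.1 * torch.randn(100)

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.1
for _ in range(200):
    loss = ((w * x + b - y) ** 2).mean()
    loss.backward()               # autograd fills w.grad and b.grad
    with torch.no_grad():         # update the parameters outside the graph
        w -= lr * w.grad
        b -= lr * b.grad
    w.grad.zero_()                # .grad accumulates, so reset it each step
    b.grad.zero_()

print(w.item(), b.item())
```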
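And the same regression using a linear layer and an optimizer, which bundle the parameters, the gradient bookkeeping and the update rule (nn.Linear, MSELoss and SGD are standard PyTorch components; the data and hyperparameters are still assumptions):

```python
import torch

torch.manual_seed(0)
x = torch.randn(100, 1)
y = 2.0 * x + 1.0 + 0.1 * torch.randn(100, 1)

model = torch.nn.Linear(1, 1)     # holds the weight and bias as parameters
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for _ in range(200):
    optimizer.zero_grad()         # reset accumulated gradients
    loss = criterion(model(x), y)
    loss.backward()               # compute gradients of the loss
    optimizer.step()              # apply the SGD update

print(model.weight.item(), model.bias.item())
```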
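Finally, for the last item (how automatic differentiation works), a toy reverse-mode sketch for scalars with only + and *. This is not PyTorch's implementation, just an illustration of the idea: each node records how to push gradients back to its parents, and backward() replays these rules in reverse topological order.

```python
# Toy reverse-mode autodiff for scalars: each node stores its value,
# its parents, and a closure applying the chain rule to its parents.
class Value:
    def __init__(self, data, parents=(), backward_fn=lambda: None):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward_fn = backward_fn

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward_fn():
            self.grad += out.grad            # d(out)/d(self) = 1
            other.grad += out.grad           # d(out)/d(other) = 1
        out._backward_fn = backward_fn
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward_fn():
            self.grad += other.data * out.grad   # d(out)/d(self) = other
            other.grad += self.data * out.grad   # d(out)/d(other) = self
        out._backward_fn = backward_fn
        return out

    def backward(self):
        # Topological order of the graph, then chain rule from the output back.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward_fn()

x, y = Value(2.0), Value(3.0)
z = x * y + x * x
z.backward()
print(x.grad, y.grad)   # 7.0 and 2.0, as with torch above
```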

Slides and Notebook

Quiz

To check your understanding of automatic differentiation, you can take the quizzes.

Practicals

Challenge

Adapt your code to solve the following challenge:

Some small modifications:

Bonus: