Code and notebooks for the deep learning course dataflowr. Here is the schedule followed at École Polytechnique in 2023:
- Module 1 - Introduction & General Overview: slides + notebook Dogs and Cats with VGG + practicals (more dogs and cats)
- Module 2a - PyTorch tensors
- Module 2b - Automatic differentiation + Practicals
- MLP from scratch (start of Homework 1)
- Another look at autodiff with dual numbers and Julia (see the dual-number sketch after this list)
- Module 3 - Loss function for classification
- Module 4 - Optimization for deep learning
- Module 5 - Stacking layers and overfitting an MLP on CIFAR10: Stacking_layers_MLP_CIFAR10.ipynb
- Module 6 - Convolutional neural network
- How to regularize with dropout and estimate uncertainty with MC Dropout: Module 15 - Dropout
- Module 7 - Dataloading
- Module 8a - Embedding layers
- Module 8b - Collaborative filtering and build your own recommender system: 08_collaborative_filtering_empty.ipynb (on a larger dataset: 08_collaborative_filtering_1M.ipynb)
- Module 8c - Word2vec and build your own word embeddings: 08_Word2vec_pytorch_empty.ipynb
- Module 16 - Batchnorm: check your understanding with 16_simple_batchnorm_eval.ipynb and go further with 16_batchnorm_simple.ipynb
- Module 17 - ResNets and turn your classifier into an out-of-distribution detector with ODIN_mobilenet_empty.ipynb
- Start of Homework 2: Class Activation Map and adversarial examples
- Module 9a - Autoencoders and code your noisy autoencoder: 09_AE_NoisyAE.ipynb
- Module 10 - Generative Adversarial Networks and code your GAN, Conditional GAN and InfoGAN: 10_GAN_double_moon.ipynb
- Module 13 - Siamese Networks and Representation Learning
- Start of Homework 3: VAE for MNIST clustering and generation
- Module 12 - Attention and Transformers
- Correcting the PyTorch tutorial on attention in seq2seq: 12_seq2seq_attention.ipynb
- Build your own microGPT: GPT_hist.ipynb (a minimal attention sketch follows this list)
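To make the last item concrete, here is a minimal sketch of the causal scaled dot-product attention at the heart of a microGPT. It is illustrative only: the function and weight names are ours, not the notebook's, and a real GPT adds multiple heads, learned projections, and stacked blocks.

```python
import math
import torch
import torch.nn.functional as F

# Single-head causal self-attention (a sketch, not GPT_hist.ipynb's code).
# x has shape (batch, seq_len, d_model); one head keeps the shapes simple.
def causal_self_attention(x, w_q, w_k, w_v):
    q, k, v = x @ w_q, x @ w_k, x @ w_v              # (B, T, d) each
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d)  # (B, T, T)
    T = x.size(1)
    # Causal mask: position t may only attend to positions <= t.
    mask = torch.tril(torch.ones(T, T, dtype=torch.bool))
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v             # (B, T, d)

B, T, d = 2, 5, 16
x = torch.randn(B, T, d)
w_q, w_k, w_v = (torch.randn(d, d) / math.sqrt(d) for _ in range(3))
print(causal_self_attention(x, w_q, w_k, w_v).shape)  # torch.Size([2, 5, 16])
```

And, as referenced from the autodiff items above, a tiny forward-mode autodiff sketch with dual numbers. The course does this in Julia; this Python version, with our own class names, shows the same idea: carry a derivative alongside each value and apply the product rule.

```python
# Dual numbers: a + b*eps with eps^2 = 0; the eps part tracks the derivative.
class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + a'eps)(b + b'eps) = ab + (a'b + ab')eps
        return Dual(self.val * other.val,
                    self.eps * other.val + self.val * other.eps)

    __rmul__ = __mul__

def derivative(f, x):
    # Seeding the dual part with 1 returns df/dx alongside f(x).
    return f(Dual(x, 1.0)).eps

print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```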
:sunflower: Session 8
- Module 9b - UNets
- Module 9c - Flows
- Build your own Real NVP: Normalizing_flows_empty.ipynb (see the coupling-layer sketch below)
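A quick illustration of the core Real NVP idea before moving on: an affine coupling layer transforms half of the coordinates conditioned on the other half, so the Jacobian is triangular and its log-determinant is just the sum of the predicted log-scales. This is a sketch under our own naming, not the notebook's implementation.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One Real NVP-style affine coupling layer (illustrative sketch)."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        # A small net predicts log-scale and shift from the frozen half.
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)               # keep scales well behaved
        y2 = x2 * torch.exp(log_s) + t
        log_det = log_s.sum(dim=1)              # log|det J| of the transform
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y[:, :self.half], y[:, self.half:]
        log_s, t = self.net(y1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (y2 - t) * torch.exp(-log_s)
        return torch.cat([y1, x2], dim=1)
```

Stacking several coupling layers while alternating which half is frozen gives an expressive yet exactly invertible flow.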
:sunflower: Session 9
- Module 18a - Denoising Diffusion Probabilistic Models
- Train your own DDPM on MNIST: ddpm_nano_empty.ipynb (see the noising sketch below)
- Fine-tuning on CIFAR10: ddpm_micro_sol.ipynb
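The identity both DDPM notebooks build on is that the forward (noising) process has a closed form, so training reduces to sampling a timestep and asking the network to predict the injected noise. A hedged sketch with a linear schedule as in the original DDPM paper; the `model(x_t, t)` signature is an assumption, not the notebooks' exact API:

```python
import torch
import torch.nn.functional as F

T = 1000
betas = torch.linspace(1e-4, 0.02, T)           # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)  # cumulative product of alphas

def q_sample(x0, t, noise):
    # Closed form: x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps
    ab = alphas_bar[t].view(-1, 1, 1, 1)        # broadcast over (B, C, H, W)
    return ab.sqrt() * x0 + (1 - ab).sqrt() * noise

def ddpm_loss(model, x0):
    t = torch.randint(0, T, (x0.size(0),))      # random timestep per image
    noise = torch.randn_like(x0)
    x_t = q_sample(x0, t, noise)
    # The network is trained to recover the noise that was injected.
    return F.mse_loss(model(x_t, t), noise)
```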
To run the notebooks locally, follow the instructions in Module 0 - Running the notebooks locally.
Archives are available on the archive-2020 branch.