Code for deep learning courses
Code and notebooks for the deep learning course dataflowr. Here is the schedule followed at École Polytechnique in 2023:
- Module 1 - Introduction & General Overview: Slides + notebook Dogs and Cats with VGG + Practicals (more dogs and cats)
- you do not need to understand everything to run a deep learning model! But the main goal of this course is to come back to each step taken today and understand it...
- to use the DataLoader from PyTorch, you need to follow its API (i.e. for classification, store your dataset in folders, one per class); see the sketch after this list
- taking a pretrained model and adapting it to a similar task is easy.
- if you do not understand why we take this loss, that's fine, we'll cover that in Module 3.
- even with a GPU, avoid unnecessary computations!
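
As a taste of what the practicals cover, here is a minimal sketch combining the points above (not the course notebook itself; the `data/train/<class>/` layout and the 2-class head are illustrative):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# ImageFolder follows the API: one sub-folder per class under data/train/
# (the path is illustrative).
transform = transforms.Compose([
    transforms.Resize((224, 224)),   # VGG expects 224x224 inputs
    transforms.ToTensor(),
])
dataset = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Pretrained model, adapted to a similar task: freeze the features and
# replace the last classifier layer with a fresh 2-class head.
model = models.vgg16(weights="IMAGENET1K_V1")
for p in model.parameters():
    p.requires_grad = False
model.classifier[6] = nn.Linear(4096, 2)

# Avoid unnecessary computations: no gradient tracking at inference time.
model.eval()
with torch.no_grad():
    images, labels = next(iter(loader))
    logits = model(images)
```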
- Module 2a - PyTorch tensors
- Module 2b - Automatic differentiation + Practicals
- MLP from scratch: start of HW1
- another look at autodiff with dual numbers and Julia
- PyTorch tensors = NumPy on GPU + gradients!
- in deep learning, broadcasting is used everywhere. The rules are the same as for NumPy.
- automatic differentiation is not only the chain rule! Backpropagation (or dual numbers) is a clever algorithm to implement it efficiently; see the sketch after this list
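
The sketch announced above, showing both views: reverse-mode autodiff with `backward()` on a broadcasted expression, and forward-mode autodiff with a hand-rolled dual-number class (written in Python here, whereas the course notebook uses Julia for this part):

```python
import torch

# PyTorch tensors behave like NumPy arrays, with GPU support and gradients.
x = torch.randn(32, 10, requires_grad=True)  # a batch of 32 samples
w = torch.randn(10)                          # broadcast over the batch dim
b = torch.tensor(0.5)
y = (x * w + b).sum()                        # NumPy-style broadcasting rules

y.backward()                                 # reverse-mode autodiff
print(torch.allclose(x.grad, w.expand(32, 10)))  # dy/dx = w, repeated per row

# Forward-mode autodiff with dual numbers: carry (value, derivative) pairs.
class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps
    def __mul__(self, other):
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    def __add__(self, other):
        return Dual(self.val + other.val, self.eps + other.eps)

t = Dual(3.0, 1.0)      # seed the derivative of the input with 1
f = t * t + t           # f(t) = t^2 + t, so f'(3) = 2*3 + 1 = 7
print(f.val, f.eps)     # 12.0 7.0
```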
- Module 3 - Loss function for classification
- Module 4 - Optimization for deep learning
- Module 5 - Stacking layers and overfitting an MLP on CIFAR10: Stacking_layers_MLP_CIFAR10.ipynb
- Module 6: Convolutional neural network
- how to regularize with dropout and estimate uncertainty with MC Dropout: Module 15 - Dropout (an MC Dropout sketch follows this list)
- Loss vs Accuracy. Know your loss for a classification task!
- know your optimizer (Module 4)
- know how to build a neural net with torch.nn.Module (Module 5)
- know how to use convolution and pooling layers (kernel, stride, padding); a short sketch follows this list
- know how to use dropout
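
To make the checklist concrete, a minimal sketch (layer sizes are illustrative): a tiny CNN with one convolution, one pooling layer and dropout, evaluated both with cross-entropy on the logits and with accuracy, which are two different things:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# A tiny CNN illustrating kernel / stride / padding, pooling and dropout.
class SmallCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=1)  # keeps 32x32
        self.pool = nn.MaxPool2d(kernel_size=2)                           # 32x32 -> 16x16
        self.drop = nn.Dropout(p=0.5)
        self.fc = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv(x)))
        x = self.drop(x.flatten(1))
        return self.fc(x)            # raw logits, no softmax here

model = SmallCNN()
x = torch.randn(8, 3, 32, 32)        # a fake CIFAR10 batch
targets = torch.randint(0, 10, (8,))
logits = model(x)

# Loss vs accuracy: cross_entropy takes raw logits and integer targets.
loss = F.cross_entropy(logits, targets)
accuracy = (logits.argmax(dim=1) == targets).float().mean()
```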
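
And the MC Dropout sketch announced above, reusing the illustrative `SmallCNN` and batch `x`: keep only the dropout layers in train mode at test time and average several stochastic forward passes to estimate uncertainty:

```python
# MC Dropout (Module 15): dropout stays active at test time.
model.eval()
for m in model.modules():
    if isinstance(m, torch.nn.Dropout):
        m.train()                    # re-enable dropout only

with torch.no_grad():
    probs = torch.stack([model(x).softmax(dim=1) for _ in range(20)])
mean_pred = probs.mean(dim=0)        # averaged prediction
uncertainty = probs.std(dim=0)       # spread across passes = uncertainty
```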
- Module 7 - Dataloading
- Module 8a - Embedding layers
- Module 8b - Collaborative filtering and build your own recommender system: 08_collaborative_filtering_empty.ipynb (on a larger dataset 08_collaborative_filtering_1M.ipynb)
- Module 8c - Word2vec and build your own word embedding 08_Word2vec_pytorch_empty.ipynb
- Module 16 - Batchnorm: check your understanding with 16_simple_batchnorm_eval.ipynb and go further with 16_batchnorm_simple.ipynb
- Module 17 - Resnets
- start of Homework 2: Class Activation Map and adversarial examples
- know how to use dataloaders
- to deal with categorical variables in deep learning, use embeddings; a recommender sketch follows this list
- in the case of word embeddings, starting from an unsupervised setting, we built a supervised task (i.e. predicting central / context words in a window) and learned the representations thanks to negative sampling
- know your batchnorm: it behaves differently in train and eval mode
- architectures with skip connections allow deeper models; see the residual block sketch below
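
A sketch of embeddings for categorical variables, in the spirit of the Module 8b recommender (names and sizes are illustrative, not the notebook's code): user and item ids are embedded, and a rating is predicted by a dot product:

```python
import torch
import torch.nn as nn

# Matrix factorization: each categorical id gets a learned dense vector.
class MatrixFactorization(nn.Module):
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def forward(self, user_ids, item_ids):
        u = self.user_emb(user_ids)   # (batch, dim)
        v = self.item_emb(item_ids)   # (batch, dim)
        return (u * v).sum(dim=1)     # one predicted rating per (user, item)

model = MatrixFactorization(n_users=1000, n_items=2000)
users = torch.tensor([0, 5, 42])
items = torch.tensor([10, 7, 99])
scores = model(users, items)
```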
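
And a minimal residual block with batchnorm (an illustration, not the Module 17 notebook): the skip connection gives gradients an identity path, which is what allows deeper models, and batchnorm makes the train / eval distinction matter:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)       # the skip connection

block = ResidualBlock(16)
x = torch.randn(4, 16, 8, 8)

# Know your batchnorm: different behavior in train and eval mode.
block.train()                        # normalizes with batch statistics
y_train = block(x)
block.eval()                         # normalizes with running statistics
y_eval = block(x)
```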
- Module 9a: Autoencoders and code your noisy autoencoder 09_AE_NoisyAE.ipynb
- Module 10: Generative Adversarial Networks and code your GAN, Conditional GAN and InfoGAN 10_GAN_double_moon.ipynb; a minimal GAN training step is sketched after this list
- Module 13: Siamese Networks and Representation Learning
- start of Homework 3: VAE for MNIST clustering and generation
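
As announced above, a minimal GAN training step on 2D points in the spirit of the double-moon notebook (the tiny MLPs and the data batch are placeholders, not the notebook's architecture):

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))  # generator
D = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))  # discriminator
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(64, 2)            # stand-in for a double-moon batch
z = torch.randn(64, 2)
fake = G(z)

# Discriminator step: real -> 1, fake -> 0 (detach so G is not updated).
d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to fool the discriminator (fake -> 1).
g_loss = bce(D(G(z)), torch.ones(64, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```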
- Module 12 - Attention and Transformers (a scaled dot-product attention sketch follows this list)
- Correcting the PyTorch tutorial on attention in seq2seq: 12_seq2seq_attention.ipynb
- Build your own microGPT: GPT_hist.ipynb
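
The core operation of Module 12, scaled dot-product attention, fits in a few lines (a single head, no masking; the shapes are illustrative):

```python
import math
import torch

def attention(q, k, v):
    # q, k, v: (batch, seq_len, d)
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # (batch, L, L)
    weights = scores.softmax(dim=-1)   # each query attends over all keys
    return weights @ v                 # weighted sum of the values

q = torch.randn(2, 5, 16)
k = torch.randn(2, 5, 16)
v = torch.randn(2, 5, 16)
out = attention(q, k, v)               # (2, 5, 16)
```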
- Module 9b - UNets
- Module 9c - Flows
- Build your own Real NVP: Normalizing_flows_empty.ipynb; a coupling-layer sketch follows
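
A sketch of one Real NVP affine coupling layer (a simplification of what the notebook builds; the conditioner network is illustrative): half the coordinates pass through unchanged and parametrize an invertible scale / shift of the other half:

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    def __init__(self, dim):
        super().__init__()
        half = dim // 2
        self.net = nn.Sequential(nn.Linear(half, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * half))

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=1)
        s, t = self.net(x1).chunk(2, dim=1)
        y2 = x2 * torch.exp(s) + t
        log_det = s.sum(dim=1)           # log|det J|: the Jacobian is triangular
        return torch.cat([x1, y2], dim=1), log_det

    def inverse(self, y):
        y1, y2 = y.chunk(2, dim=1)
        s, t = self.net(y1).chunk(2, dim=1)
        x2 = (y2 - t) * torch.exp(-s)
        return torch.cat([y1, x2], dim=1)

layer = AffineCoupling(dim=2)
x = torch.randn(8, 2)
y, log_det = layer(x)
x_back = layer.inverse(y)                # recovers x up to float error
```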
- Module 18a - Denoising Diffusion Probabilistic Models (the forward noising process is sketched after this list)
- Train your own DDPM on MNIST: ddpm_nano_empty.ipynb
- Finetuning on CIFAR10: ddpm_micro_sol.ipynb
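
The sketch announced above: the DDPM forward (noising) process in closed form, x_t = sqrt(alpha_bar_t) x_0 + sqrt(1 - alpha_bar_t) eps with eps ~ N(0, I), using an illustrative linear schedule (not the notebook's exact hyperparameters):

```python
import torch

T = 1000
betas = torch.linspace(1e-4, 0.02, T)            # linear noise schedule
alphas_bar = torch.cumprod(1.0 - betas, dim=0)   # cumulative products

def q_sample(x0, t):
    """Sample x_t given x_0 at integer timesteps t."""
    eps = torch.randn_like(x0)
    a = alphas_bar[t].view(-1, 1, 1, 1)          # broadcast over image dims
    return a.sqrt() * x0 + (1 - a).sqrt() * eps, eps

x0 = torch.randn(8, 1, 28, 28)                   # stand-in for MNIST images
t = torch.randint(0, T, (8,))
xt, eps = q_sample(x0, t)   # the network is trained to predict eps from (xt, t)
```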
All the modules:
- Module 1: Introduction & General Overview
- Module 2: PyTorch tensors and automatic differentiation
- Module 3: Loss functions for classification
- Module 4: Optimization for deep learning
- Module 6: Convolutional neural network
- Module 8: Embedding layers, Collaborative filtering and Word2vec
- Module 10 - Generative Adversarial Networks
- Module 11 - Recurrent Neural Networks and Batches with sequences in PyTorch
- Module 12 - Attention and Transformers
- Module 13 - Siamese Networks and Representation Learning
- Module 18a - Denoising Diffusion Probabilistic Models
- Module - Deep Learning on graphs
- NeRF
If you want to run the notebooks locally, follow the instructions in Module 0 - Running the notebooks locally.
Archives are available on the archive-2020 branch.