A set of notebooks exploring deep-learning-related topics.
CNN architectures: examine the structures of common CNN architectures, such as ResNet, ResNeXt, SENet, DenseNet, Inception-v4, WRN, Xception, Dual Path Networks, NASNet, Progressive Neural Architecture Search, and VGG, and show how to use them in fastai.
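The defining trick shared by ResNet and several of its descendants is the identity shortcut. A minimal NumPy sketch of a residual block, with a plain linear map standing in for the convolutions (all names here are illustrative, not taken from the notebooks):

```python
import numpy as np

def conv_like(x, w):
    """Stand-in for a conv layer: a plain linear map, enough to show the wiring."""
    return x @ w

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    """ResNet's core idea: the block learns a residual F(x) and adds
    the identity shortcut, so the output is relu(F(x) + x)."""
    out = relu(conv_like(x, w1))
    out = conv_like(out, w2)
    return relu(out + x)  # identity skip connection

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w1 = rng.normal(size=(8, 8)) * 0.1
w2 = rng.normal(size=(8, 8)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # (4, 8)
```

The shortcut means that when the residual branch outputs zero, the block reduces to (a ReLU of) the identity, which is what makes very deep stacks trainable.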
EfficientNet paper study: walk through the official implementation, including its building blocks, the mobile inverted residual (MBConv) blocks and the Squeeze-and-Excitation modules.
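The Squeeze-and-Excitation idea fits in a few lines: squeeze the spatial dimensions to per-channel statistics, excite them through a small bottleneck MLP, and rescale each channel. A simplified NumPy illustration (not the official EfficientNet code; shapes and names are for demonstration only):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def se_block(x, w1, w2):
    """Squeeze-and-Excitation: squeeze spatial dims to per-channel stats,
    excite through a small bottleneck MLP, then rescale each channel."""
    z = x.mean(axis=(1, 2))                  # squeeze: (C,)
    s = sigmoid(w2 @ np.maximum(w1 @ z, 0))  # excitation: per-channel weights in (0, 1)
    return x * s[:, None, None]              # rescale channels

rng = np.random.default_rng(0)
C, r = 16, 4                                # r is the reduction ratio
x = rng.normal(size=(C, 8, 8))              # one feature map, channels first
w1 = rng.normal(size=(C // r, C)) * 0.1     # reduction weights
w2 = rng.normal(size=(C, C // r)) * 0.1     # expansion weights
y = se_block(x, w1, w2)
print(y.shape)  # (16, 8, 8)
```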
WGAN paper study: replicate selected results from the WGAN paper.
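For reference, the critic side of the WGAN objective and the paper's weight clipping can be sketched with a toy linear critic in NumPy (illustrative only; replicating the paper's results requires real networks and training loops):

```python
import numpy as np

def critic(x, w):
    """Toy linear critic; WGAN only requires an unbounded real-valued score."""
    return x @ w

def wasserstein_critic_loss(real_scores, fake_scores):
    """The critic minimizes E[f(fake)] - E[f(real)], an estimate of the
    negative Wasserstein distance between the two distributions."""
    return fake_scores.mean() - real_scores.mean()

def clip_weights(w, c=0.01):
    """The original WGAN enforces the Lipschitz constraint by clipping
    every weight to [-c, c] after each critic update."""
    return np.clip(w, -c, c)

rng = np.random.default_rng(0)
real = rng.normal(loc=2.0, size=(64, 8))  # samples from the "data" distribution
fake = rng.normal(loc=0.0, size=(64, 8))  # samples from the "generator"
w = clip_weights(rng.normal(size=(8,)))
loss = wasserstein_critic_loss(critic(real, w), critic(fake, w))
print(loss)
```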
An easy way to do the backpropagation math: use a simple rule to derive backpropagation for many kinds of neural networks, such as LSTMs and CNNs.
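A common way to state such a rule is the chain rule applied operation by operation: take the upstream gradient and multiply it through each op's local derivative. A NumPy sketch for one affine layer with a squared-error loss, checked against a numerical gradient (this is an illustration, not necessarily the notebook's exact derivation):

```python
import numpy as np

# Forward: y = W x + b, loss L = 0.5 * ||y - t||^2.
# Backward, op by op: dL/dy = y - t, then
# dL/dW = (dL/dy) x^T, dL/db = dL/dy, dL/dx = W^T (dL/dy).
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))
x = rng.normal(size=(5,))
b = rng.normal(size=(3,))
t = rng.normal(size=(3,))

y = W @ x + b
dy = y - t            # dL/dy
dW = np.outer(dy, x)  # dL/dW
db = dy               # dL/db
dx = W.T @ dy         # dL/dx

# Sanity-check one entry of dW against a finite-difference gradient.
eps = 1e-6
W2 = W.copy(); W2[0, 0] += eps
num = (0.5 * np.sum((W2 @ x + b - t) ** 2) - 0.5 * np.sum((y - t) ** 2)) / eps
print(abs(num - dW[0, 0]) < 1e-4)  # True
```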
Resume interrupted 1cycle policy training: split a long training run into shorter segments and resume training without disturbing the 1cycle schedule.
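One way to make a 1cycle run resumable is to express the learning rate as a pure function of the global step, so a restarted segment recomputes the schedule from the saved step count. A sketch of this idea (the cosine shape and parameter names below are illustrative, not fastai's exact implementation):

```python
import math

def one_cycle_lr(step, total_steps, lr_max=1e-2, pct_start=0.25, div=25.0, final_div=1e4):
    """1cycle LR as a pure function of the global step: cosine warm-up from
    lr_max/div to lr_max over the first pct_start of training, then cosine
    anneal down to lr_max/final_div. Because it depends only on `step`,
    a resumed run continues the schedule exactly where it left off."""
    up = int(total_steps * pct_start)
    if step < up:
        pct, lo, hi = step / up, lr_max / div, lr_max
    else:
        pct, lo, hi = (step - up) / (total_steps - up), lr_max, lr_max / final_div
    return lo + (hi - lo) * (1 - math.cos(math.pi * pct)) / 2

total = 1000
full = [one_cycle_lr(s, total) for s in range(total)]
# "Interrupt" after 400 steps, saving only the step counter, then resume:
part1 = [one_cycle_lr(s, total) for s in range(0, 400)]
part2 = [one_cycle_lr(s, total) for s in range(400, total)]
print(full == part1 + part2)  # True
```

The two segments concatenate into exactly the uninterrupted schedule, which is the property that makes splitting a long 1cycle run safe.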
How the LSTM's memory works: dig into the LSTM's internal states to see how it manages to generate valid XML text.
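Inspecting those internal states is easier with the cell equations in hand. A single LSTM step in NumPy that returns its gates for inspection (the stacked weight layout is illustrative):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step that also returns the gates so they can be inspected.
    W: (4H, D), U: (4H, H), b: (4H,), stacked as [input, forget, cell, output]."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:H])        # input gate: how much to write to memory
    f = sigmoid(z[H:2*H])      # forget gate: how much old memory to keep
    g = np.tanh(z[2*H:3*H])    # candidate memory content
    o = sigmoid(z[3*H:4*H])    # output gate: how much memory to expose
    c_new = f * c + i * g      # the cell state is the LSTM's "memory"
    h_new = o * np.tanh(c_new)
    return h_new, c_new, (i, f, g, o)

rng = np.random.default_rng(0)
D, H = 6, 4
W = rng.normal(size=(4 * H, D)) * 0.1
U = rng.normal(size=(4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
h, c, gates = lstm_step(rng.normal(size=D), h, c, W, U, b)
print(h.shape, c.shape)  # (4,) (4,)
```

Plotting `c` and the gate values over a character sequence is one way to see which cell-state units track things like open XML tags.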
To be continued ...