[ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch impl. of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
https://user-images.githubusercontent.com/6366788/213662770-5f814de0-cbe8-48d9-8235-e8907fd81e0e.mp4
Basically, we use a lot of "Morph" transitions between slides. We first make a slide, duplicate it, and then move some objects on the duplicate to different positions, or add new objects. Applying a "Morph" transition between the two slides automatically animates the moved or added objects in a pretty smooth way.
The PowerPoint files are attached.