torch-optimizer -- collection of optimizers for PyTorch
Includes an unofficial PyTorch implementation of SWATS, an optimizer that switches from Adam to SGD during training (Keskar & Socher, 2017).
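
As an illustration, here is a minimal usage sketch. It assumes the package is importable as `torch_optimizer` and exposes a `SWATS` class following the standard PyTorch optimizer interface (`zero_grad`/`step`); the model, data, and `lr` value below are placeholders, not recommendations.

```python
# Minimal sketch: training a toy model with SWATS from torch-optimizer.
# Assumes `pip install torch-optimizer` and that the package exposes
# a SWATS optimizer class (model/data/lr here are illustrative only).
import torch
import torch_optimizer as optim

# Toy regression setup for illustration.
model = torch.nn.Linear(10, 1)
x = torch.randn(32, 10)
y = torch.randn(32, 1)

optimizer = optim.SWATS(model.parameters(), lr=1e-1)
loss_fn = torch.nn.MSELoss()

for _ in range(100):
    optimizer.zero_grad()          # clear accumulated gradients
    loss = loss_fn(model(x), y)    # forward pass
    loss.backward()                # backward pass
    optimizer.step()               # SWATS update (Adam phase, then SGD)
```

Because SWATS follows the usual `torch.optim` interface, it can be dropped into an existing training loop in place of `torch.optim.Adam` or `torch.optim.SGD` without other changes.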