Awesome Knowledge Distillation
PyTorch implementations of various Knowledge Distillation (KD) methods.
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
PyContinual (An Easy and Extendible Framework for Continual Learning)
Code and dataset for the ACL 2018 paper "Exploiting Document Knowledge for Aspect-level Sentiment Classification"
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Code and pretrained models for paper: Data-Free Adversarial Distillation
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020)
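All of the repositories above build on the same core objective: matching a student's temperature-softened output distribution to a teacher's. As a quick orientation, here is a minimal NumPy sketch of that classic Hinton-style distillation loss; the function names and example logits are illustrative, not taken from any of the listed codebases.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; subtract the max for numerical stability.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # as in Hinton et al.'s original soft-target formulation.
    p = softmax(teacher_logits, T)           # soft teacher targets
    log_q = np.log(softmax(student_logits, T))
    return (T ** 2) * np.sum(p * (np.log(p) - log_q), axis=-1).mean()

teacher = np.array([[10.0, 5.0, 1.0]])
print(kd_loss(teacher.copy(), teacher))                    # identical logits -> ~0 loss
print(kd_loss(np.array([[1.0, 5.0, 10.0]]), teacher) > 0)  # mismatched logits -> positive loss
```

In practice this term is combined with the usual cross-entropy on the ground-truth labels, weighted by a mixing coefficient; the feature-based methods listed above add further losses on intermediate activations.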