This repository collects papers for "A Survey on Knowledge Distillation ...
Training & evaluation library for text-based neural re-ranking and dense...
[ICLR 2023] "More ConvNets in the 2020s: Scaling up Kernels Beyond 51x51...
Using Teacher Assistants to Improve Knowledge Distillation: https://arxi... (a minimal TAKD sketch follows this list)
FasterAI: Prune and Distill your models with FastAI and PyTorch
SlimSAM: 0.1% Data Makes Segment Anything Slim
[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation (see the second sketch after this list)
A large-scale study of Knowledge Distillation.
An Object Detection Knowledge Distillation framework powered by PyTorch,...
Knowledge distillation in text classification with PyTorch.
Code for NeurIPS 2022 paper "Knowledge Distillation Improves Graph Struc...
Code for "Lion: Adversarial Distillation of Proprietary Large Language M...
'NKD and USKD' (ICCV 2023) and 'ViTKD'
This repository aims to provide efficient CNNs for Audio Tagging. We p...
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Frame...
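The teacher-assistant entry above (TAKD) bridges a large teacher-student capacity gap by distilling in two hops instead of one. A minimal PyTorch sketch, assuming the standard temperature-softened KL distillation loss; `teacher`, `assistant`, and `student` are hypothetical models of decreasing size, not names from the repository:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=4.0):
    # Soft-target loss: KL(teacher || student) at temperature T,
    # scaled by T^2 to keep gradient magnitudes comparable across T.
    log_p_s = F.log_softmax(student_logits / T, dim=-1)
    p_t = F.softmax(teacher_logits / T, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)

def distill_step(teacher, student, x, y, alpha=0.5, T=4.0):
    # One training step: blend the soft-target loss with hard-label CE.
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    return alpha * kd_loss(s_logits, t_logits, T) \
        + (1 - alpha) * F.cross_entropy(s_logits, y)

# TAKD runs the same step in two stages:
#   stage 1: large teacher -> mid-sized assistant
#     loss = distill_step(teacher, assistant, x, y)
#   stage 2: assistant -> small student
#     loss = distill_step(assistant, student, x, y)
```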
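The logit-standardization entry modifies that same soft-target loss: both sets of logits are z-score standardized per sample before the temperature softmax, so the student is not forced to match the teacher's logit scale. A rough sketch under that reading of the paper; the official CVPR 2024 code may differ in details such as where the temperature is applied:

```python
import torch.nn.functional as F

def standardize(logits, eps=1e-7):
    # Z-score per sample: zero mean, unit standard deviation over classes.
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (std + eps)

def std_kd_loss(student_logits, teacher_logits, T=2.0):
    # Standardize both sides, then apply the usual temperature-softened KL.
    log_p_s = F.log_softmax(standardize(student_logits) / T, dim=-1)
    p_t = F.softmax(standardize(teacher_logits) / T, dim=-1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * (T * T)
```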