An open-source AutoML toolkit for automating the machine learning lifecycle, i...
Efficient AI Backbones including GhostNet, TNT and MLP, developed by Hua...
Awesome Knowledge Distillation
Pretrained language model and its related optimization techniques develo...
An Automatic Model Compression (AutoMC) framework for developing smaller...
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
micronet, a model compression and deployment library. Compression: 1. quantizati...
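Several of the libraries here offer quantization. As a rough illustration of the idea (not micronet's actual API), a minimal sketch of symmetric uniform quantization with a per-tensor scale; the function names are hypothetical:

```python
def quantize(values, num_bits=8):
    # Symmetric uniform quantization: map floats to signed integers in
    # [-(2^(b-1)-1), 2^(b-1)-1] using a single per-tensor scale.
    qmax = 2 ** (num_bits - 1) - 1
    scale = max(abs(v) for v in values) / qmax or 1.0  # avoid scale == 0
    return [round(v / scale) for v in values], scale

def dequantize(quantized, scale):
    # Recover approximate float values from the integer codes.
    return [q * scale for q in quantized]
```

Real toolkits add per-channel scales, zero points for asymmetric ranges, and quantization-aware training; this only shows the core rounding step.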
A curated list of neural network pruning resources.
A PyTorch implementation for exploring deep and shallow knowledge distil...
PaddleSlim is an open-source library for deep model compression and arch...
A toolkit to optimize Keras and TensorFlow ML models for deployment,...
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
PyTorch implementation of various Knowledge Distillation (KD) methods.
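The KD repos above all build on the same core objective: match the student's temperature-softened output distribution to the teacher's. A minimal stdlib sketch of that loss (Hinton-style distillation; not taken from any repo listed here):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions
    # that expose the teacher's "dark knowledge" about wrong classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.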
Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
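The ICCV'17 paper selects channels via LASSO regression and least-squares reconstruction; a much simpler magnitude-based criterion conveys the structural idea of channel pruning (hypothetical helper, not the paper's method):

```python
def prune_channels(weights, keep_ratio=0.5):
    # weights: list of per-channel filters, each a flat list of floats.
    # Rank channels by L1 norm and keep the strongest keep_ratio fraction,
    # returning the sorted indices of surviving channels.
    norms = [sum(abs(w) for w in ch) for ch in weights]
    k = max(1, int(len(weights) * keep_ratio))
    keep = sorted(range(len(weights)), key=lambda i: norms[i], reverse=True)[:k]
    return sorted(keep)
```

Removing whole channels (rather than individual weights) shrinks the actual tensor shapes, so the pruned network runs faster without sparse-kernel support.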
Efficient computing methods developed by Huawei Noah's Ark Lab