An open source AutoML toolkit for automating the machine learning lifecycle, i...
Efficient AI Backbones including GhostNet, TNT and MLP, developed by Hua...
Awesome Knowledge Distillation
Pretrained language models and related optimization techniques develo...
An Automatic Model Compression (AutoMC) framework for developing smaller...
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
A curated list of neural network pruning resources.
micronet, a model compression and deployment library. Compression: 1. quantizati...
A PyTorch implementation for exploring deep and shallow knowledge distil...
PaddleSlim is an open-source library for deep model compression and arch...
Pytorch implementation of various Knowledge Distillation (KD) methods.
A toolkit to optimize ML models for deployment for Keras and TensorFlow,...
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Efficient computing methods developed by Huawei Noah's Ark Lab
Channel Pruning for Accelerating Very Deep Neural Networks (ICCV'17)
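Several entries above collect knowledge distillation resources. As a quick orientation to what those libraries implement, the classic soft-label distillation loss (Hinton et al., 2015) can be sketched in plain NumPy; the function names here are illustrative, not the API of any listed toolkit:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T flattens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z -= z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude as T varies.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient.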
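The pruning resources listed above (the curated pruning list, channel pruning, micronet) revolve around removing low-importance weights or channels. A minimal unstructured magnitude-pruning sketch in NumPy, again illustrative rather than any listed library's implementation:

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    # Zero out roughly the smallest-magnitude `sparsity` fraction of weights.
    w = np.asarray(weights, dtype=float)
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return w * (np.abs(w) > thresh)
```

Structured (channel) pruning, as in the ICCV'17 entry above, instead scores and removes whole channels so the resulting network stays dense and hardware-friendly.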