Neural Network Distiller by Intel AI Lab: a Python package for neural ne...
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.
A PyTorch-based knowledge distillation toolkit for natural language proc...
PaddleSlim is an open-source library for deep model compression and arch...
PyTorch implementation of various Knowledge Distillation (KD) methods.
PyTorch implementation of various methods for continual learning (XdG, E...
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer B...
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and specialized similarity models.
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet w...
Prompt engineering for developers
Segmind Distilled diffusion
A Python library for adversarial machine learning focusing on benchmarki...
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joi...
BERT distillation (distillation experiments based on BERT).
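Most of the repositories above implement variants of the same core idea: train a small student model to match a large teacher's temperature-softened output distribution. A minimal, dependency-free sketch of that loss (function names are illustrative, not taken from any listed repo):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # as in Hinton et al. (2015); in practice this is combined with a
    # standard cross-entropy term on the ground-truth labels.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2
```

When student and teacher logits agree, the loss is zero; it grows as the student's distribution drifts from the teacher's.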