DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal K...
Distillation of a BERT model with the Catalyst framework
Adaptive, interpretable wavelets across domains (NeurIPS 2021)
CVPR 2021: Zero-Shot Adversarial Quantization (ZAQ)
Knowledge Base Embedding By Cooperative Knowledge Distillation
Our open source implementation of MiniLMv2 (https://aclanthology.org/202...
This is a RoBERTa-wwm base distilled model, which was distilled from robert...
[ECCV 2020] Code release for "Resolution Switchable Networks for Runtime...
Interpretable and efficient predictors using pre-trained language models...
Distilling Task-Specific Knowledge from a Teacher Model into a BiLSTM
:scissors: Dataset Culling: Faster training of domain-specific models w...
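Several of the repositories above implement the classic soft-label knowledge distillation objective (Hinton et al., 2015): the student is trained to match the teacher's temperature-softened output distribution. A minimal, dependency-free sketch of that loss; the function names are my own, not taken from any repo listed here:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this term is combined with the ordinary cross-entropy on hard labels via a weighting coefficient; identical teacher and student logits give zero loss, and the loss grows as the two distributions diverge.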