Knowledge Distillation Pytorch

A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility
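The repository covers its own training setups; as a rough illustration of the core idea, below is a minimal sketch of a Hinton-style knowledge distillation loss in PyTorch. The function name `kd_loss` and the temperature/weight defaults are illustrative assumptions, not taken from this project.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of (a) KL divergence between temperature-softened
    student and teacher distributions and (b) cross-entropy on hard labels."""
    # Soft targets: KL(student || teacher) at temperature T, scaled by T^2
    # so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy with ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage example with random tensors standing in for a batch of logits.
if __name__ == "__main__":
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = kd_loss(student_logits, teacher_logits, labels)
    loss.backward()
    print(loss.item())
```

A deeper or shallower student simply swaps the network producing `student_logits`; the distillation objective itself is unchanged.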
