A research library for PyTorch-based neural network pruning, compression, and more.
The new release contains code for a new paper with accompanying code, comparison methods, models, and datasets.
In addition to the previous papers that were covered by this codebase (ALDS, PFP, SiPP, Lost), we also extended the repository to include our latest paper on pruning neural ODEs, which was presented at NeurIPS 2021:
Sparse Flows: Pruning Continuous-depth Models
Check out the READMEs for more info.
The new release contains code for a new paper with accompanying code, comparison methods, models, and datasets.
In addition to the previous papers that were covered by this codebase (PFP, SiPP, Lost), we also extended the repository to include our latest paper on pruning, which will be presented at NeurIPS 2021:
Compressing Neural Networks: Towards Determining the Optimal Layer-wise Decomposition
Check out the READMEs for more info.
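The layer-wise decomposition that the ALDS paper studies can be illustrated with a generic low-rank factorization of a single weight matrix. The following is a minimal NumPy sketch of the general idea only, not the ALDS algorithm itself; ALDS determines per-layer ranks automatically, whereas the rank `k` here is fixed arbitrarily, and all shapes are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# A dense "layer" weight matrix, e.g. a 256x512 fully-connected layer.
W = rng.standard_normal((256, 512))

# Truncated SVD: keep only the top-k singular values/vectors.
k = 32  # target rank (fixed here; ALDS would choose this per layer)
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]  # shape (256, k)
B = Vt[:k, :]         # shape (k, 512)

# The single layer W @ x is replaced by two smaller layers A @ (B @ x).
params_before = W.size
params_after = A.size + B.size
rel_error = np.linalg.norm(W - A @ B) / np.linalg.norm(W)

print(f"parameters: {params_before} -> {params_after}")
print(f"relative approximation error: {rel_error:.3f}")
```

The compression comes from replacing one `m x n` matrix with an `m x k` and a `k x n` factor, which pays off whenever `k < m*n / (m + n)`.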
Detailed release update:
- ALDS algorithm in torchprune

The new release contains major overhauls and improvements to the code base.
In addition to the previous two papers that were covered by this code base (PFP and SiPP), we also extended it to include our latest paper on pruning, presented at MLSys 2021:
Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy
Check out the READMEs for more info.
Bug fixes, visualization updates, better logging, improved readability, and a simplified compression sub-module.
There was a bug in distributed training with more than one GPU that caused training to stall at the end of the last epoch.
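The release notes do not state the root cause, but a frequent source of such multi-GPU stalls is ranks processing unequal numbers of batches, so one rank waits in a collective (e.g. a gradient all-reduce) that the others never enter. As a hedged sketch of the general mechanism, `torch.utils.data.distributed.DistributedSampler` keeps per-rank batch counts equal by padding (or, with `drop_last=True`, truncating) the dataset; the sampler is constructed with explicit `num_replicas`/`rank` below only so the example runs without an initialized process group:

```python
import torch
from torch.utils.data import TensorDataset
from torch.utils.data.distributed import DistributedSampler

# A toy dataset whose size (10) does not divide the number of replicas (3).
dataset = TensorDataset(torch.arange(10))

# By default the sampler pads with repeated samples so that every rank sees
# the same number of items (10 padded to 12 here, i.e. 4 per rank); with
# drop_last=True it truncates instead. Either way, all ranks iterate the same
# number of batches, so no rank stalls waiting on a collective.
for rank in range(3):
    sampler = DistributedSampler(dataset, num_replicas=3, rank=rank, shuffle=False)
    print(f"rank {rank}: {len(sampler)} samples")
```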
This is the version of the code as originally published for the ICLR'20 paper Provable Filter Pruning for Efficient Neural Networks.
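The filter-pruning setting of the paper can be illustrated with a generic magnitude-based variant. This is a minimal NumPy sketch of the general idea only: PFP itself derives sensitivity-based sampling with provable guarantees, not the plain l1 ranking shown here, and the shapes and keep-ratio below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights of one conv layer: (out_channels, in_channels, kH, kW).
W = rng.standard_normal((64, 32, 3, 3))

# Score each filter (output channel) by its l1 norm.
scores = np.abs(W).sum(axis=(1, 2, 3))

# Keep the 50% highest-scoring filters and drop the rest.
keep = np.sort(np.argsort(scores)[len(scores) // 2:])
W_pruned = W[keep]

print(W.shape, "->", W_pruned.shape)  # (64, 32, 3, 3) -> (32, 32, 3, 3)
```

Note that removing output channels of one layer also shrinks the input-channel dimension of the following layer, which is where the actual parameter savings compound across the network.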