Distiller Versions

Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller

v0.3.1

5 years ago

Tagging the 'master' branch before performing a few API-breaking changes.

v0.3.0

5 years ago
  • Supports PyTorch 1.0.1
  • Supports installation as a Python package (see the sketch after this list)
  • Many features (list TBD) added since release v0.2.0 (PyTorch 0.4)
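
Since this release, Distiller can be installed and imported as a regular Python package (e.g. `pip install -e .` from a clone of the repository). Below is a minimal sketch of attaching a compression schedule to a model; `distiller.file_config` and the scheduler callbacks follow Distiller's sample applications, but treat the exact signatures and the schedule path as assumptions:

```python
# Sketch only: assumes distiller.file_config as used in Distiller's sample
# apps, and a hypothetical YAML schedule file "schedule.yaml".
import torch
import torchvision
import distiller

model = torchvision.models.resnet18()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Build a CompressionScheduler from the YAML schedule file.
scheduler = distiller.file_config(model, optimizer, "schedule.yaml")

for epoch in range(10):
    scheduler.on_epoch_begin(epoch)
    # ... usual training loop, wrapping each step with the scheduler's
    # on_minibatch_begin()/on_minibatch_end() callbacks ...
    scheduler.on_epoch_end(epoch)
```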

v0.2.0

5 years ago
  • PyTorch 0.4 support
  • An implementation of Baidu's RNN pruning paper from ICLR 2017: Narang, Sharan; Diamos, Gregory; Sengupta, Shubho; Elsen, Erich (2017). "Exploring Sparsity in Recurrent Neural Networks" (https://arxiv.org/abs/1704.05119)
  • Added a word language model pruning example using AGP (automated gradual pruning) and Baidu's RNN pruning (see the AGP sketch after this list)
  • Quantization-aware training (4-bit quantization); a generic sketch also follows this list
  • New models: pre-activation ResNet for ImageNet and CIFAR, and AlexNet with batch-norm
  • New quantization documentation content
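
The AGP item above refers to automated gradual pruning (Zhu & Gupta, 2017, https://arxiv.org/abs/1710.01878), in which a layer's sparsity ramps from an initial to a final level along a cubic schedule. A self-contained sketch of that target-sparsity computation (names are illustrative, not Distiller's API):

```python
def agp_target_sparsity(step, start_step, end_step,
                        initial_sparsity, final_sparsity):
    """Target sparsity under automated gradual pruning (Zhu & Gupta, 2017).

    Sparsity ramps from initial_sparsity at start_step to final_sparsity
    at end_step along the cubic schedule
        s_t = s_f + (s_i - s_f) * (1 - (t - t0) / (tn - t0)) ** 3
    """
    if step <= start_step:
        return initial_sparsity
    if step >= end_step:
        return final_sparsity
    progress = (step - start_step) / (end_step - start_step)
    return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3
```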
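
The quantization-aware-training item works by simulating low-precision values in the forward pass while letting gradients flow at full precision (a straight-through estimator). A generic 4-bit fake-quantization sketch in PyTorch, not Distiller's own quantizer classes:

```python
import torch

class FakeQuantSTE(torch.autograd.Function):
    """Simulate k-bit linear quantization in the forward pass and pass
    gradients through unchanged (straight-through estimator)."""

    @staticmethod
    def forward(ctx, x, num_bits):
        qmax = 2 ** num_bits - 1
        zero_point = x.min()
        scale = (x.max() - zero_point).clamp(min=1e-8) / qmax
        q = torch.round((x - zero_point) / scale)  # integers in [0, qmax]
        return q * scale + zero_point              # dequantize back to float

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output, None  # identity gradient; None for num_bits

x = torch.randn(8, requires_grad=True)
y = FakeQuantSTE.apply(x, 4)   # forward sees 4-bit-quantized values
y.sum().backward()             # backward receives full-precision gradients
```

The straight-through estimator is what makes the 4-bit forward pass trainable: the rounding step has zero gradient almost everywhere, so it is simply skipped in the backward pass.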

v0.1.0

5 years ago

We're tagging this version, which uses PyTorch 0.3, because we want to move the 'master' branch to support PyTorch 0.4 and its API changes.