PyCIL Versions

PyCIL: A Python Toolbox for Class-Incremental Learning

v0.2.1

10 months ago

In this update, we add support for state-of-the-art methods from 2023:

  • MEMO: A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning. ICLR 2023 Spotlight [paper]
  • BEEF: BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion. ICLR 2023 [paper]
  • SimpleCIL: Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need. arXiv 2023 [paper]

Cheers!

v0.1

1 year ago

This is the first GitHub release of PyCIL. The reproduced methods are listed below:

  • FineTune: Baseline method that simply updates parameters on new tasks and therefore suffers from catastrophic forgetting. By default, the weights corresponding to the outputs of previous classes are not updated (see the sketch after this list).
  • EWC: Overcoming catastrophic forgetting in neural networks. PNAS 2017 [paper]
  • LwF: Learning without Forgetting. ECCV 2016 [paper]
  • Replay: Baseline method that stores exemplars of old classes and rehearses them alongside new data (also covered in the sketch after this list).
  • GEM: Gradient Episodic Memory for Continual Learning. NIPS 2017 [paper]
  • iCaRL: Incremental Classifier and Representation Learning. CVPR 2017 [paper]
  • BiC: Large Scale Incremental Learning. CVPR 2019 [paper]
  • WA: Maintaining Discrimination and Fairness in Class Incremental Learning. CVPR 2020 [paper]
  • PODNet: PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning. ECCV 2020 [paper]
  • DER: DER: Dynamically Expandable Representation for Class Incremental Learning. CVPR 2021 [paper]
  • PASS: Prototype Augmentation and Self-Supervision for Incremental Learning. CVPR 2021 [paper]
  • RMM: RMM: Reinforced Memory Management for Class-Incremental Learning. NeurIPS 2021 [paper]
  • IL2A: Class-Incremental Learning via Dual Augmentation. NeurIPS 2021 [paper]
  • SSRE: Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning. CVPR 2022 [paper]
  • FeTrIL: Feature Translation for Exemplar-Free Class-Incremental Learning. WACV 2023 [paper]
  • Coil: Co-Transport for Class-Incremental Learning. ACM MM 2021 [paper]
  • FOSTER: Feature Boosting and Compression for Class-Incremental Learning. ECCV 2022 [paper]
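
As a rough illustration of the two baselines described above, here is a minimal, self-contained PyTorch sketch of FineTune (exclude the logits of previously seen classes from the loss, so their output weights stay unchanged) and Replay (mix stored exemplars of old classes into each batch). This is not PyCIL's actual implementation; the class and function names (IncrementalNet, finetune_step, replay_step) and the toy dimensions are illustrative only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class IncrementalNet(nn.Module):
    """Tiny feature extractor + linear head whose output grows with each task."""
    def __init__(self, feat_dim=64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(32, feat_dim), nn.ReLU())
        self.fc = None  # classifier head is created/expanded per task

    def expand(self, new_classes):
        """Add output units for the new task, copying over the old weights."""
        old = self.fc
        total = new_classes + (old.out_features if old is not None else 0)
        self.fc = nn.Linear(self.backbone[0].out_features, total)
        if old is not None:
            with torch.no_grad():
                self.fc.weight[: old.out_features] = old.weight
                self.fc.bias[: old.out_features] = old.bias

    def forward(self, x):
        return self.fc(self.backbone(x))

def finetune_step(model, x, y, known_classes, optimizer):
    """FineTune baseline: train on new-task data only. Old-class logits are
    excluded from the loss, so (with plain SGD) their output weights stay fixed."""
    logits = model(x)
    loss = F.cross_entropy(logits[:, known_classes:], y - known_classes)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def replay_step(model, x, y, mem_x, mem_y, optimizer):
    """Replay baseline: mix stored exemplars of old classes into the batch."""
    xb, yb = torch.cat([x, mem_x]), torch.cat([y, mem_y])
    loss = F.cross_entropy(model(xb), yb)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    model = IncrementalNet()
    model.expand(10)                                  # task 0: classes 0-9
    model.expand(10)                                  # task 1: classes 10-19
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    x, y = torch.randn(8, 32), torch.randint(10, 20, (8,))          # new-task batch
    mem_x, mem_y = torch.randn(4, 32), torch.randint(0, 10, (4,))   # stored exemplars
    print("finetune loss:", finetune_step(model, x, y, known_classes=10, optimizer=opt))
    print("replay loss:  ", replay_step(model, x, y, mem_x, mem_y, optimizer=opt))
```

The toolbox itself wraps these ideas in full training loops with datasets, exemplar management, and configurable backbones; the sketch only shows where the two baselines differ.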

Stay tuned for more state-of-the-art methods in PyCIL!