PyCIL: A Python Toolbox for Class-Incremental Learning
Introduction • Methods Reproduced • Reproduced Results • How To Use • License • Acknowledgments • Contact
Welcome to PyCIL, perhaps the most comprehensive toolbox for class-incremental learning in terms of implemented methods. This is the code repository for "PyCIL: A Python Toolbox for Class-Incremental Learning" [paper], implemented in PyTorch. If you use any content of this repo in your work, please cite the following bib entries:
```bibtex
@article{zhou2023pycil,
    author  = {Da-Wei Zhou and Fu-Yun Wang and Han-Jia Ye and De-Chuan Zhan},
    title   = {PyCIL: a Python toolbox for class-incremental learning},
    journal = {SCIENCE CHINA Information Sciences},
    year    = {2023},
    volume  = {66},
    number  = {9},
    pages   = {197101},
    doi     = {10.1007/s11432-022-3600-y}
}

@article{zhou2023class,
    author  = {Zhou, Da-Wei and Wang, Qi-Wei and Qi, Zhi-Hong and Ye, Han-Jia and Zhan, De-Chuan and Liu, Ziwei},
    title   = {Deep Class-Incremental Learning: A Survey},
    journal = {arXiv preprint arXiv:2302.03648},
    year    = {2023}
}

@article{zhou2024continual,
    author  = {Zhou, Da-Wei and Sun, Hai-Long and Ning, Jingyi and Ye, Han-Jia and Zhan, De-Chuan},
    title   = {Continual Learning with Pre-Trained Models: A Survey},
    journal = {arXiv preprint arXiv:2401.16386},
    year    = {2024}
}
```
Traditional machine learning systems are deployed under the closed-world assumption, where all training data is available before the offline training process begins. However, real-world applications often encounter new classes over time, and a model should incorporate them continually. This learning paradigm is called Class-Incremental Learning (CIL). We propose a Python toolbox that implements several key algorithms for class-incremental learning to ease the burden on researchers in the machine learning community. The toolbox contains implementations of a number of founding works of CIL, such as EWC and iCaRL, and also provides current state-of-the-art algorithms that can be used to conduct novel fundamental research. This toolbox, named PyCIL for Python Class-Incremental Learning, is open source under the MIT license.
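As a concrete illustration of the CIL protocol (this sketch is ours, not part of the toolbox's API), the label space is partitioned into a sequence of tasks, and the model sees only the new classes at each incremental stage:

```python
def make_incremental_splits(num_classes, init_cls, increment):
    """Partition class labels into incremental tasks: the first task
    holds `init_cls` classes; each later task adds `increment` more."""
    tasks = [list(range(init_cls))]
    start = init_cls
    while start < num_classes:
        end = min(start + increment, num_classes)
        tasks.append(list(range(start, end)))
        start = end
    return tasks

# Example: CIFAR100 split into an initial 10-class task plus 9 more
splits = make_incremental_splits(100, init_cls=10, increment=10)
print(len(splits))    # -> 10 incremental tasks
print(splits[0][:3])  # -> [0, 1, 2]
```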
For more information about incremental learning, you can refer to the survey papers cited above. The toolbox reproduces the following methods:
- `FineTune`: Baseline method which simply updates parameters on new tasks.
- `EWC`: Overcoming catastrophic forgetting in neural networks. PNAS 2017 [paper]
- `LwF`: Learning without Forgetting. ECCV 2016 [paper]
- `Replay`: Baseline method with exemplar replay.
- `GEM`: Gradient Episodic Memory for Continual Learning. NIPS 2017 [paper]
- `iCaRL`: Incremental Classifier and Representation Learning. CVPR 2017 [paper]
- `BiC`: Large Scale Incremental Learning. CVPR 2019 [paper]
- `WA`: Maintaining Discrimination and Fairness in Class Incremental Learning. CVPR 2020 [paper]
- `PODNet`: PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning. ECCV 2020 [paper]
- `DER`: DER: Dynamically Expandable Representation for Class Incremental Learning. CVPR 2021 [paper]
- `PASS`: Prototype Augmentation and Self-Supervision for Incremental Learning. CVPR 2021 [paper]
- `RMM`: RMM: Reinforced Memory Management for Class-Incremental Learning. NeurIPS 2021 [paper]
- `IL2A`: Class-Incremental Learning via Dual Augmentation. NeurIPS 2021 [paper]
- `SSRE`: Self-Sustaining Representation Expansion for Non-Exemplar Class-Incremental Learning. CVPR 2022 [paper]
- `FeTrIL`: Feature Translation for Exemplar-Free Class-Incremental Learning. WACV 2023 [paper]
- `Coil`: Co-Transport for Class-Incremental Learning. ACM MM 2021 [paper]
- `FOSTER`: Feature Boosting and Compression for Class-incremental Learning. ECCV 2022 [paper]
- `MEMO`: A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning. ICLR 2023 Spotlight [paper]
- `BEEF`: BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion. ICLR 2023 [paper]
- `SimpleCIL`: Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need. arXiv 2023 [paper]

More experimental details and results can be found in our survey.
Clone this GitHub repository:

```shell
git clone https://github.com/G-U-N/PyCIL.git
cd PyCIL
```

To run an experiment:

1. Edit the `[MODEL NAME].json` file for global settings.
2. Edit the hyper-parameters in the corresponding `[MODEL NAME].py` file (e.g., `models/icarl.py`).
3. Run:

```shell
python main.py --config=./exps/[MODEL NAME].json
```

where `[MODEL NAME]` should be chosen from `finetune`, `ewc`, `lwf`, `replay`, `gem`, `icarl`, `bic`, `wa`, `podnet`, `der`, etc.
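A config file along these lines drives one run. The field names below are illustrative of the kinds of settings involved; check the files shipped under `exps/` for the authoritative keys and values:

```json
{
    "dataset": "cifar100",
    "model_name": "icarl",
    "convnet_type": "resnet32",
    "init_cls": 10,
    "increment": 10,
    "memory_size": 2000,
    "shuffle": true,
    "seed": [1993],
    "device": ["0"]
}
```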
Hyper-parameters

When using PyCIL, you can edit the global parameters and algorithm-specific hyper-parameters in the corresponding JSON file. These parameters include, for example, the backbone network: by default, `ResNet32` is utilized for `CIFAR100`, and `ResNet18` is used for `ImageNet`. Other parameters related to model optimization, e.g., batch size, number of epochs, learning rate, learning rate decay, weight decay, milestones, and temperature, can be modified in the corresponding Python file.
We have implemented the pre-processing for `CIFAR100`, `imagenet100`, and `imagenet1000`. When training on `CIFAR100`, the framework will automatically download it. When training on `imagenet100/1000`, you should specify the folder of your dataset in `utils/data.py`:
```python
def download_data(self):
    # Replace the assert and the placeholder paths with your dataset location.
    assert 0, "You should specify the folder of your dataset"
    train_dir = '[DATA-PATH]/train/'
    test_dir = '[DATA-PATH]/val/'
```
Here is the file list of ImageNet100 (also known as ImageNet-Sub).
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning (CVPR 2024) [paper] [code]
- Continual Learning with Pre-Trained Models: A Survey (IJCAI 2024) [paper] [code]
- Deep Class-Incremental Learning: A Survey (arXiv 2023) [paper] [code]
- Learning without Forgetting for Vision-Language Models (arXiv 2023) [paper]
- Revisiting Class-Incremental Learning with Pre-Trained Models: Generalizability and Adaptivity are All You Need (arXiv 2023) [paper] [code]
- PILOT: A Pre-Trained Model-Based Continual Learning Toolbox (arXiv 2023) [paper] [code]
- Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration (NeurIPS 2023) [paper] [code]
- BEEF: Bi-Compatible Class-Incremental Learning via Energy-Based Expansion and Fusion (ICLR 2023) [paper] [code]
- A Model or 603 Exemplars: Towards Memory-Efficient Class-Incremental Learning (ICLR 2023) [paper] [code]
- Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks (TPAMI 2022) [paper] [code]
- FOSTER: Feature Boosting and Compression for Class-Incremental Learning (ECCV 2022) [paper] [code]
- Forward Compatible Few-Shot Class-Incremental Learning (CVPR 2022) [paper] [code]
- Co-Transport for Class-Incremental Learning (ACM MM 2021) [paper] [code]
- Towards Realistic Evaluation of Industrial Continual Learning Scenarios with an Emphasis on Energy Consumption and Computational Footprint (ICCV 2023) [paper] [code]
- Dynamic Residual Classifier for Class Incremental Learning (ICCV 2023) [paper] [code]
- S-Prompts Learning with Pre-trained Transformers: An Occam's Razor for Domain Incremental Learning (NeurIPS 2022) [paper] [code]
Please refer to the MIT license included in this repository.
We thank the following repositories for providing helpful components and functions used in our work.
The training flow and data configurations are based on Continual-Learning-Reproduce. The original information of the repo is available in the base branch.
If there are any questions, please feel free to propose new features by opening an issue, or contact the authors: Da-Wei Zhou ([email protected]) and Fu-Yun Wang ([email protected]). Enjoy the code!