MDCS: More Diverse Experts with Consistency Self-distillation for Long-tailed Recognition [Official, ICCV 2023, Paper] 🔥
Qihao Zhao1,2, Chen Jiang1, Wei Hu1, Fan Zhang1, Jun Liu2
1 Beijing University of Chemical Technology
2 Singapore University of Technology and Design
If you find our work inspiring or use our codebase in your research, please consider giving it a star ⭐ and a citation.
@InProceedings{Zhao_2023_ICCV,
author = {Zhao, Qihao and Jiang, Chen and Hu, Wei and Zhang, Fan and Liu, Jun},
title = {MDCS: More Diverse Experts with Consistency Self-distillation for Long-tailed Recognition},
booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
month = {October},
year = {2023},
pages = {11597-11608}
}
- Train on CIFAR-100-LT (imbalance ratio 100):
python train.py -c configs/config_cifar100_ir100_mdcs.json
- Evaluate:
python test.py -r checkpoint_path
- Train on ImageNet-LT (e.g., ResNeXt-50 for 400 epochs):
python train.py -c configs/config_imagenet_lt_resnext50_mdcs_e400.json
- Evaluate:
python test.py -r checkpoint_path
- Train on Places-LT (ResNet-152):
python train_places.py -c configs/config_places_lt_resnet152_mdcs.json
- Evaluate:
python test_places.py -r checkpoint_path
- Train on iNaturalist 2018 (ResNet-50):
python train.py -c configs/config_iNaturalist_resnet50_mdcs.json
- Evaluate:
python test.py -r checkpoint_path
- To install requirements:
pip install -r requirements.txt
- Run on Linux (there may be some problems on Windows).
- Please download these datasets and place them under the ./data directory.
- ImageNet-LT and Places-LT can be found here.
- The iNaturalist data should be the 2018 version from here.
- CIFAR-100 will be downloaded automatically by the dataloader (see the sketch after the directory layout below).
data
├── ImageNet_LT
│   ├── test
│   ├── train
│   └── val
├── CIFAR100
│   └── cifar-100-python
├── Place365
│   ├── data_256
│   ├── test_256
│   └── val_256
└── iNaturalist
    ├── test2018
    └── train_val2018
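Only CIFAR-100 is fetched automatically. As a rough illustration of what that entails, the sketch below downloads CIFAR-100 via torchvision and subsamples it with the common exponential long-tailed profile; the helper name and the imbalance-ratio construction are assumptions for illustration, not the repository's own dataloader.

```python
# Illustrative sketch only: the repository's dataloader builds CIFAR-100-LT itself.
# This shows the usual torchvision download plus an exponential long-tailed subsample.
import numpy as np
from torchvision import datasets

def long_tailed_indices(targets, num_classes=100, imb_ratio=100, max_per_class=500):
    """Keep an exponentially decaying number of samples per class
    (assumed standard CIFAR-100-LT construction, imbalance ratio = max/min)."""
    targets = np.asarray(targets)
    keep = []
    for c in range(num_classes):
        n_c = int(max_per_class * (1.0 / imb_ratio) ** (c / (num_classes - 1)))
        keep.extend(np.where(targets == c)[0][:n_c].tolist())
    return keep

# download=True fetches cifar-100-python into ./data automatically
train_set = datasets.CIFAR100(root="./data", train=True, download=True)
indices = long_tailed_indices(train_set.targets, imb_ratio=100)
print(f"full: {len(train_set)}, long-tailed subset: {len(indices)}")
```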
- We provide txt files for test-agnostic long-tailed recognition on ImageNet-LT, Places-LT, and iNaturalist 2018; the CIFAR-100 lists are generated automatically by the code (a minimal parsing sketch follows the layout below).
- For iNaturalist 2018, please unzip the iNaturalist_train.zip.
data_txt
├── ImageNet_LT
│   ├── ImageNet_LT_test.txt
│   ├── ImageNet_LT_train.txt
│   └── ImageNet_LT_val.txt
├── Places_LT_v2
│   ├── Places_LT_test.txt
│   ├── Places_LT_train.txt
│   └── Places_LT_val.txt
└── iNaturalist18
    ├── iNaturalist18_train.txt
    ├── iNaturalist18_uniform.txt
    └── iNaturalist18_val.txt
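These list files are plain text, and each line commonly pairs a relative image path with an integer class label. Below is a minimal sketch of a dataset that reads such a list, assuming that "path label" format; the class name ListDataset and the example paths are illustrative, not part of the repository.

```python
# Minimal sketch of a dataset over the provided list files, assuming each line
# holds "relative/image/path <integer label>" (the common ImageNet-LT list format).
import os
from PIL import Image
from torch.utils.data import Dataset

class ListDataset(Dataset):
    def __init__(self, root, txt_path, transform=None):
        self.root, self.transform = root, transform
        self.samples = []
        with open(txt_path) as f:
            for line in f:
                path, label = line.strip().rsplit(" ", 1)
                self.samples.append((path, int(label)))

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        path, label = self.samples[idx]
        img = Image.open(os.path.join(self.root, path)).convert("RGB")
        if self.transform is not None:
            img = self.transform(img)
        return img, label

# e.g. ListDataset("data/ImageNet_LT", "data_txt/ImageNet_LT/ImageNet_LT_train.txt")
```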
- For training on Places-LT, we follow previous methods and use a pre-trained ResNet-152 model.
- Please download the checkpoint, then unzip it and move the checkpoint files to model/pretrained_model_places/.
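As a rough illustration of how such a checkpoint is typically loaded, the snippet below restores a ResNet-152 state dict into a torchvision backbone; the checkpoint filename is hypothetical, and the repository's own model wrapper and key names may differ.

```python
# Hedged sketch: load the Places-LT pre-trained ResNet-152 weights into a backbone.
# The filename below is hypothetical; point it at the downloaded checkpoint file.
import torch
from torchvision.models import resnet152

ckpt = torch.load("model/pretrained_model_places/resnet152.pth", map_location="cpu")
state_dict = ckpt.get("state_dict", ckpt)  # some checkpoints wrap the weights
state_dict = {k.replace("module.", ""): v for k, v in state_dict.items()}  # strip DataParallel prefix
backbone = resnet152(weights=None)
missing, unexpected = backbone.load_state_dict(state_dict, strict=False)
print("missing keys:", len(missing), "unexpected keys:", len(unexpected))
```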
The multi-expert framework is based on SADE and RIDE. Strong augmentations are based on NCL and PaCo.
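For readers who only want the gist of the consistency self-distillation named in the title, here is a heavily simplified sketch: the prediction on a weakly augmented view acts as a detached teacher for the prediction on a strongly augmented view via a temperature-scaled KL divergence. This is an assumption-laden illustration, not the authors' implementation; the shapes, temperature value, and names are made up for the example.

```python
# Illustrative sketch of consistency self-distillation (not the official code):
# the weak-view prediction serves as a detached teacher for the strong view.
import torch
import torch.nn.functional as F

def consistency_self_distillation(logits_weak, logits_strong, temperature=2.0):
    """KL(teacher || student) between weakly and strongly augmented views."""
    teacher = F.softmax(logits_weak.detach() / temperature, dim=1)
    student = F.log_softmax(logits_strong / temperature, dim=1)
    return F.kl_div(student, teacher, reduction="batchmean") * temperature ** 2

# toy usage with hypothetical expert outputs of shape (batch, num_classes)
loss = consistency_self_distillation(torch.randn(8, 100), torch.randn(8, 100))
print(loss.item())
```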