
How green is continual learning, really? Analyzing the Energy Consumption in Continual Training of Vision Foundation Models (GreenFOMO@ECCV2024)


This repository contains the official code for the spotlight paper "How Green is Continual Learning, Really? Analyzing the Energy Consumption in Continual Training of Vision Foundation Models", presented at the GreenFOMO Workshop at ECCV 2024. The paper analyzes the environmental impact of continually training vision foundation models, benchmarking their energy consumption and offering insights into their sustainability. (arXiv:2409.18664)

Authors

Tomaso Trinci, Simone Magistri, Roberto Verdecchia, Andrew D. Bagdanov

Overview

This project builds on PILOT, integrating energy tracking with CodeCarbon to measure carbon emissions and energy usage during model training.
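
A minimal sketch of how CodeCarbon can wrap a training step with its EmissionsTracker; the dummy workload below stands in for one incremental training task and is not this repository's actual instrumentation:

    from codecarbon import EmissionsTracker

    tracker = EmissionsTracker()  # by default writes emissions.csv to the working directory
    tracker.start()
    try:
        total = sum(i * i for i in range(10_000_000))  # stand-in for a training step
    finally:
        emissions_kg = tracker.stop()  # estimated emissions in kg of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")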

Installation

  1. Clone the repository:

     git clone https://github.com/CodingTomo/how-green-continual-learning.git

  2. Install the dependencies from PILOT following its instructions.

  3. Install the CodeCarbon package to enable energy tracking:

     pip install codecarbon

For troubleshooting CodeCarbon, refer to its official repository.

Usage

To run training experiments with energy tracking:

python main.py --config exps/METHOD_NAME.yaml

Choose a different METHOD_NAME.yaml from the exps folder to switch between continual learning methods.

Dataset

To replicate the paper's experiments on DN4IL, use the splits provided in the dn_split folder. Place these files in the dataset directory before training.
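
A hypothetical way to place the splits, assuming the repository root is your working directory; data/dn4il is a placeholder for your actual dataset directory:

    import shutil
    from pathlib import Path

    src = Path("dn_split")    # splits shipped with this repository
    dst = Path("data/dn4il")  # placeholder: your dataset directory
    dst.mkdir(parents=True, exist_ok=True)
    for split_file in src.glob("*"):
        shutil.copy2(split_file, dst / split_file.name)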

Output

  • The logs folder contains three CSV files that report the training energy consumption of the CPU, GPU, and memory at different levels of granularity: per epoch, per task, and total consumption. An additional CSV file reports the inference energy consumption for 10,000 requests.

  • The METHOD_NAME_gpu_inference_time.npy file reports the inference time for each of the 10,000 requests issued after the final incremental training step (see the sketch after this list).

  • The per_step_incremental_accuracy.txt file reports the accuracy after each incremental step, while the average_incremental_accuracy.txt file reports the average accuracy over the whole incremental training process.
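
A short sketch for inspecting the saved inference times; the file name depends on the method you ran, and the assumption that the values are per-request times in seconds is ours, not stated by the paper:

    import numpy as np

    # METHOD_NAME is a placeholder for the method actually benchmarked
    times = np.load("METHOD_NAME_gpu_inference_time.npy")
    print(f"requests: {times.size}")
    print(f"mean: {times.mean():.6f}  p95: {np.percentile(times, 95):.6f}")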

Results

Comparison in terms of training energy consumption (x-axis) and accuracy after the final incremental step (y-axis) across benchmarks and task sequence lengths.

Citing

If you use this repository in your research, please cite the following:

@misc{trinci2024greencontinuallearningreally,
      title={How green is continual learning, really? Analyzing the energy consumption in continual training of vision foundation models}, 
      author={Tomaso Trinci and Simone Magistri and Roberto Verdecchia and Andrew D. Bagdanov},
      year={2024},
      eprint={2409.18664},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2409.18664}, 
}
@article{zhou2024continual,
  title={Continual learning with pre-trained models: A survey},
  author={Zhou, Da-Wei and Sun, Hai-Long and Ning, Jingyi and Ye, Han-Jia and Zhan, De-Chuan},
  journal={arXiv preprint arXiv:2401.16386},
  year={2024}
}

License

This project is licensed under the MIT License. See the LICENSE file for more information.

Contact

For any questions or issues, please open an issue in this repository.
