New generation with new features and documentation

Released by @yoshitomo-matsubara on 06 Nov 06:24 · 140 commits to main since this release · commit 335d3c1

torchdistill v1.0.0 Release Notes

This major release supports PyTorch 2.0 and introduces many new features, documentation, and breaking changes.

PyYAML configurations and executable scripts with torchdistill <= v0.3.3 should be considered "legacy" and are no longer supported by torchdistill >= v1.0.0. New PyYAML configurations and executable scripts are provided for the major release.

This release adds support for Python 3.10 and 3.11 and drops support for Python 3.7.

Documentation

Dependencies & Instantiation

  • Add getattr constructor (PR #325)
  • Make package arg optional (PR #322)
  • Enable dynamic module import/get/call (PR #319)
  • Add a function to import dependencies e.g., to register modules (PR #265)
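The dynamic import/get/call idea (PR #319) and dependency import (PR #265) can be sketched with the standard library; the function names below are illustrative, not torchdistill's actual API:

```python
# Hypothetical sketch of dynamic module import/get/call (cf. PR #319, #265),
# not torchdistill's actual implementation.
import importlib


def import_dependency(module_name):
    """Import a module by dotted path, e.g. so its registration
    decorators run as a side effect."""
    return importlib.import_module(module_name)


def import_get(path):
    """Resolve a dotted path like 'math.sqrt' to the object it names."""
    module_path, _, attr_name = path.rpartition('.')
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)


def import_call(path, *args, **kwargs):
    """Import a callable by dotted path and invoke it."""
    return import_get(path)(*args, **kwargs)
```

For example, `import_call('math.sqrt', 16.0)` resolves `math.sqrt` at runtime and returns `4.0`.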

Module registry

  • Add *args (PR #345)
  • Fix default value-related issues (PR #327)
  • No longer use lowered keys (PR #326, #332)
  • Disable lowering by default (PR #323)
  • Rename type/name key (PR #312)
  • Rename registry dicts and arguments for registry key (PR #269)
  • Raise errors when requested module keys are not registered (PR #263)
  • Enable naming modules to be registered (PR #262)
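Taken together, the registry changes above mean modules can be registered under an explicit key, keys keep their original case, and unknown keys fail loudly. A minimal sketch of that behavior (names and signatures are assumptions, not torchdistill's exact API):

```python
# Minimal registry sketch: explicit registration keys (PR #262), no
# key lowering (PR #323, #326), and errors on unknown keys (PR #263).
MODEL_DICT = {}


def register_model(arg=None, key=None):
    def _register(cls):
        MODEL_DICT[key if key is not None else cls.__name__] = cls
        return cls
    # support both @register_model and @register_model(key='...')
    return _register(arg) if callable(arg) else _register


def get_model(key, *args, **kwargs):
    if key not in MODEL_DICT:
        raise KeyError(f'model key `{key}` is not registered')
    return MODEL_DICT[key](*args, **kwargs)


@register_model(key='TinyNet')
class TinyNet:
    def __init__(self, width=8):
        self.width = width
```

Since keys are no longer lowered, `get_model('tinynet')` raises a `KeyError` while `get_model('TinyNet')` succeeds.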

Distillation/Training boxes

  • Remove default forward_proc for transparency (PR #417)
  • Rename a forward_proc function (PR #414)
  • Simplify (D)DP wrapper init (PR #410)
  • Change the timing to print model setup info (PR #335)
  • Add an option to specify find_unused_parameters for DDP (PR #334)
  • Do not touch teacher model by default (PR #333)
  • Training box does not have to inherit nn.Module class (PR #317)
  • Add interfaces package to core (PR #310)
  • Update forward interfaces (PR #307, #308)
  • Rename post_process to post_epoch_process for consistency (PR #306)
  • Consider CosineAnnealingWarmRestarts in default post-epoch process functions (PR #305)
  • Make some common procedures in training box registrable/replaceable (PR #304)
  • Introduce {pre,post}-{epoch,forward} processes and registries (PR #274)
  • Rename post_forward functions (PR #272)
  • Make loss a kwarg (PR #273)
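The registrable pre/post-epoch processes (PR #274, #304) let users swap common training-box procedures without subclassing. A hypothetical sketch of the pattern (function and registry names are illustrative):

```python
# Sketch of replaceable epoch hooks (cf. PR #274, #304, #306): a training
# box looks its post-epoch process up in a registry; a pre-epoch registry
# would work the same way. Not torchdistill's actual implementation.
POST_EPOCH_PROC_DICT = {}


def register_post_epoch_proc(func):
    POST_EPOCH_PROC_DICT[func.__name__] = func
    return func


@register_post_epoch_proc
def default_post_epoch_process(box, **kwargs):
    # e.g. step an LR scheduler once per epoch
    box.epoch += 1


class TrainingBox:
    def __init__(self, post_epoch_proc='default_post_epoch_process'):
        self.epoch = 0
        self.post_epoch_process = POST_EPOCH_PROC_DICT[post_epoch_proc]

    def post_epoch(self, **kwargs):
        self.post_epoch_process(self, **kwargs)
```

Registering a differently named function and passing its name to the box replaces the default behavior without touching the box class.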

Forward hooks

  • Fix initialization issues in IO dict for SELF_MODULE_PATH (PR #328)

Dataset modules

  • Redesign split_dataset and remove unused functions (PR #360)
  • Update CRD dataset wrapper (PR #352)
  • Fix a bug (PR #351)
  • Add default args and kwargs (PR #347)
  • Add get_dataset (PR #324)
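The redesigned split_dataset (PR #360) can be illustrated with a simple ratio-based splitter; this is a hedged stand-in, not the function's real signature:

```python
# Hypothetical sketch of a ratio-based dataset splitter in the spirit of
# the split_dataset redesign (PR #360).
def split_dataset(dataset, ratios):
    total = sum(ratios)
    sizes = [len(dataset) * r // total for r in ratios]
    sizes[-1] = len(dataset) - sum(sizes[:-1])  # absorb rounding remainder
    splits, start = [], 0
    for size in sizes:
        splits.append(dataset[start:start + size])
        start += size
    return splits
```

For example, `split_dataset(list(range(10)), [8, 2])` yields an 8-element training split and a 2-element validation split.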

Loss modules

  • Fix a typo (PR #413, #415)
  • Add doc artifacts and an option to pass pre-instantiated loss module (PR #399)
  • Add DictLossWrapper (PR #337)
  • Rename an old function name (PR #309)
  • Rename single loss to middle-level loss (PR #300)
  • Explicitly define criterion wrapper (PR #298)
  • Change concepts of OrgLoss and org_term (PR #296)
  • Rename loss-related classes and functions (PR #294)
  • Add default forward process function and KDLoss back as a single loss (PR #275)
  • Remove org loss module and introduce self-module path (PR #271)
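The idea behind DictLossWrapper (PR #337) is to let a standard criterion work with models that return dicts. A minimal sketch, assuming a key-selection design (the class body and signature here are guesses, not the released code):

```python
# Sketch of a dict-aware loss wrapper in the spirit of DictLossWrapper
# (PR #337): when the model output is a dict, pick one entry before
# applying the wrapped criterion.
class DictLossWrapper:
    def __init__(self, criterion, key):
        self.criterion = criterion
        self.key = key

    def __call__(self, outputs, targets):
        if isinstance(outputs, dict):
            outputs = outputs[self.key]
        return self.criterion(outputs, targets)


# e.g. wrap a toy absolute-error criterion around the 'logits' entry
mae = DictLossWrapper(lambda o, t: abs(o - t), key='logits')
```

Calling `mae({'logits': 3.0}, 5.0)` unwraps the dict and returns `2.0`, while a plain (non-dict) output passes straight through.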

Model modules

  • Support parameter operations (Discussion #387, PR #388)
  • Replace pretrained with weights (PR #354)
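The pretrained-to-weights rename (PR #354) follows torchvision's deprecation of the boolean `pretrained` flag in favor of a `weights` argument. A pure-Python sketch of the migration shim pattern (`build_model` and `DEFAULT_WEIGHTS` are stand-ins, not torchdistill or torchvision names):

```python
import warnings

# Sketch of the `pretrained` -> `weights` migration pattern (cf. PR #354,
# #353). DEFAULT_WEIGHTS is an illustrative stand-in value.
DEFAULT_WEIGHTS = 'IMAGENET1K_V1'


def build_model(weights=None, pretrained=None):
    if pretrained is not None:
        warnings.warn('`pretrained` is deprecated; use `weights` instead',
                      DeprecationWarning)
        weights = DEFAULT_WEIGHTS if pretrained else None
    return {'weights': weights}
```

Callers passing the old flag still work but get a deprecation warning, while new code states the weights explicitly.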

Auxiliary model wrapper modules

  • Add find_unused_parameters arg (PR #340)
  • Rename special in configs to auxiliary_model_wrapper (PR #291)
  • Rename special module for clarity (PR #276)

Optimizer/Scheduler modules

  • Fix bugs around optimizer/scheduler (PR #358)
  • epoch arg is deprecated for some LR schedulers (PR #338)
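PR #338 tracks PyTorch's deprecation of the `epoch` argument to `scheduler.step()`: schedulers are now expected to be stepped without it. A toy scheduler sketch of that pattern (not a real PyTorch class):

```python
import warnings

# Toy sketch of the deprecated-`epoch` pattern noted in PR #338:
# call step() with no argument and let the scheduler count epochs itself.
class TinyScheduler:
    def __init__(self):
        self.last_epoch = -1

    def step(self, epoch=None):
        if epoch is not None:
            warnings.warn('passing `epoch` to step() is deprecated',
                          UserWarning)
            self.last_epoch = epoch
        else:
            self.last_epoch += 1
```

Two plain `step()` calls advance `last_epoch` from -1 to 1; passing `epoch` still works but warns.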

Examples

  • Revert legacy file paths to non-legacy ones (PR #419)
  • Update kwargs and scripts (PR #382)
  • Update yaml util and sample configs (CIFAR-10, CIFAR-100) for the next major release (PR #361)
  • Update sample script and configs (GLUE) for the next major release (PR #259)
  • Replace --log with --run_log (PR #350)
  • Use dst_ckpt when running with -test_only (PR #349)
  • Simplify the semantic segmentation script (PR #339)
  • Move hardcoded-torchvision-specific code to local custom package (PR #331)
  • Update world_size, cudnn configs, and checkpoint message (PR #330)
  • Rename log argument due to the (abstract) conflict with torchrun (PR #329)
  • Restructure examples and export some example-specific packages (PR #320)
  • Add an option to disable torch.backends.cudnn.benchmark (PR #316)
  • Support stage-wise loading/saving checkpoints (PR #315)
  • Support src_ckpt and dst_ckpt for initialization and saving checkpoints respectively (PR #314)
  • Use legacy configs and scripts tentatively (PR #292, #295)
  • Add legacy examples and configs (PR #289)
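The src_ckpt/dst_ckpt split (PR #314, #349) separates the checkpoint a run initializes from and the checkpoint it writes. A hedged argparse sketch of such an interface (flag spellings are illustrative, not the example scripts' exact arguments):

```python
import argparse

# Sketch of a src_ckpt/dst_ckpt CLI in the spirit of PR #314 and #349:
# one path for loading an initial checkpoint, another for saving, and a
# test-only mode that evaluates the dst checkpoint.
parser = argparse.ArgumentParser()
parser.add_argument('--src_ckpt', help='checkpoint to initialize from')
parser.add_argument('--dst_ckpt', help='checkpoint path to save to')
parser.add_argument('--test_only', action='store_true',
                    help='skip training and evaluate dst_ckpt')
args = parser.parse_args(['--src_ckpt', 'teacher.pt',
                          '--dst_ckpt', 'student.pt'])
```

Stage-wise loading/saving (PR #315) then amounts to giving each stage its own src/dst pair.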

Configs

  • Declare forward_proc explicitly (PR #416)
  • Add configs used in NLP-OSS 2023 paper (PR #407)
  • Fix value based on log (PR #284)
  • Update sample configs (ILSVRC 2012, COCO 2017, and PASCAL VOC 2012) for the next major release (PR #357)
  • Update official configs for the next major release (PR #355)
  • Merge single_/multi_stage directories (PR #346)
  • Rename variables (PR #344)
  • Rename "factor" to "weight" (PR #302)
  • Restructure criterion (PR #301)
  • Consistently use "params" to indicate learnable parameters, not hyperparameters (PR #297)
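The renames above can be illustrated with a config fragment written as a Python dict; the key names below are assumptions meant only to show the "weight" (formerly "factor") convention, not a verbatim torchdistill config:

```python
# Illustrative config fragment reflecting the renames above: each
# criterion term carries a `weight` (formerly `factor`), and `params`
# is reserved for learnable parameters, not hyperparameters.
criterion_config = {
    'key': 'WeightedSumLoss',
    'kwargs': {
        'sub_terms': {
            'kd': {
                'criterion': {'key': 'KDLoss', 'kwargs': {}},
                'weight': 1.0,
            },
        },
    },
}
```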

Misc.

  • Add Google Analytics ID (PR #406)
  • Add sitemap.xml (PR #405)
  • Update timm repo (PR #375)
  • Add acknowledgments (PR #369)
  • Update file paths (PR #356)
  • Fix a typo and replace pretrained with weights (PR #353)
  • Remove the dict option as it is not intuitive for building transform(s) (PR #303)
  • Temporarily remove registry test (PR #293)
  • Add an important notice (PR #286)
  • Add read permission for content, following the new template (PR #284)
  • Refactor (PRs #268, #270, #283, #343)
  • Update README (PRs #252, #290, #299, #341, #342, #348, #364, #400, #409, #418)
  • Update versions (PRs #251, #391, #420)

Workflows

  • Add a GitHub Action for deploying Sphinx documentation (PR #404)