
Releases: FluxML/Optimisers.jl

Optimisers v0.4.2 (tagged 11 Dec 16:08, commit 669798c)

Diff since v0.4.1

Merged pull requests:

Closed issues:

  • Optimiser state not moving to GPU (#179)
  • AdamW: epsilon and lambda swapped? (#197) (see the sketch after this list)
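
Issue #197 asked whether AdamW's epsilon and lambda were swapped in the positional argument order. One way to stay out of that question entirely is keyword construction, available since v0.3.2. A minimal sketch, with keyword names assumed to match the field names shown in the docstrings (eta, lambda):

    using Optimisers

    # Keywords avoid relying on the positional order of lambda and epsilon,
    # which is what #197 was about.
    rule = AdamW(eta = 1e-3, lambda = 1e-4)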

Optimisers v0.4.1 (tagged 08 Nov 21:23, commit 2639523)

Diff since v0.4.0

Merged pull requests:

Optimisers v0.4.0 (tagged 07 Nov 07:38, commit 38c9d62)

Diff since v0.3.4

Merged pull requests:

Closed issues:

  • Stable docs will 404 until a new version is tagged (#25)
  • Allow keyword arguments for optimisers (#74)
  • doc improvement: working with custom model types (#84)
  • Rename or outsource iswriteable (#99)
  • Split out the rules.jl as a sub-package (or a separate package) ? (#108)
  • Wrong model update for BatchNorm for some specific syntax (#123)
  • Use OptChain as an alias for OptimiserChain? (#138)
  • nothing does not correspond to updating the state with a zero gradient. (#140)
  • Utility for walking a tree (e.g. gradients) w.r.t. a model (#143)
  • Adam optimizer can produce NaNs with Float16 due to small epsilon (#167) (see the sketch after this list)
  • mark as public any non-exported but documented interface (#189)
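
On #167: Adam's default epsilon of 1e-8 is below the smallest positive Float16, so it rounds to zero in half precision and sqrt(v) + epsilon can vanish, producing NaN steps. A minimal sketch of the usual workaround, passing a representable epsilon (keyword names assumed from the docstrings):

    using Optimisers

    Float16(1e-8) == Float16(0)              # true: the default underflows in Float16
    rule = Adam(eta = 1e-3, epsilon = 1e-4)  # large enough to survive half precision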

Optimisers v0.3.4 (tagged 05 Nov 05:53, commit 0ae05d6)

Diff since v0.3.3

Merged pull requests:

Closed issues:

  • Design (#1)
  • convenience constructors (#11)
  • Restructure is not type stable but could be made stable? (#177)
  • AdamW optimizer implemented incorrectly - weight decay does not incorporate learning rate (#182) (see the note after this list)
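
For #182: in Loshchilov & Hutter's decoupled formulation, the weight-decay term is scaled by the learning rate along with the Adam step, i.e. (our notation, not the package's internals):

    \theta_{t+1} = \theta_t - \eta_t \left( \frac{\hat m_t}{\sqrt{\hat v_t} + \epsilon} + \lambda \theta_t \right)

As the issue title says, the report was that the \lambda \theta_t term was applied without the \eta_t factor.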

Optimisers v0.3.3 (tagged 09 Apr 02:27, commit c2ae321)

Diff since v0.3.2

Merged pull requests:

Closed issues:

  • Documenter CI is failing (#169)

Optimisers v0.3.2 (tagged 08 Feb 04:27, commit 1908a1c)

Diff since v0.3.1

Merged pull requests:

  • WeightDecay for L1 norm (#159) (@mcabbott)
  • Add all-keyword constructors, much like @kwdef (#160) (@mcabbott) (see the sketch after this list)
  • CompatHelper: add new compat entry for Statistics at version 1, (keep existing compat) (#164) (@github-actions[bot])
  • Don't load Yota at all (#166) (@mcabbott)
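
PR #160 means every rule can be constructed by field name rather than by position. A minimal sketch; keyword names are assumed to match the field names in each rule's docstring:

    using Optimisers

    Adam(0.01)                         # positional, as before
    Adam(eta = 0.01, epsilon = 1e-6)   # keyword form from #160
    # WeightDecay composes through OptimiserChain; #159 adds an L1-style
    # decay alongside the usual L2 penalty (see the PR for the exact API):
    OptimiserChain(WeightDecay(1e-4), Adam())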

Optimisers v0.3.1 (tagged 05 Sep 05:33, commit 1cd1e87)

Diff since v0.3.0

Closed issues:

  • Error in update! for Metal arrays and Adam optimiser (#150)

Merged pull requests:

Optimisers v0.3.0 (tagged 25 Aug 14:56, commit 6a4f948)

The major change is #151, which removes type parameterisation from the structs. This should not break straightforward user code, but may break loading via BSON etc. It also adds errors on negative learning rates (sketched below), and will in some cases change the default regulator from eps(Float32) to 1e-8.
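
A minimal sketch of the new constructor check; the exact error type thrown is not pinned down here:

    using Optimisers

    Adam(0.001)    # fine, as before
    Adam(-0.001)   # now errors instead of silently accepting a negative rate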

Diff since v0.2.20

Closed issues:

  • Adam(0) fails (#119)
  • Error in update! for Metal arrays and Adam optimiser (#150)

Merged pull requests:

Optimisers v0.2.20 (tagged 20 Aug 00:46, commit 322a6bb)

Diff since v0.2.19

Closed issues:

  • Implement Lion, up to 5x faster than Adam, and more accurate (#156) (see the sketch below)
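
Lion slots into the usual setup/update! workflow. A minimal sketch on a bare parameter array, with the gradient standing in for whatever your AD produces:

    using Optimisers

    x  = rand(Float32, 3)             # parameters; any array or nested model works
    st = Optimisers.setup(Lion(), x)  # Lion was added to close #156
    g  = ones(Float32, 3)             # stand-in gradient from your AD of choice
    st, x = Optimisers.update!(st, x, g)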

Merged pull requests:

Optimisers v0.2.19 (tagged 25 Jul 03:52, commit 6eaf26d)

Diff since v0.2.18

Closed issues:

  • OptimiserChain(..., ClipNorm) fails on GPU (#127) (see the sketch below)
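
The composition that failed on GPU in #127 is the standard way to clip gradients before the main rule. A minimal CPU sketch of that chain:

    using Optimisers

    rule = OptimiserChain(ClipNorm(1.0), Adam(1e-3))  # clip gradient norm, then Adam
    st   = Optimisers.setup(rule, rand(Float32, 3))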

Merged pull requests: