Add a backend optimizer for adafactor.
PiperOrigin-RevId: 580526748
tensorflower-gardener committed Nov 8, 2023
1 parent 6c36fbc commit 93d954a
Showing 1 changed file with 1 addition and 1 deletion: tf_keras/optimizers/adafactor.py
@@ -40,7 +40,7 @@ class Adafactor(optimizer.Optimizer):
The default argument setup is based on the original paper (see reference).
When gradients are of dimension > 2, Adafactor optimizer will delete the
last 2 dimensions separately in its accumulator variables.
Args:
learning_rate: Initial value for the learning rate:
either a floating point value,
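The docstring excerpt above describes Adafactor's factored accumulators: rather than storing a full second-moment estimate the size of the gradient, the optimizer keeps per-row and per-column statistics over the last two dimensions. As a rough illustration of that idea (a NumPy sketch, not the tf_keras implementation; the function name, `beta2`, and `eps` here are illustrative choices), the factored estimate for a 2-D gradient can be written as:

```python
import numpy as np

def factored_second_moment(G, r, c, beta2=0.999, eps=1e-30):
    """Hedged sketch of Adafactor-style second-moment factoring.

    Instead of an n x m accumulator for an n x m gradient G, keep a
    row vector r (length n) and a column vector c (length m), and
    reconstruct the estimate as a rank-1 approximation.
    """
    g2 = G * G + eps
    r = beta2 * r + (1 - beta2) * g2.sum(axis=1)   # per-row sums of squared grads
    c = beta2 * c + (1 - beta2) * g2.sum(axis=0)   # per-column sums
    v = np.outer(r, c) / r.sum()                   # rank-1 reconstruction
    return r, c, v

rng = np.random.default_rng(0)
G = rng.standard_normal((4, 3))
r, c = np.zeros(4), np.zeros(3)
r, c, v = factored_second_moment(G, r, c)
print(v.shape)  # (4, 3): full-size estimate from O(n + m) state
```

The memory saving is the point: the persistent state is `r` and `c` (n + m numbers) rather than the n * m entries a full accumulator would need.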
