Releases: rakutentech/stAdv
v0.2.1
Add the possibility to provide grad_op to the optimization routine stadv.optimization.lbfgs, preventing growth of the graph. Not providing grad_op to this optimization routine results in a memory leak when it is called repeatedly.
v0.2.0
Fix the calculation of flow_loss to match Eq. (4) from arXiv:1801.02612. Previously, the summation over p (looping over all pixels) was done inside the square root rather than outside.
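For reference, Eq. (4) of arXiv:1801.02612 (transcribed here) places the sum over all pixels p outside the square root, where Δu^(p) and Δv^(p) are the two flow components at pixel p and N(p) is its neighborhood:

```latex
\mathcal{L}_{\mathrm{flow}}(f)
  = \sum_{p}^{\text{all pixels}} \; \sum_{q \in \mathcal{N}(p)}
    \sqrt{\left\lVert \Delta u^{(p)} - \Delta u^{(q)} \right\rVert_2^2
        + \left\lVert \Delta v^{(p)} - \Delta v^{(q)} \right\rVert_2^2}
```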