
Training is not working. Loss gets worse #2

Open · kalle264 opened this issue Jun 17, 2021 · 1 comment

Comments

kalle264 commented Jun 17, 2021
Hi,
did you use your implementation successfully for training?

It works for me when I just calculate the loss on the predictions of a pretrained model, but when I try to train with it, the loss increases from batch to batch and then stays at the diagonal distance for every following epoch.

My code works just fine with a different loss.
Do you have any idea what could cause this problem?

N0vel (Owner) commented Jun 17, 2021

Hi,
Try tuning the learning rate and the parameters of the loss itself (it is a weighted sum of two components).
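
For illustration, here is a minimal sketch of that kind of tuning, assuming a PyTorch training loop. `WeightedSumLoss`, its `alpha` weight, and the MSE/L1 placeholder components are all hypothetical stand-ins for the repository's actual loss terms, not its real implementation:

```python
import torch
import torch.nn as nn

class WeightedSumLoss(nn.Module):
    """Hypothetical stand-in for a loss that is a weighted sum of two
    components; alpha sets their relative weight and is a tunable knob."""
    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = alpha
        # Placeholder components for the sketch; the real loss has its own two terms.
        self.term_a = nn.MSELoss()
        self.term_b = nn.L1Loss()

    def forward(self, pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        return self.term_a(pred, target) + self.alpha * self.term_b(pred, target)

model = nn.Linear(16, 2)                # toy model, just for illustration
criterion = WeightedSumLoss(alpha=0.5)  # sweep alpha to re-balance the two terms
# A lower learning rate is often the first thing to try when a loss diverges.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for step in range(100):
    x = torch.randn(8, 16)
    y = torch.randn(8, 2)
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
```

If the loss saturates at its maximum value (here, the diagonal distance), lowering the learning rate and re-balancing the two components are reasonable first experiments.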
