```python
def freeze_bn(module):
    if isinstance(module, nn.BatchNorm2d):
        module.eval()
    classname = module.__class__.__name__
    if classname.find('BatchNorm') != -1:
        for p in module.parameters():
            p.requires_grad = False
```
Since `module.eval()` already freezes BN, why do you additionally set `p.requires_grad=False`? Is there another module named `BatchNorm*`?
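For context, the two calls freeze different things, so a small standalone sketch (not taken from the repo) may help: `eval()` only stops the running-statistics updates; the affine parameters `weight`/`bias` still receive gradients unless `requires_grad` is turned off.

```python
import torch
import torch.nn as nn

# Minimal sketch: eval() alone does not stop gradients for BN's
# affine parameters (gamma/beta).
bn = nn.BatchNorm2d(3)
bn.eval()  # freezes running_mean/running_var updates only

x = torch.randn(2, 3, 4, 4)
bn(x).sum().backward()
print(bn.weight.grad is not None)  # gradients still flow in eval mode

# requires_grad=False is what actually stops an optimizer from
# updating gamma/beta during training.
for p in bn.parameters():
    p.requires_grad = False
```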
Thank you for your early reply. I have learned that turning off `requires_grad` can speed this module up, but at inference time we always wrap the forward pass in `with torch.no_grad():` to achieve the same thing, as in line 153 of eval.py. So does that mean turning off `requires_grad` in `freeze_bn` is for some situation during training, or is it just to make the code more robust?
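The key difference can be shown with a tiny hypothetical model (names are illustrative, not from the repo): during training you cannot wrap the whole forward pass in `torch.no_grad()`, because the unfrozen layers still need gradients, so `requires_grad=False` is the per-parameter way to freeze just the BN layers.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model: a frozen BN followed by a trainable conv.
model = nn.Sequential(nn.BatchNorm2d(3), nn.Conv2d(3, 3, 1))
bn, conv = model[0], model[1]

bn.eval()  # freeze running statistics
for p in bn.parameters():
    p.requires_grad = False  # freeze gamma/beta for the optimizer

x = torch.randn(2, 3, 4, 4)
# We cannot use torch.no_grad() here: the conv must still train.
model(x).sum().backward()

print(conv.weight.grad is not None)  # conv still gets gradients
print(bn.weight.grad is None)        # frozen BN does not
```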
I'm learning PyTorch with this code, so some questions may be naive; please forgive me.
in FCOS.Pytorch/model/fcos.py