Resetting Batch Norm statistics for pretrained ConvNeXt or any other model #1378
Unanswered
gadithyaraju21 asked this question in Q&A
Replies: 1 comment
@gadithyaraju21 ConvNeXt doesn't have running stats because it uses LayerNorm. You can iterate over the model modules and re-init the affine weight/bias manually if you want:

```python
for m in model.modules():
    if isinstance(m, nn.LayerNorm):  # or whatever norm instances are in the model
        nn.init.zeros_(m.bias)
        nn.init.ones_(m.weight)
```

For a BatchNorm network you can do something similar for weight/bias, and you can also reset the running stats:

```python
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        nn.init.zeros_(m.running_mean)
        nn.init.ones_(m.running_var)
        nn.init.zeros_(m.num_batches_tracked)
```
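As a side note, PyTorch's BatchNorm modules also expose a built-in `reset_running_stats()` method that does the same thing (zeros the running mean, ones the running variance, zeros the batch counter). A minimal self-contained sketch, using a small hypothetical model rather than a timm one:

```python
import torch
import torch.nn as nn

# Small illustrative model (not from the thread) with one BatchNorm layer
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.BatchNorm2d(8),
    nn.ReLU(),
)

# Pollute the running stats with a few forward passes in train mode
model.train()
for _ in range(3):
    model(torch.randn(4, 3, 16, 16))

# Reset running stats on every BatchNorm layer via the built-in method
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        m.reset_running_stats()

bn = model[1]
assert torch.all(bn.running_mean == 0)   # mean back to 0
assert torch.all(bn.running_var == 1)    # variance back to 1
assert int(bn.num_batches_tracked) == 0  # counter back to 0
```

This is equivalent to the manual `nn.init` loop above, just using the module's own helper.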
Hi @rwightman!
I was wondering how I can reset the mean and variance of all, or just a few, batch norm layers, preferably in ConvNeXt or any other network, to their defaults (0 mean and 1 variance). I see --bn-momentum and --bn-eps as part of the args in the train file, but it seems like they only work with EfficientNet architectures.
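For the "all or a few layers" part of the question, one option is to filter on module names via `named_modules()`. A hedged sketch, where the model and the `"stage2"` prefix are hypothetical placeholders for whatever layers you want to target:

```python
import torch
import torch.nn as nn

# Hypothetical two-stage model; the "stage1"/"stage2" names are illustrative only
model = nn.Sequential()
model.add_module("stage1", nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8)))
model.add_module("stage2", nn.Sequential(nn.Conv2d(8, 16, 3), nn.BatchNorm2d(16)))

# Dirty both BN layers' running stats with one forward pass in train mode
model.train()
model(torch.randn(2, 3, 16, 16))

# Reset running stats only in BatchNorm layers whose name matches a prefix
for name, m in model.named_modules():
    if isinstance(m, nn.BatchNorm2d) and name.startswith("stage2"):
        m.reset_running_stats()
```

After this, only the matching layers are back at 0 mean / 1 variance; the others keep their accumulated statistics.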