
Recommended way to freeze batch norm? #866

Answered by rwightman
alexander-soare asked this question in Q&A


@alexander-soare Generally, recursively walking the modules and replacing them with a helper like this one: https://detectron2.readthedocs.io/en/latest/_modules/detectron2/layers/batch_norm.html#FrozenBatchNorm2d.convert_frozen_batchnorm
...

The same way as is done for syncbatchnorm (https://github.com/rwightman/pytorch-image-models/blob/master/train.py#L396) and for my own split_bn helper (https://github.com/rwightman/pytorch-image-models/blob/54e90e82a5a6367d468e4f6dd5982715e4e20a72/timm/models/layers/split_batchnorm.py#L41).
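That built-in SyncBatchNorm conversion follows the same recursive walk-and-replace pattern; a quick sketch (using a timm resnet18 purely for illustration):

```python
import timm
import torch.nn as nn

model = timm.create_model("resnet18")
# torch's stock helper walks the module tree and swaps each
# BatchNorm*d layer for an nn.SyncBatchNorm, carrying over its
# parameters and running statistics.
model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
```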

In that way you can (sort of) track the layers and enable/disable the freeze at certain points in the model, actually being able to use the metadata for feature extraction to locate…
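As a concrete illustration, here is a minimal sketch of that recursive walk-and-replace pattern, modeled on the detectron2 helper linked above. The `FrozenBatchNorm2d` and `convert_frozen_batchnorm` names mirror detectron2's; the real implementation handles edge cases (e.g. norm layers without running stats) that this sketch glosses over:

```python
import torch
import torch.nn as nn


class FrozenBatchNorm2d(nn.Module):
    """BatchNorm2d with fixed affine parameters and running statistics."""

    def __init__(self, num_features, eps=1e-5):
        super().__init__()
        self.eps = eps
        # Buffers rather than Parameters: nothing here receives
        # gradients or is updated by running-stat tracking.
        self.register_buffer("weight", torch.ones(num_features))
        self.register_buffer("bias", torch.zeros(num_features))
        self.register_buffer("running_mean", torch.zeros(num_features))
        self.register_buffer("running_var", torch.ones(num_features))

    def forward(self, x):
        # Fold the frozen stats into a per-channel scale and bias.
        scale = self.weight * (self.running_var + self.eps).rsqrt()
        bias = self.bias - self.running_mean * scale
        return x * scale.reshape(1, -1, 1, 1) + bias.reshape(1, -1, 1, 1)


def convert_frozen_batchnorm(module):
    """Recursively replace BatchNorm2d/SyncBatchNorm with FrozenBatchNorm2d."""
    res = module
    if isinstance(module, (nn.BatchNorm2d, nn.SyncBatchNorm)):
        res = FrozenBatchNorm2d(module.num_features, module.eps)
        if module.affine:
            res.weight.data.copy_(module.weight.data)
            res.bias.data.copy_(module.bias.data)
        res.running_mean.data.copy_(module.running_mean.data)
        res.running_var.data.copy_(module.running_var.data)
    else:
        # Walk the children; replacing by name keeps the module
        # hierarchy (and thus feature-extraction metadata) intact.
        for name, child in module.named_children():
            new_child = convert_frozen_batchnorm(child)
            if new_child is not child:
                res.add_module(name, new_child)
    return res


if __name__ == "__main__":
    import timm  # any BatchNorm-bearing model works here
    model = convert_frozen_batchnorm(timm.create_model("resnet18"))
```

Because the conversion recurses over named children, threading the qualified module name through and matching it against a prefix is all it takes to enable or disable the freeze for specific subtrees of the model.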

