Conversation


@nukyan commented Jan 1, 2026

Unlike the other create_* functions, create_norm_layer can create normalization layers that take different parameters (such as BatchNorm2d and GroupNorm), which may trigger errors for missing required arguments or for redundant arguments being passed. So it is reasonable to delete the incompatible arguments before creating the layer.

In short, I want the following code to work correctly whether the target is batch normalization or group normalization.

create_norm_layer('GroupNorm', num_features=1, num_groups=2, num_channels=4)

Please note that this is a breaking change that compromises backward compatibility (the removal of num_features). If necessary, I can add it back. Furthermore, it does not robustly handle unusual scenarios, though it suffices for all normalization layers in PyTorch and timm.
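For reference, here is a minimal sketch of the kwarg-filtering idea described above. It is not the actual PR diff: the registry contents and the use of inspect.signature to drop unsupported arguments are illustrative assumptions about how such a factory could behave.

```python
import inspect
import torch.nn as nn

# Illustrative registry; the real factory resolves many more layer types.
_NORM_LAYERS = {
    'BatchNorm2d': nn.BatchNorm2d,
    'GroupNorm': nn.GroupNorm,
    'LayerNorm': nn.LayerNorm,
}

def create_norm_layer(name, **kwargs):
    norm_cls = _NORM_LAYERS[name]
    # Drop any kwargs the target class does not accept, so callers can pass
    # a superset of arguments (e.g. both num_features and num_groups).
    params = inspect.signature(norm_cls).parameters
    filtered = {k: v for k, v in kwargs.items() if k in params}
    return norm_cls(**filtered)

# The same call works whether the layer wants num_features (BatchNorm2d)
# or num_groups/num_channels (GroupNorm).
bn = create_norm_layer('BatchNorm2d', num_features=4, num_groups=2, num_channels=4)
gn = create_norm_layer('GroupNorm', num_features=4, num_groups=2, num_channels=4)
```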

@nukyan force-pushed the create_norm_layer branch from 44a1c28 to 1843cc1 on January 1, 2026 at 05:02
@rwightman (Collaborator) commented

@nukyan it is not remotely possible to break backwards compatibility like this. As designed, compatible kwargs are expected to be passed for a given norm layer. To change the norm layer for a model like efficientformer to GroupNorm, it would be best to pass a norm_layer with its arguments bound by partial.
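A short sketch of the partial-binding approach mentioned above; the group count and channel values are illustrative, and passing such a callable as a model's norm_layer is an assumption about the intended usage rather than a verified efficientformer example.

```python
from functools import partial
import torch.nn as nn

# Bind the GroupNorm-specific argument up front so the callable only needs a
# channel count, much like BatchNorm2d(num_features).
norm_layer = partial(nn.GroupNorm, 2)  # 2 groups; num_channels supplied per call

layer = norm_layer(4)  # equivalent to nn.GroupNorm(2, 4)
```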

@rwightman closed this Jan 1, 2026
