This code implements Fixup as an option for a standard Wide ResNet. When both BatchNorm and Fixup are enabled, Fixup initialization is applied while the standard structure of the residual block is kept.
Usage example:

```
python train.py --layers 40 --widen-factor 10 --batchnorm False --fixup True
```
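For illustration, here is a minimal sketch of what a Fixup residual block typically looks like in PyTorch; the names used here (`FixupBasicBlock`, `num_blocks`) are hypothetical and not necessarily those used in this code:

```python
import torch
import torch.nn as nn


class FixupBasicBlock(nn.Module):
    """Basic residual block using Fixup instead of BatchNorm (sketch).

    num_blocks is the total number of residual blocks L in the network;
    Fixup scales the first conv's init by L**(-1/(2m-2)) with m = 2 convs
    per branch, i.e. L**-0.5 here.
    """

    def __init__(self, in_planes, planes, stride=1, num_blocks=1):
        super().__init__()
        # Learnable scalar biases and a scalar multiplier replace
        # BatchNorm's affine parameters.
        self.bias1 = nn.Parameter(torch.zeros(1))
        self.conv1 = nn.Conv2d(in_planes, planes, 3, stride=stride,
                               padding=1, bias=False)
        self.bias2 = nn.Parameter(torch.zeros(1))
        self.bias3 = nn.Parameter(torch.zeros(1))
        self.conv2 = nn.Conv2d(planes, planes, 3, stride=1,
                               padding=1, bias=False)
        self.scale = nn.Parameter(torch.ones(1))
        self.bias4 = nn.Parameter(torch.zeros(1))
        self.relu = nn.ReLU(inplace=True)
        self.shortcut = None
        if stride != 1 or in_planes != planes:
            self.shortcut = nn.Conv2d(in_planes, planes, 1,
                                      stride=stride, bias=False)
        # Fixup initialization: He init on conv1 scaled down by L**-0.5,
        # conv2 zeroed so each residual branch outputs zero at init and
        # the block reduces to its shortcut.
        nn.init.kaiming_normal_(self.conv1.weight)
        with torch.no_grad():
            self.conv1.weight.mul_(num_blocks ** -0.5)
        nn.init.zeros_(self.conv2.weight)

    def forward(self, x):
        out = self.conv1(x + self.bias1)
        out = self.relu(out + self.bias2)
        out = self.conv2(out + self.bias3)
        out = out * self.scale + self.bias4
        identity = x if self.shortcut is None else self.shortcut(x + self.bias1)
        return self.relu(out + identity)
```

Following the Fixup paper, the first convolution's He initialization is rescaled by L^(-1/(2m-2)) (with m = 2 convolutions per branch, this is L^(-1/2)), the last convolution is zero-initialized, and the scalar biases and multiplier stand in for BatchNorm's affine parameters.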
References and credits:
- Wide Residual Networks by Sergey Zagoruyko and Nikos Komodakis
- Fixup Initialization: Residual Learning Without Normalization by Hongyi Zhang, Yann N. Dauphin, and Tengyu Ma
- The Fixup implementation was originally introduced by Andy Brock
- WRN code by xternalz