
Wide Residual Network with optional Fixup initialization

This code implements Fixup initialization as an option for a standard Wide ResNet. When both BatchNorm and Fixup are enabled simultaneously, Fixup initialization is combined with the standard structure of the residual block.
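The core idea of Fixup can be sketched independently of this repo's code. For a network with L residual branches of m layers each, Fixup rescales the standard (He) initialization of the first m-1 layers in each branch by L^(-1/(2m-2)) and zero-initializes the last layer, so every residual branch outputs zero at initialization. A minimal NumPy sketch (the helper names `fixup_scale` and `fixup_init_branch` are illustrative, not from this repository):

```python
import numpy as np

def fixup_scale(num_blocks: int, layers_per_block: int) -> float:
    """Fixup scaling factor L^(-1/(2m-2)) applied to the first m-1
    layers of each residual branch (illustrative helper)."""
    m = layers_per_block
    return num_blocks ** (-1.0 / (2 * m - 2))

def fixup_init_branch(shapes, num_blocks, seed=0):
    """Initialize the weights of one residual branch.

    shapes: list of (fan_in, fan_out) for each layer in the branch.
    All layers but the last get He init times the Fixup scale;
    the last layer is zero-initialized so the branch outputs zero.
    """
    rng = np.random.default_rng(seed)
    m = len(shapes)
    scale = fixup_scale(num_blocks, m)
    weights = []
    for i, (fan_in, fan_out) in enumerate(shapes):
        if i == m - 1:
            w = np.zeros((fan_in, fan_out))       # last layer: zeros
        else:
            std = np.sqrt(2.0 / fan_in) * scale   # He init * Fixup scale
            w = rng.normal(0.0, std, size=(fan_in, fan_out))
        weights.append(w)
    return weights
```

For example, a WRN with 16 two-layer residual blocks would scale the first layer of each branch by 16^(-1/2) = 0.25 relative to plain He initialization.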

Usage example:

```
python train.py --layers 40 --widen-factor 10 --batchnorm False --fixup True
```

Acknowledgment

Wide Residual Network by Sergey Zagoruyko and Nikos Komodakis

Fixup Initialization: Residual Learning Without Normalization by Hongyi Zhang, Yann N. Dauphin, Tengyu Ma

Fixup implementation was originally introduced by Andy Brock

WRN code by xternalz
