v0.4.1
What's Changed
- Added issue templates by @andreped in #59
- Fixed bug in AccumBatchNormalizer - identical results to Keras BN by @andreped in #61
- Docs: Added AccumBN example + docs README + minor fixes by @andreped in #62
- bump v0.4.1 by @andreped in #63
New API
You can now use gradient accumulation with the AccumBatchNormalization layer:
```python
from gradient_accumulator import GradientAccumulateModel, AccumBatchNormalization
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# define model and add accum BN layer
# (an input shape is given so that model.input is defined below)
model = Sequential()
model.add(Dense(32, activation="relu", input_shape=(16,)))
model.add(AccumBatchNormalization(accum_steps=8))
model.add(Dense(10))

# add gradient accumulation to the rest of the model
model = GradientAccumulateModel(accum_steps=8, inputs=model.input, outputs=model.output)
```
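To illustrate the idea behind accum_steps, here is a minimal framework-agnostic sketch in plain Python (not the library's implementation): per-batch gradients are summed over accum_steps mini-batches, and a single averaged update is applied, mimicking training with a batch size accum_steps times larger.

```python
def sgd_with_accumulation(weight, grads, accum_steps, lr):
    """Plain-SGD sketch: average gradients over accum_steps batches,
    then apply one weight update per accumulation window."""
    accumulated = 0.0
    for step, g in enumerate(grads, start=1):
        accumulated += g  # accumulate instead of updating every batch
        if step % accum_steps == 0:
            weight -= lr * (accumulated / accum_steps)  # averaged update
            accumulated = 0.0  # reset for the next window
    return weight
```

With accum_steps=1 this reduces to ordinary SGD; larger values trade update frequency for an effectively larger batch size at constant memory.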
More information on usage and related remarks can be found at gradientaccumulator.readthedocs.io
Full Changelog: v0.4.0...v0.4.1