
v0.4.1

Released by @andreped on 20 Apr 17:36 (commit 1e1746a)

What's Changed

New API

You can now combine gradient accumulation with batch normalization using the new AccumBatchNormalization layer:

from gradient_accumulator import GradientAccumulateModel, AccumBatchNormalization
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# define model and add accum BN layer
model = Sequential()
model.add(Dense(32, activation="relu", input_shape=(16,)))  # input shape is required so model.input is defined; (16,) is just an example
model.add(AccumBatchNormalization(accum_steps=8))
model.add(Dense(10))

# add gradient accumulation to the rest of the model
model = GradientAccumulateModel(accum_steps=8, inputs=model.input, outputs=model.output)
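
For completeness, the wrapped model trains like any other Keras model. Below is a minimal, hypothetical training sketch assuming the 16-feature input and 10-class integer labels from the example above; the dummy data, optimizer, and loss are illustrative and not part of the release:

import numpy as np
import tensorflow as tf

# compile the accumulate-wrapped model as usual
model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=1e-2),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)

# dummy data matching the example's input_shape=(16,) and Dense(10) output
x = np.random.rand(256, 16).astype("float32")
y = np.random.randint(0, 10, size=(256,))

# the effective batch size is batch_size * accum_steps (32 * 8 = 256 here)
model.fit(x, y, batch_size=32, epochs=2)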

More information about usage and remarks can be found at gradientaccumulator.readthedocs.io

Full Changelog: v0.4.0...v0.4.1