# Keras Gradient Accumulation


**This repo is outdated and is no longer maintained.**

## Install

```bash
pip install git+https://github.com/cyberzhg/keras-gradient-accumulation.git
```

## Usage

### Wrapper

Wrap an existing optimizer (by name or instance) to accumulate gradients over several batches before applying an update:

```python
from keras_gradient_accumulation import GradientAccumulation

optimizer = GradientAccumulation('adam', accumulation_steps=8)
```
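To see what the wrapper does conceptually: gradients are summed over `accumulation_steps` mini-batches, and the inner optimizer is applied once with their average, emulating a larger effective batch size. The toy sketch below (plain Python, hypothetical function name, scalar SGD instead of Adam) is only an illustration of that idea, not the library's implementation.

```python
def sgd_with_accumulation(grads, lr=0.1, accumulation_steps=4):
    """Toy scalar SGD with gradient accumulation.

    `grads` is a sequence of per-mini-batch gradients for a single
    scalar weight initialized at 0.0.
    """
    w = 0.0
    acc = 0.0
    for step, g in enumerate(grads, start=1):
        acc += g  # accumulate instead of updating immediately
        if step % accumulation_steps == 0:
            # one update with the mean of the accumulated gradients
            w -= lr * acc / accumulation_steps
            acc = 0.0  # reset the accumulator
    return w
```

With eight per-batch gradients and `accumulation_steps=4`, this performs exactly two weight updates, each using the mean of four gradients — the same trajectory plain SGD would take on two batches of four times the size.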

### Adam

A dedicated Adam variant with built-in accumulation is also provided:

```python
from keras_gradient_accumulation import AdamAccumulated

optimizer = AdamAccumulated(accumulation_steps=8)
```

## Known Issues

- Does not work correctly with batch normalization: the layer's moving statistics are still updated every mini-batch, so the large-batch equivalence breaks down.
- Not compatible with `OptimizerV2`, the optimizer base class used by `tf.keras` in TensorFlow 2.x.