# Distributed-ML

Implements an optimizer (Nesterov SGD) for training a CNN model on the CIFAR-10 dataset in the following settings:

- Shared-memory Hogwild! (directory: `hogwild`)
- Distributed Local-SGD (directory: `Local-SGD`)
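As a rough illustration of the Local-SGD idea (not this repository's code; all names and the toy 1-D loss are hypothetical): each worker takes several local gradient steps on its own data, and the models are periodically averaged across workers.

```python
# Sketch of Local-SGD on a toy 1-D problem (hypothetical example, not the
# repository's implementation). Each "worker" minimizes f_i(w) = (w - t_i)^2
# on its own target t_i; after a few local steps, parameters are averaged.

def local_sgd(targets, w0=0.0, lr=0.1, local_steps=5, rounds=20):
    """Run Local-SGD with one simulated worker per target value."""
    workers = [w0] * len(targets)
    for _ in range(rounds):
        for i, t in enumerate(targets):
            w = workers[i]
            for _ in range(local_steps):
                grad = 2.0 * (w - t)        # gradient of (w - t)^2
                w -= lr * grad              # local SGD step
            workers[i] = w
        avg = sum(workers) / len(workers)   # communication round: average models
        workers = [avg] * len(workers)
    return workers[0]

w = local_sgd([1.0, 3.0])
print(round(w, 3))  # converges to 2.0, the minimizer of the averaged loss
```

The averaging step is the only communication, which is why Local-SGD trades off communication cost against synchronization frequency via `local_steps`.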

The `optimizer-benchmarks` directory contains benchmarks for several first-order and second-order gradient-descent methods:

- SGD
- Momentum SGD
- Nesterov SGD
- Adagrad
- RMSProp
- Adam
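For reference, a minimal sketch of the Nesterov SGD update this project centers on (a hypothetical standalone example, not the repository's code), written in the common "lookahead gradient" form:

```python
# Nesterov momentum SGD sketch (hypothetical example):
#   v <- mu * v - lr * grad(w + mu * v)   # gradient at the lookahead point
#   w <- w + v
# Demonstrated on the 1-D quadratic f(w) = w^2, whose gradient is 2w.

def nesterov_sgd(grad, w, lr=0.1, mu=0.9, steps=100):
    v = 0.0
    for _ in range(steps):
        g = grad(w + mu * v)  # evaluate gradient where momentum will carry us
        v = mu * v - lr * g   # update velocity
        w = w + v             # apply velocity to parameters
    return w

w = nesterov_sgd(lambda w: 2.0 * w, w=5.0)
print(abs(w) < 1e-3)  # True: converges to the minimum at w = 0
```

Evaluating the gradient at the lookahead point `w + mu * v` rather than at `w` is what distinguishes Nesterov from plain momentum SGD.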