Distributed Optimization Methods for Machine Learning


Distributed-ML

This repository implements an optimizer (Nesterov SGD) for training a CNN model on the CIFAR-10 dataset in the following settings:

  • Shared-memory Hogwild! | Directory: hogwild
  • Distributed Local-SGD | Directory: Local-SGD
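The Local-SGD idea above can be sketched as follows: each worker runs several local Nesterov-SGD steps on its own copy of the parameters, and the copies are then averaged at a synchronization point. This is a minimal serial simulation on a 1-D quadratic, not the repository's actual training code; all function and variable names here are illustrative.

```python
def grad(x):
    # Gradient of the toy objective f(x) = (x - 3)^2 (illustrative stand-in
    # for a per-worker minibatch gradient).
    return 2.0 * (x - 3.0)

def local_nesterov_steps(x, v, steps, lr=0.1, momentum=0.9):
    # Nesterov SGD: evaluate the gradient at the look-ahead point x + momentum*v.
    for _ in range(steps):
        g = grad(x + momentum * v)
        v = momentum * v - lr * g
        x = x + v
    return x, v

def local_sgd(x0, workers=4, rounds=5, local_steps=10):
    # Simulate the workers serially; in a real distributed run each worker
    # would compute its local steps in parallel on its own data shard.
    xs = [x0] * workers
    vs = [0.0] * workers
    for _ in range(rounds):
        xs, vs = zip(*(local_nesterov_steps(x, v, local_steps)
                       for x, v in zip(xs, vs)))
        avg = sum(xs) / workers   # synchronization: average the model copies
        xs = [avg] * workers      # workers keep their own momentum buffers
    return avg
```

With these hyperparameters, `local_sgd(0.0)` converges to the minimizer at 3; since the simulation is deterministic, all worker copies coincide here, whereas in the real setting data partitioning makes the copies (and the averaging step) meaningfully different.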

Directory optimizer-benchmarks contains benchmarks for various first- and second-order gradient descent (GD) methods:

  • SGD
  • Momentum SGD
  • Nesterov SGD
  • Adagrad
  • RMSProp
  • Adam
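For reference, the single-step update rules of the benchmarked methods can be written out in plain Python on a scalar parameter. This is a hedged sketch using common-convention hyperparameter names (lr, mu, rho, b1, b2, eps), not necessarily those used in the repository's code.

```python
import math

def sgd(x, g, lr=0.1):
    # Vanilla SGD: step against the gradient.
    return x - lr * g

def momentum_sgd(x, v, g, lr=0.1, mu=0.9):
    # Heavy-ball momentum: accumulate a velocity and move along it.
    v = mu * v - lr * g
    return x + v, v

def nesterov_sgd(x, v, grad_fn, lr=0.1, mu=0.9):
    # Nesterov momentum: gradient is taken at the look-ahead point.
    v = mu * v - lr * grad_fn(x + mu * v)
    return x + v, v

def adagrad(x, s, g, lr=0.1, eps=1e-8):
    s = s + g * g                      # accumulate squared gradients
    return x - lr * g / (math.sqrt(s) + eps), s

def rmsprop(x, s, g, lr=0.01, rho=0.9, eps=1e-8):
    s = rho * s + (1 - rho) * g * g    # exponential moving average of g^2
    return x - lr * g / (math.sqrt(s) + eps), s

def adam(x, m, v, t, g, lr=0.01, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g          # first-moment estimate
    v = b2 * v + (1 - b2) * g * g      # second-moment estimate
    m_hat = m / (1 - b1 ** t)          # bias correction (t starts at 1)
    v_hat = v / (1 - b2 ** t)
    return x - lr * m_hat / (math.sqrt(v_hat) + eps), m, v
```

For example, iterating `sgd` on f(x) = x² with gradient 2x and lr = 0.1 shrinks the parameter by a factor of 0.8 per step; the adaptive methods instead rescale each step by a running statistic of past squared gradients.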