
Generic Feed Forward Neural Network

A Python library implementing generic feed-forward neural networks with a configurable number of hidden layers and a configurable number of neurons in each layer. Backpropagation is implemented from scratch to update the model's weight parameters, with support for Softmax, Sigmoid, and ReLU layers. The model is trained on the MNIST dataset, a large database of handwritten digits that is commonly used for training image processing systems. The project also studies the effect of various hyperparameters (learning rate, batch size, activation function, number of hidden layers, and number of neurons in each layer) on model performance.
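The repository's actual API is not shown in this README, so the following is only a minimal sketch of the idea (all names, such as `FeedForwardNet`, are hypothetical): a network built from a `layer_sizes` list, sigmoid hidden layers, a softmax output, and backpropagation with mini-batch gradient descent. A ReLU hidden layer would swap `sigmoid(z)` for `np.maximum(0, z)` and the derivative factor for `(z > 0)`.

```python
# Minimal sketch (not the repository's actual API) of a generic
# feed-forward network trained with backpropagation.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for stability
    return e / e.sum(axis=1, keepdims=True)

class FeedForwardNet:
    def __init__(self, layer_sizes, lr=0.1):
        # layer_sizes, e.g. [784, 128, 64, 10] for MNIST: any number of
        # hidden layers with any number of neurons in each.
        self.lr = lr
        rng = np.random.default_rng(0)
        self.W = [rng.normal(0, 0.1, (m, n))
                  for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.b = [np.zeros(n) for n in layer_sizes[1:]]

    def forward(self, X):
        # Cache every layer's activations for the backward pass.
        acts = [X]
        for i, (W, b) in enumerate(zip(self.W, self.b)):
            z = acts[-1] @ W + b
            acts.append(softmax(z) if i == len(self.W) - 1 else sigmoid(z))
        return acts

    def train_batch(self, X, Y):
        # Y is one-hot, shape (batch, n_classes).
        acts = self.forward(X)
        # Softmax + cross-entropy yields the simple output delta (y_hat - y).
        delta = (acts[-1] - Y) / len(X)
        for i in reversed(range(len(self.W))):
            gW = acts[i].T @ delta
            gb = delta.sum(axis=0)
            if i > 0:
                # Chain rule through the sigmoid: a * (1 - a).
                delta = (delta @ self.W[i].T) * acts[i] * (1 - acts[i])
            self.W[i] -= self.lr * gW
            self.b[i] -= self.lr * gb
```

Under these assumptions, training reduces to something like `net = FeedForwardNet([784, 128, 64, 10], lr=0.1)` followed by `net.train_batch(X, Y)` over shuffled mini-batches, so the hyperparameters studied above (learning rate, batch size, depth, width, activation) are all explicit knobs.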

Resources

Chapter 6, Neural Networks - A Classroom Approach by Satish Kumar

Additional References:

  1. Understanding backpropagation
  2. NNets and backpropagation

Alternative Approach (Using Computation Graph):

  1. Backpropagation and Neural Networks using Computation Graph (a toy sketch of this view follows below)
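As a rough illustration of the computation-graph view referenced above (not code from this repository): each operation becomes a node that records its inputs and a local backward rule, and backpropagation is just the chain rule applied in reverse topological order.

```python
# Toy reverse-mode sketch: nodes store inputs and a local backward rule.
class Node:
    def __init__(self, value, parents=(), backward_fn=None):
        self.value = value
        self.grad = 0.0
        self.parents = parents
        self.backward_fn = backward_fn  # distributes self.grad to parents

def mul(a, b):
    out = Node(a.value * b.value, (a, b))
    def backward():
        a.grad += out.grad * b.value  # d(ab)/da = b
        b.grad += out.grad * a.value  # d(ab)/db = a
    out.backward_fn = backward
    return out

def add(a, b):
    out = Node(a.value + b.value, (a, b))
    def backward():
        a.grad += out.grad  # d(a+b)/da = 1
        b.grad += out.grad  # d(a+b)/db = 1
    out.backward_fn = backward
    return out

def backward(root):
    # Build a topological order, then apply the chain rule in reverse.
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
    visit(root)
    root.grad = 1.0
    for n in reversed(order):
        if n.backward_fn:
            n.backward_fn()

# Example: f(w, x, b) = w*x + b, so df/dw = x, df/dx = w, df/db = 1.
w, x, b = Node(2.0), Node(3.0), Node(1.0)
f = add(mul(w, x), b)
backward(f)
print(w.grad, x.grad, b.grad)  # 3.0 2.0 1.0
```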

Extra topics studied for the assignment

  1. Understanding Overfitting and underfitting using a complete example
  2. Overfitting and Underfitting
  3. Regularization (a minimal weight-decay example follows below)
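As one concrete instance of the regularization topic listed above (a sketch, not part of the repository): adding an L2 penalty `lam/2 * ||W||^2` to the loss contributes `lam * W` to the gradient, shrinking weights at every step and discouraging overfitting.

```python
# Hypothetical L2-regularized gradient step (weight decay).
import numpy as np

def l2_regularized_step(W, grad_W, lr=0.1, lam=1e-4):
    # The L2 penalty lam/2 * ||W||^2 adds lam * W to the gradient.
    return W - lr * (grad_W + lam * W)

W = np.ones((3, 2))
print(l2_regularized_step(W, np.zeros((3, 2))))  # weights shrink slightly
```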
