This repository contains the implementation of the first homework of the Optimization for Data Science course @ UniPD, which studies the efficiency of block coordinate gradient descent methods.
Gradient descent is a method for finding the minimum of a function; during the course, several variants of it are presented.
In this analysis, these variants are compared by their convergence rate, both in time and in iterations.
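To illustrate the family of methods being compared, here is a minimal sketch of a cyclic block coordinate gradient descent update (not the notebook's actual implementation: the function `bcgd`, the step size, and the toy quadratic are all assumptions made for this example):

```python
import numpy as np

def bcgd(grad, x0, blocks, lr=0.1, n_iters=200):
    """Cyclic block coordinate gradient descent: at every iteration only one
    block of coordinates is updated, using that block of the gradient."""
    x = np.asarray(x0, dtype=float).copy()
    history = [x.copy()]
    for k in range(n_iters):
        block = blocks[k % len(blocks)]   # cycle through the blocks
        g = grad(x)                       # full gradient; only one block is used
        x[block] -= lr * g[block]         # partial (block) update
        history.append(x.copy())
    return x, history

# Toy example: minimize f(x) = 0.5 * x^T A x - b^T x
A = np.array([[3.0, 0.5],
              [0.5, 2.0]])
b = np.array([1.0, 1.0])
x_min, hist = bcgd(lambda x: A @ x - b, np.zeros(2), blocks=[[0], [1]])
```

Updating one block at a time makes each iteration cheaper than a full gradient step, which is exactly the time-versus-iterations trade-off studied in the homework.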
To implement the project, a Python Jupyter notebook was created, containing all the code needed to reproduce every reported result.
Everything can be found inside the project folder.
The essay contains the details of the homework, all the calculations needed to develop the project, and the reasoning behind the design choices.
To visualize what the algorithms actually do, the notebook includes code that generates GIFs showing every iteration of each method, and the provided HTML file is meant to help compare them side by side.
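As a rough sketch of how such per-iteration animations can be produced (assuming a 2-D iterate history like `hist` from the sketch above; the function name and plot details are hypothetical, and the notebook's actual plotting code may differ), matplotlib's animation tools can write a GIF directly:

```python
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.animation as animation

def iterates_to_gif(history, path="descent.gif", fps=5):
    """Draw the path of the iterates, one frame per iteration, and save a GIF."""
    xs = np.array(history)
    fig, ax = plt.subplots(figsize=(4, 4))
    ax.set_xlim(xs[:, 0].min() - 0.5, xs[:, 0].max() + 0.5)
    ax.set_ylim(xs[:, 1].min() - 0.5, xs[:, 1].max() + 0.5)
    (line,) = ax.plot([], [], "o-")

    def update(k):
        line.set_data(xs[: k + 1, 0], xs[: k + 1, 1])  # path up to iteration k
        ax.set_title(f"iteration {k}")
        return (line,)

    anim = animation.FuncAnimation(fig, update, frames=len(xs))
    anim.save(path, writer=animation.PillowWriter(fps=fps))
    plt.close(fig)

iterates_to_gif(hist[:50])  # reuse the iterate history from the previous sketch
```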
Following are some examples:
*(Animated GIFs: current label, current classification, and current loss at each iteration.)*
If you clone/download the repository and open the README file, you will be able to compare the animations, since they will be synchronized; this might not happen if you are viewing this on GitHub.