FO-PROX: first-order and proximal methods convergence comparison

Implementation of several first-order and proximal gradient methods, with a brief comparison of their convergence rates.



Implementation of the following first-order (FO) and proximal gradient methods (a minimal sketch of the difference between a plain gradient step and a proximal step follows the list):

  • Gradient descent (GD)
  • Gradient descent for strongly convex objectives (GDstr)
  • Accelerated gradient descent (AGD)
  • Accelerated gradient descent for strongly convex objectives (AGDstr)
  • Accelerated gradient descent with restart (AGDR)
  • Adaptive gradient method (AdaGrad)
  • Stochastic gradient descent (SGD)
  • Stochastic averaging gradient (SAG)
  • Stochastic gradient descent with variance reduction (SVR)
  • Subgradient method (SubG)
  • Iterative shrinkage-thresholding algorithm (ISTA)
  • Fast iterative shrinkage-thresholding algorithm, plain and with restart (FISTA and FISTAR)
  • Stochastic proximal gradient method (prox_sg)
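To give a flavor of the difference between the two families, here is a minimal, illustrative sketch (not the repository's code; function names are made up) of a plain gradient step versus a proximal gradient (ISTA) step for an objective f(x) + lam * ||x||_1 with smooth f:

import numpy as np

def gd_step(x, grad_f, alpha):
    # plain first-order step on a smooth objective: x+ = x - alpha * grad f(x)
    return x - alpha * grad_f(x)

def soft_threshold(v, tau):
    # proximal operator of tau * ||.||_1 (elementwise soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista_step(x, grad_f, alpha, lam):
    # proximal gradient step: gradient step on the smooth part f,
    # then the prox of the nonsmooth term lam * ||.||_1
    return soft_threshold(x - alpha * grad_f(x), alpha * lam)

FISTA applies the same prox step at an extrapolated (momentum) point, and the restart variants (AGDR, FISTAR) reset the momentum when the objective stops decreasing.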

For a more detailed explanation of the terms mentioned above, please read Exercise instructions.pdf; it also contains some theoretical questions, which are answered (handwritten) in Answers.pdf.

The project was part of an assignment for the EPFL course EE-556 Mathematics of Data: From Theory to Computation. The backbone of the code structure for running the experiments was provided by the professor and his assistants; my task was to implement the core of the optimization steps, i.e. the FO and proximal method algorithms, along with some minor components. Hence, every code file is a combination of my own code and the code given to us by the professor.

The following image shows an example of the output figures of run_comparison.py. All plots compare the convergence speed of the different optimization methods; the y-axis shows how far the model is from its optimum (the smaller the value, the closer to the optimum, the better).

[Figure: example convergence-comparison plots produced by run_comparison.py]
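Curves of this kind can be produced along these lines (a toy sketch with synthetic histories; the repository's plotting code may differ):

import numpy as np
import matplotlib.pyplot as plt

# synthetic suboptimality histories standing in for the recorded
# objective gaps f(x_k) - f* of two methods
k = np.arange(1, 201)
histories = {"GD, O(1/k)": 1.0 / k, "AGD, O(1/k^2)": 1.0 / k**2}

for name, subopt in histories.items():
    plt.semilogy(k, subopt, label=name)  # log scale on the y-axis
plt.xlabel("iteration k")
plt.ylabel("f(x_k) - f*")
plt.legend()
plt.show()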

The following image shows an example of the output figures of run_theoretical_vs_emprical_conv_rate.py. It compares the theoretical upper bound on an optimization method's convergence rate with the empirically observed rate.

[Figure: example theoretical vs. empirical convergence rate plots produced by run_theoretical_vs_emprical_conv_rate.py]
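For gradient descent with step size 1/L on an L-smooth convex objective, the classical upper bound is f(x_k) - f* <= L * ||x_0 - x*||^2 / (2k). A toy illustration of such a comparison (again illustrative, not the repository's experiment) on a simple quadratic:

import numpy as np
import matplotlib.pyplot as plt

# toy check of the GD bound f(x_k) - f* <= L * ||x0 - x*||^2 / (2k)
# on f(x) = 0.5 * x^T Q x, for which x* = 0 and f* = 0
rng = np.random.default_rng(0)
Q = np.diag(rng.uniform(0.1, 1.0, size=20))
L = float(np.max(np.diag(Q)))      # smoothness constant of f
x = rng.standard_normal(20)
dist0_sq = float(x @ x)            # ||x0 - x*||^2

vals = []
for _ in range(200):
    x = x - (1.0 / L) * (Q @ x)    # gradient step with step size 1/L
    vals.append(0.5 * x @ Q @ x)   # f(x_k) after k steps

k = np.arange(1, len(vals) + 1)
plt.semilogy(k, vals, label="empirical f(x_k) - f*")
plt.semilogy(k, L * dist0_sq / (2 * k), "--", label="theoretical bound")
plt.xlabel("iteration k")
plt.legend()
plt.show()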

Author

How to install and reproduce results

Download this repository as a zip file and extract it into a folder. The easiest way to run the code is to install the Anaconda 3 distribution (available for Windows, macOS and Linux). To do so, follow the guidelines on the official website (select the Python 3 version): https://www.anaconda.com/download/

The additional packages required are:

  • matplotlib
  • scikit-learn (provides the sklearn module)
  • scipy

To install them, run the following command in the Anaconda Prompt (anaconda3):

cd *THE_FOLDER_PATH_WHERE_YOU_DOWNLOADED_AND_EXTRACTED_THIS_REPOSITORY*

Then, for each of the packages listed above, run:

conda install *PACKAGE_NAME*
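For example, all three packages can typically be installed with a single command (the conda package that provides sklearn is called scikit-learn):

conda install matplotlib scipy scikit-learn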

Some packages might require more complex installation procedures. If the above command doesn't work for a package, search for "How to install PACKAGE_NAME on YOUR_MACHINE'S_OS" and follow one of the resulting guides.

Finally, run run_comparison.py and run_theoretical_vs_emprical_conv_rate.py:

python run_comparison.py
python run_theoretical_vs_emprical_conv_rate.py

Files description

  • code/figs/: folder containing the output plots produced by the code

  • code/dataset/: folder containing the dataset to train and test on

  • code/core: folder containing all the code to run the training and testing, including the optimization algorithms

  • run_comparison.py: main script to run the training and testing and produce the convergence-comparison plots

  • run_theoretical_vs_emprical_conv_rate.py: main script to run the training and testing and produce the theoretical vs. empirical convergence rate plots

  • Answers.pdf: PDF with the (handwritten) answers and plots for the course assignment

  • Exercise instructions.pdf: PDF with the questions of the course assignment

🛠 Skills

Python, Matplotlib, machine learning, proximal methods, theoretical analysis of convergence rates, implementation of optimization methods.

🔗 Links

Portfolio · LinkedIn