This repository contains materials on optimization methods: a homework assignment and a project, each exploring a different set of optimization techniques.
In the "homework" directory, you will find a homework assignment that addresses a semi-supervised learning classification problem. The primary objectives of this assignment are:
- Implement three optimization methods: Gradient Descent, Randomized Block Coordinate Gradient Descent (RBCGD), and Block Coordinate Gradient Descent with the Gauss-Southwell rule (BCGDGS).
- Evaluate and compare the performance of these optimization methods on a randomly chosen dataset.
- Apply the methods to a specifically chosen dataset to determine which works best in that scenario.
The assignment explores the effectiveness of these optimization methods in the context of semi-supervised learning classification.
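To make the three update rules concrete, here is a minimal sketch on a strongly convex quadratic. The objective, step sizes, and single-coordinate blocks are illustrative assumptions, not the assignment's actual semi-supervised loss or block structure:

```python
import numpy as np

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x (an assumption, not the
# assignment's semi-supervised loss); blocks here are single coordinates.
rng = np.random.default_rng(0)
n = 20
R = rng.standard_normal((n, n))
A = R @ R.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
L = np.linalg.eigvalsh(A).max()      # Lipschitz constant of the gradient

def grad(x):
    return A @ x - b

def gradient_descent(x, iters=500):
    # Full-gradient step with fixed step size 1/L
    for _ in range(iters):
        x = x - (1.0 / L) * grad(x)
    return x

def rbcgd(x, iters=2000):
    # Randomized BCGD: update one uniformly random coordinate per iteration,
    # using the block Lipschitz constant A[i, i] as the step-size denominator
    x = x.copy()
    for _ in range(iters):
        i = int(rng.integers(n))
        x[i] -= grad(x)[i] / A[i, i]
    return x

def bcgd_gs(x, iters=2000):
    # Gauss-Southwell rule: greedily update the coordinate whose gradient
    # entry has the largest magnitude
    x = x.copy()
    for _ in range(iters):
        g = grad(x)
        i = int(np.argmax(np.abs(g)))
        x[i] -= g[i] / A[i, i]
    return x

x_star = np.linalg.solve(A, b)       # closed-form minimizer for comparison
for method in (gradient_descent, rbcgd, bcgd_gs):
    err = np.linalg.norm(method(np.zeros(n)) - x_star)
    print(f"{method.__name__}: distance to optimum = {err:.2e}")
```

On this problem all three converge to the same minimizer; the comparison in the assignment concerns how quickly they do so, with Gauss-Southwell trading a more expensive block-selection step for greedier progress than the randomized rule.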
In the "project" directory, you will find a project focused on implementing the Frank-Wolfe optimization algorithm with the L1 ball as a feasible region. The project's main objectives are:
- Implement the Frank-Wolfe optimization algorithm.
- Use the L1 ball as the feasible region in the optimization process.
- Compare the performance of the Frank-Wolfe algorithm with the Stochastic Gradient Descent (SGD) method.
The project aims to demonstrate the capabilities of the Frank-Wolfe optimization method and assess its performance in comparison to SGD.
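A key property of Frank-Wolfe on the L1 ball is that the linear minimization oracle has a closed form: the minimizing vertex is a signed, scaled coordinate vector at the largest-magnitude gradient entry. The least-squares objective, radius, and step-size schedule below are illustrative assumptions, not the project's actual setup:

```python
import numpy as np

# Illustrative least-squares objective f(x) = 0.5 * ||A x - y||^2
# (an assumption; the project's actual loss may differ)
rng = np.random.default_rng(1)
m, n = 50, 30
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

def lmo_l1(g, radius=1.0):
    # Linear minimization oracle over the L1 ball of the given radius:
    # argmin_{||s||_1 <= radius} <g, s> = -radius * sign(g_i) * e_i,
    # where i maximizes |g_i|
    i = int(np.argmax(np.abs(g)))
    s = np.zeros_like(g)
    s[i] = -radius * np.sign(g[i])
    return s

def frank_wolfe(radius=1.0, iters=200):
    x = np.zeros(n)                    # feasible starting point
    for k in range(iters):
        g = A.T @ (A @ x - y)          # gradient of the least-squares loss
        s = lmo_l1(g, radius)          # best vertex of the L1 ball
        gamma = 2.0 / (k + 2)          # standard diminishing step size
        x = x + gamma * (s - x)        # convex combination stays feasible
    return x

x = frank_wolfe()
print("||x||_1 =", np.abs(x).sum())   # never exceeds the radius
```

Because each iterate is a convex combination of points in the L1 ball, Frank-Wolfe is projection-free and its iterates are sparse (at most one new nonzero coordinate per step), which is the contrast the project draws against projected or unconstrained SGD.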