Optimisation-for-Machine-Learning

This repository contains the labs and assignments completed as part of the coursework for Optimisation for Machine Learning (CSL4010), taught by Prof. Md Abu Talhamainuddin Ansary.

The labs covered a range of topics in optimization for machine learning, including linear and non-linear optimization, convex optimization, and integer programming. A brief overview of each topic follows, with short illustrative code sketches after the list:

  • Linear Programming Problem (LPP): A linear program minimizes or maximizes a linear objective function subject to linear equality and inequality constraints. It is the workhorse formulation for resource-allocation problems and can be solved efficiently, e.g. by the simplex or interior-point methods (a minimal solver sketch appears after this list).
  • Network Flow: Network-flow problems optimize the movement of goods, services, or information through a network of nodes and capacitated edges, as in transportation and logistics systems. Classic variants such as maximum flow and minimum-cost flow can be formulated as linear programs (see the max-flow sketch below).
  • Quadratic Problems: Quadratic programs minimize or maximize a quadratic objective function subject to linear equality and inequality constraints; when the objective is convex, they can be solved reliably to global optimality (a small example follows this list).
  • Steepest Descent Method: Steepest descent is a first-order algorithm for minimizing a differentiable function: each iteration steps in the direction of the negative gradient, the direction of locally fastest decrease (sketched in code below).
  • Dual Problems: Every optimization problem (the primal) has an associated dual problem, obtained by exchanging the roles of its objective and constraints. The dual optimal value bounds the primal optimal value (weak duality), and for feasible linear programs the two coincide (strong duality); the duality sketch after this list verifies this numerically.
  • SVM Hyperplanes: Support Vector Machines (SVMs) are supervised learning models for classification and regression. Training a linear SVM amounts to a convex optimization problem: find the hyperplane that separates the classes with maximum margin (see the SVM sketch below).
  • Descent Methods: Descent methods are a family of iterative algorithms that move toward a minimum of a differentiable function along descent directions, i.e. directions in which the function value decreases. Steepest descent and Newton's method are the standard examples.
  • Stochastic Gradient: Stochastic gradient descent (SGD) is a first-order method that, instead of the full gradient, steps along the negative gradient estimated from a single sample or small mini-batch at each iteration. This makes each step cheap and the method well suited to large datasets (a toy least-squares sketch follows this list).
  • Newton Method: Newton's method is a second-order algorithm for minimizing a twice-differentiable function: each iteration steps along the Newton direction, obtained by solving H(x) d = -∇f(x), where H(x) is the Hessian. Near a minimizer it converges much faster than first-order methods (illustrated in a sketch below).
  • Non-Linear Optimization Problem: Non-linear programs minimize or maximize a non-linear objective function subject to non-linear equality or inequality constraints. Unlike LPs, they may have multiple local optima, so solvers typically guarantee only local solutions (a constrained example appears after this list).
  • Integer Programming Problems: Integer programs restrict some or all decision variables to integer values, on top of a linear or non-linear objective and constraints. The integrality restriction makes them substantially harder than their continuous relaxations, and they are typically solved by branch-and-bound methods (a mixed-integer sketch follows the list).
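
As a concrete LPP example, here is a minimal sketch using scipy.optimize.linprog; the coefficients are an assumed toy problem, not data from the labs:

```python
# Toy LPP (assumed data): maximize 3x + 2y subject to
# x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# linprog minimizes, so we negate the objective.
from scipy.optimize import linprog

c = [-3, -2]                      # negated objective coefficients
A_ub = [[1, 1], [1, 3]]           # inequality constraint matrix
b_ub = [4, 6]                     # inequality right-hand sides

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)            # optimal point (4, 0) with value 12
```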
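
For network flow, the following sketch solves a maximum-flow problem with networkx on an assumed five-edge toy network:

```python
# Toy max-flow network (assumed data): push as much flow as possible
# from source 's' to sink 't' without exceeding any edge capacity.
import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=3)
G.add_edge("s", "b", capacity=2)
G.add_edge("a", "t", capacity=2)
G.add_edge("a", "b", capacity=1)
G.add_edge("b", "t", capacity=3)

flow_value, flow_dict = nx.maximum_flow(G, "s", "t")
print(flow_value)   # 5: s->a->t carries 2, s->a->b->t carries 1, s->b->t carries 2
```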
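
For quadratic problems, here is a small sketch using scipy.optimize.minimize with the SLSQP solver; the objective and constraint are an assumed toy instance:

```python
# Toy QP (assumed data): minimize f(x, y) = x^2 + y^2 subject to
# x + y >= 1, i.e. the point of the half-plane closest to the origin.
import numpy as np
from scipy.optimize import minimize

objective = lambda v: v[0] ** 2 + v[1] ** 2
constraint = {"type": "ineq", "fun": lambda v: v[0] + v[1] - 1}  # g(v) >= 0

res = minimize(objective, x0=np.zeros(2), constraints=[constraint], method="SLSQP")
print(res.x)   # approximately [0.5, 0.5]
```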
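
Steepest descent is easy to write by hand; this sketch minimizes an assumed toy quadratic with a fixed step size:

```python
# Steepest descent on f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
# A, b, the step size, and the iteration count are all assumed toy values.
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)
step = 0.1                                # fixed step size
for _ in range(200):
    grad = A @ x - b                      # gradient of the quadratic
    x = x - step * grad                   # move along the negative gradient

print(x, np.linalg.solve(A, b))           # iterate vs. exact minimizer [0.2, 0.4]
```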
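
The duality sketch below solves an assumed toy LP and its dual with linprog and checks that the two optimal values agree, as strong duality predicts:

```python
# Primal (assumed data):  min c^T x  s.t.  A x >= b, x >= 0
# Dual:                   max b^T y  s.t.  A^T y <= c, y >= 0
import numpy as np
from scipy.optimize import linprog

c = np.array([2.0, 3.0])
A = np.array([[1.0, 1.0], [1.0, 2.0]])
b = np.array([3.0, 4.0])

# linprog uses A_ub x <= b_ub, so A x >= b becomes -A x <= -b.
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2)

# Maximizing b^T y is minimizing -b^T y.
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2)

print(primal.fun, -dual.fun)   # both equal 7 for this instance
```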
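
For SVM hyperplanes, this sketch fits a linear-kernel SVC from scikit-learn on an assumed four-point toy dataset and reads off the separating hyperplane w^T x + b = 0:

```python
# Toy 2-class dataset (assumed data); the learned hyperplane is
# exposed via coef_ (normal vector w) and intercept_ (offset b).
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0], [4.0, 4.0]])
y = np.array([0, 0, 1, 1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)
print(clf.coef_, clf.intercept_)      # hyperplane normal w and offset b
print(clf.predict([[0.5, 0.5]]))      # classify a new point
```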
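
The stochastic gradient sketch below fits a least-squares model on assumed synthetic data, using the gradient of one randomly sampled point per step:

```python
# SGD for least squares (assumed synthetic data): each step uses the
# gradient of a single squared residual (x_i^T w - y_i)^2 rather than
# the full-dataset gradient.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

w = np.zeros(2)
lr = 0.05                                   # learning rate (assumed)
for _ in range(2000):
    i = rng.integers(len(X))                # pick one sample at random
    grad = 2 * (X[i] @ w - y[i]) * X[i]     # per-sample gradient
    w -= lr * grad

print(w)   # close to w_true = [2, -1]
```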
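
Newton's method is sketched below on an assumed smooth test function; note that each iteration solves a linear system with the Hessian rather than stepping along the raw gradient:

```python
# Newton's method on f(x, y) = (x - 1)^4 + y^2 (assumed test function):
# solve H(v) d = -grad f(v) and step to v + d.
import numpy as np

def grad(v):
    x, y = v
    return np.array([4 * (x - 1) ** 3, 2 * y])

def hess(v):
    x, y = v
    return np.array([[12 * (x - 1) ** 2, 0.0], [0.0, 2.0]])

v = np.array([3.0, 2.0])
for _ in range(25):
    d = np.linalg.solve(hess(v), -grad(v))   # Newton direction
    v = v + d

print(v)   # converges to the minimizer (1, 0)
```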
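
For non-linear optimization, this sketch minimizes the Rosenbrock function over the unit disc using scipy's NonlinearConstraint with the trust-constr solver; the problem instance is assumed for illustration:

```python
# Non-linear objective with a non-linear constraint (assumed instance):
# minimize the Rosenbrock function subject to x^2 + y^2 <= 1.
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
unit_disc = NonlinearConstraint(lambda v: v[0] ** 2 + v[1] ** 2, -np.inf, 1.0)

res = minimize(rosenbrock, np.zeros(2), method="trust-constr",
               constraints=[unit_disc])
print(res.x, res.fun)   # constrained optimum on the disc boundary
```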
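
Finally, an integer-programming sketch using scipy.optimize.milp (available in SciPy 1.9 and later) on an assumed toy knapsack-style instance:

```python
# Toy integer program (assumed data): maximize 5x + 4y subject to
# 6x + 4y <= 24, x + 2y <= 6, with x, y non-negative integers.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([-5.0, -4.0])                       # milp minimizes, so negate
A = np.array([[6.0, 4.0], [1.0, 2.0]])
cons = LinearConstraint(A, -np.inf, np.array([24.0, 6.0]))

res = milp(c, constraints=cons,
           integrality=np.ones(2),               # 1 marks a variable as integer
           bounds=Bounds(0, np.inf))             # x, y >= 0
print(res.x, -res.fun)                           # integer optimum (4, 0), value 20
```

Note the gap with the continuous relaxation, whose optimum is 21 at (3, 1.5); rounding a relaxed solution is not, in general, optimal or even feasible, which is why branch-and-bound is needed.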