About

Experiments combining interior-point and augmented Lagrangian solvers with cone constraints.

Non-convex solver

This is an infeasible-start, primal-dual augmented-Lagrangian interior-point solver for non-convex optimization problems. The solver implements key algorithms from Ipopt to handle inequality constraints, while equality constraints are handled with a novel primal-dual augmented Lagrangian. Conic constraints, including second-order and positive semidefinite cones, are coming soon.

Problems of the following form can be solved:

minimize        f(x)
   x
subject to      cI(x) >= 0
                cE(x)  = 0
                cA(x)  = 0
                xL <= x <= xU
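
The description above combines a log-barrier treatment of inequalities and bounds with an augmented Lagrangian for equalities. The merit function below is a rough sketch of that combination, given for illustration only; the names (merit, cI, cE) are hypothetical and the repository's exact formulation may differ.

using LinearAlgebra

# Sketch: barrier + augmented Lagrangian merit function for the problem class
# above. f, cI, cE are the problem functions; λ are equality multipliers,
# μ is the barrier parameter, ρ the penalty. Hypothetical names throughout.
function merit(f, cI, cE, xL, xU, x, λ, μ, ρ)
    φ = f(x)
    φ -= μ * sum(log, cI(x); init = 0.0)                     # barrier for cI(x) >= 0
    φ -= μ * sum(log, (x .- xL)[isfinite.(xL)]; init = 0.0)  # barrier for finite lower bounds
    φ -= μ * sum(log, (xU .- x)[isfinite.(xU)]; init = 0.0)  # barrier for finite upper bounds
    φ += dot(λ, cE(x)) + 0.5 * ρ * sum(abs2, cE(x); init = 0.0)  # augmented Lagrangian for cE(x) = 0
    return φ
end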

Features

Augmented Lagrangian:

  • penalty feedback update
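
The penalty feedback update is only named above; a common scheme, shown here as a sketch with hypothetical names (not necessarily the rule used in this repository), increases the penalty when the constraint violation does not decrease sufficiently and otherwise applies a first-order multiplier update.

using LinearAlgebra

# Sketch of a penalty feedback rule: raise ρ if the violation did not shrink
# enough relative to the previous iterate, otherwise update the multipliers λ.
function penalty_feedback!(λ, ρ, c_prev, c_new; θ = 0.25, γ = 10.0)
    if norm(c_new) > θ * norm(c_prev)
        ρ *= γ                # insufficient decrease: increase the penalty
    else
        λ .+= ρ .* c_new      # sufficient decrease: first-order multiplier update
    end
    return λ, ρ
end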

Cones:

  • second-order cone
  • positive semidefinite cone
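
For reference, the second-order cone is {(t, x) : ||x|| <= t}, and the Euclidean projection onto it has a simple closed form. The sketch below is purely illustrative (hypothetical function name), not the repository's API.

using LinearAlgebra

# Euclidean projection of (t, x) onto the second-order cone {(t, x) : norm(x) <= t}.
function project_soc(t::Real, x::AbstractVector)
    r = norm(x)
    if r <= t
        return t, x                 # already in the cone
    elseif r <= -t
        return zero(t), zero(x)     # in the polar cone: project to the origin
    else
        s = 0.5 * (1 + t / r)
        return s * r, s * x         # project onto the cone boundary
    end
end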

The following algorithms from Ipopt are implemented:

  • barrier update (eq. 7)
  • fraction to boundary (eqs. 8, 15); see the sketch after this list
  • α, α_z update (eq. 14)
  • z reset (eq. 16)
  • smaller, symmetric linear solve (eq. 13)
  • line-search filter (Algorithm A)
  • filter update (eq. 22)
  • α_min (eq. 23)
  • second-order corrections
  • inertia correction
  • acceleration heuristics (sec. 3.2)
    • Case 1
    • Case 2: watchdog
  • feasibility restoration phase
  • KKT error reduction
  • relaxed bounds (eq. 35): implemented for restoration mode
  • primal initialization (sec. 3.6)
  • equality constraint multiplier initialization (eq. 36)
  • single bounds damping: implemented for restoration mode
  • automatic scaling of problem statement
  • small search directions
  • iterative refinement on fullspace system
    • tolerance
  • round-off error acceptance criteria relaxation
  • MA57
    • default pivot tolerance
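
As an example of one of these rules, the fraction-to-the-boundary step length (Ipopt eqs. 8 and 15) chooses the largest step in (0, 1] that keeps strictly positive quantities (slacks, bound distances, bound multipliers) positive. Below is a minimal sketch with hypothetical names, not the repository's internal routine.

# Largest α in (0, 1] such that x + α*dx >= (1 - τ)*x, for elementwise-positive x.
function fraction_to_boundary(x::AbstractVector, dx::AbstractVector, τ::Real)
    α = 1.0
    for i in eachindex(x, dx)
        if dx[i] < 0
            α = min(α, -τ * x[i] / dx[i])
        end
    end
    return α
end

# In Ipopt, τ = max(τ_min, 1 - μ) for the current barrier parameter μ (eq. 8).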

TODO:

  • replace MA57 solver
  • restoration mode cleanup
    • indices/views
    • single bound damping
  • iterative refinement rejection
  • restoration-free version
  • Quasi-Newton
    • BFGS
    • L-BFGS
    • block-wise BFGS
  • Schur-complement linear-system solve

Installation

git clone https://github.com/thowell/non-convex_solver

Request an HSL MA57 license and install HSL.jl.
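
After cloning, a typical way to set up the dependencies from the Julia REPL is sketched below; this assumes the repository provides a Project.toml, so adjust the path and steps as needed.

using Pkg
Pkg.activate("non-convex_solver")   # path to the cloned repository
Pkg.instantiate()                   # install the project's dependencies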

Example

Wachter problem

minimize        x1
x1,x2,x3
subject to      x1^2 - x2 - 1.0 = 0
                x1   - x3 - 0.5 = 0
                x2, x3 >= 0

Implementation

# problem dimensions
n = 3 # decision variables
m = 2 # constraints

# initial guess
x0 = [-2.0, 3.0, 1.0]

# bounds
xL = -Inf*ones(n) # lower bounds
xL[2] = 0.0
xL[3] = 0.0
xU = Inf*ones(n) # upper bounds

# objective
f_func(x) = x[1]

# constraints
c_func(x) = [x[1]^2 - x[2] - 1.0;
             x[1]   - x[3] - 0.5]

# model
model = Model(n, m, xL, xU, f_func, c_func)

# options
opts = Options{Float64}()

# solver
s = NCSolver(x0, model, opts=opts)

# solve
solve!(s)

# solution
x = get_solution(s) # x* = [1.0, 0.0, 0.5]
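
As a quick check, the returned point satisfies the equality constraints and the bounds (using the functions and bounds defined above):

c_func(x)       # ≈ [0.0, 0.0]
all(x .>= xL)   # true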