[Ipopt] Ipopt for Convex Problems

Bailin Deng bailin.deng at epfl.ch
Mon May 12 16:45:17 EDT 2014


Hi Andreas,

Thanks for your explanation. Could you explain in more detail why Ipopt cannot find KKT multipliers? I have doubts about this, for the following reasons:

1) Since this is a convex problem, we can use Slater's condition as a constraint qualification, so LICQ is not necessary.

2) Your paper "On the Implementation of an Interior-Point Filter Line-Search Algorithm for Large-Scale Nonlinear Programming" mentions that Ipopt transforms inequality constraints into equality constraints by introducing slack variables subject to bound constraints. For our constraint (1), the new equality constraint is

    x_1^2 + x_2^2 - x_3^2 + s = 0,

where s is a nonnegative slack variable. When computing the gradients of the transformed constraints with respect to all variables {x, s}, these gradients are linearly independent, since each cone constraint gets its own slack variable.
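
For concreteness, the gradient I have in mind here (my own computation, not taken from your paper) is, with respect to (x_1, x_2, x_3, s),

    \nabla ( x_1^2 + x_2^2 - x_3^2 + s ) = ( 2 x_1, 2 x_2, -2 x_3, 1 ),

which can never be the zero vector, because of the constant 1 in the slack component.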

Thank you for your help.

Best,
Bailin
________________________________________
From: ipopt-bounces at list.coin-or.org [ipopt-bounces at list.coin-or.org] on behalf of Andreas Waechter [awaechter.iems at gmail.com]
Sent: Monday, May 12, 2014 3:52 PM
To: ipopt at list.coin-or.org
Subject: Re: [Ipopt] Ipopt for Convex Problems

Bailin,

The problem with your constraint (1) is that its gradient is zero if all
of the variables are zero at the optimal solution. In that case there
are no KKT multipliers for the optimal solution (the linear independence
constraint qualification fails), and Ipopt, which is looking for those
multipliers, will probably fail or at least perform poorly. This is why
there are specialized methods for these second-order cone constraints.
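
To make this concrete, take an illustrative objective that is not from the
original message, say f(x) = x_1 + 2 x_3, minimized subject to (1) and (2).
The unique minimizer is x* = (0, 0, 0), and there the gradient of (1)
vanishes:

    \nabla ( x_1^2 + x_2^2 - x_3^2 ) = ( 2 x_1, 2 x_2, -2 x_3 ) = ( 0, 0, 0 )   at x = x*.

Writing x_3 >= 0 as d_2(x) = -x_3 <= 0 with gradient (0, 0, -1), the
stationarity condition

    \nabla f(x*) + \lambda_1 \nabla d_1(x*) + \lambda_2 \nabla d_2(x*)
      = ( 1, 0, 2 ) + \lambda_1 ( 0, 0, 0 ) + \lambda_2 ( 0, 0, -1 ) = 0

cannot be satisfied, because the first component would require 1 = 0. So no
multipliers exist at the minimizer, even though the feasible set itself is
convex.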

Andreas Waechter

Associate Professor
Department of Industrial Engineering and Management Sciences
McCormick School of Engineering
Northwestern University
Evanston, IL 60208
USA
http://users.iems.northwestern.edu/~andreasw/

On 05/11/2014 06:51 AM, Bailin Deng wrote:
> Hi all,
>
> A few years back, there was a post on the mailing list (http://list.coin-or.org/pipermail/ipopt/2009-November/001733.html) saying that Ipopt can find the globally optimal solution for the following problem if f(x) and d(x) are convex and c(x) is linear:
>
> min  f(x)
> s.t. c(x) = 0
>        d(x) <= 0
>
>
> Now if d(x) is not convex, but d(x) <= 0 defines a convex feasible set, this is still a convex optimization problem. Can Ipopt still find the global minimum? One example is given by the following constraints:
>
> x_1^2 + x_2^2 - x_3^2 <= 0,     (1)
> x_3 >= 0.                       (2)
>
> This defines a quadratic cone, which is convex. But the function x_1^2 + x_2^2 - x_3^2 itself is not convex.
>
>
> Interestingly, we can also write these constraints as
>
> \sqrt{ x_1^2 + x_2^2 } - x_3 <=0.
>
> In this case, the function \sqrt{ x_1^2 + x_2^2 } - x_3 is convex, but it is not differentiable at points where x_1 = x_2 = 0. So to use it in Ipopt, we would want to write the constraints in the form of Equations (1) and (2). But then it is unclear to me whether Ipopt will still find the global minimum. Thanks a lot for your help!
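
For reference, here is a minimal sketch, not part of the original messages, of how the problem quoted above could be passed to Ipopt in the smooth but nonconvex form (1)-(2). It uses the Pyomo modeling interface; the objective, variable bounds and starting point are illustrative assumptions rather than anything from this thread.

    # Minimal sketch (illustrative, not from the original thread): feed the
    # cone-constrained problem to Ipopt via Pyomo, using form (1)-(2).
    from pyomo.environ import (ConcreteModel, Var, Objective, Constraint,
                               SolverFactory, NonNegativeReals, minimize)

    m = ConcreteModel()
    m.x1 = Var(initialize=1.0)
    m.x2 = Var(initialize=1.0)
    m.x3 = Var(within=NonNegativeReals, initialize=2.0)  # constraint (2)

    # Illustrative convex objective; replace with the actual f(x).
    m.obj = Objective(expr=(m.x1 - 1.0)**2 + (m.x2 - 1.0)**2 + m.x3,
                      sense=minimize)

    # Constraint (1): smooth but nonconvex description of the quadratic cone.
    m.cone = Constraint(expr=m.x1**2 + m.x2**2 - m.x3**2 <= 0)

    SolverFactory('ipopt').solve(m, tee=True)

With this particular objective the minimizer lies away from the cone's apex, so the degenerate point x = 0 discussed above does not come into play; an objective whose minimizer is at the apex would run into the missing-multiplier issue.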
>
>
> Best,
> Bailin
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/ipopt

_______________________________________________
Ipopt mailing list
Ipopt at list.coin-or.org
http://list.coin-or.org/mailman/listinfo/ipopt


