[Coin-ipopt] IPOPT remains dual infeasible

Andreas Waechter andreasw at watson.ibm.com
Mon Feb 12 19:41:38 EST 2007


Hi Peter,

Many thanks for resending your message after the Ipopt mailing list 
"outage"...

Yes, your suspicion is correct:

The reason why you see such large values of the dual infeasibility is 
most probably that you are using the quasi-Newton option.

Here is why I think this is happening:

When second derivatives are used, we are truly applying Newton's 
method to the primal-dual first-order optimality conditions (or at least a 
perturbed version thereof).  Here, the constraint multipliers enter the 
KKT matrix (the one that is factorized to compute the step) only through 
the Hessian of the Lagrangian function.
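
Schematically (ignoring the terms coming from the bound constraints and 
their multipliers, so take this only as a sketch), the linear system that 
is factorized in each iteration is

\[
\begin{bmatrix} W_k & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} d_x \\ d_y \end{bmatrix}
= - \begin{bmatrix} \nabla f(x_k) + A_k^T y_k \\ c(x_k) \end{bmatrix},
\qquad
W_k = \nabla^2 f(x_k) + \sum_i y_{k,i}\, \nabla^2 c_i(x_k),
\]

where A_k is the constraint Jacobian.  The multipliers y_k appear only 
inside W_k.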

Now, when we use the (limited memory) BFGS method to approximate the 
Hessian of the Lagrangian function, the multipliers have only a small 
effect on the matrix, and there is no reason they would converge quickly 
to the optimal values.  Well, that is not a 100% mathematical explanation, 
but I think this is essentially what is going on.

I have noticed this before, too, and I have tried some tricks to improve 
the situation, but so far they are all just heuristics.  The main thing I 
tried is to recompute the constraint multipliers in each iteration as 
least-squares estimates (minimizing the dual infeasibility) instead of 
relying on the step given by the KKT system (which is what we usually 
use).  Since this requires an extra factorization of the KKT system (with 
the identity instead of the Hessian), it is somewhat expensive, and I 
added a heuristic so that it is only done when the constraint violation 
is small.
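
In formulas (again only a sketch, leaving out the bound terms): instead 
of taking the Newton step in y, one uses

\[
y^{LS} = \arg\min_y \; \bigl\| \nabla f(x_k) + A_k^T y \bigr\|_2^2 ,
\]

which is obtained by factorizing the KKT matrix with the identity in 
place of W_k and solving

\[
\begin{bmatrix} I & A_k^T \\ A_k & 0 \end{bmatrix}
\begin{bmatrix} w \\ y^{LS} \end{bmatrix}
= \begin{bmatrix} -\nabla f(x_k) \\ 0 \end{bmatrix} .
\]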

The options that regulate this behavior are:

---------- 8< --------------------------

recalc_y                      ("no")
    Tells the algorithm to recalculate the equality and inequality multipliers
    as least square estimates.
      This asks the algorithm to recompute the multipliers, whenever the
      current infeasibility is less than recalc_y_feas_tol. Choosing yes might
      be helpful in the quasi-Newton option.  However, each recalculation
      requires an extra factorization of the linear system.  If a limited
      memory quasi-Newton option is chosen, this is used by default.
    Possible values:
     - no                      [use the Newton step to update the multipliers]
     - yes                     [use least-square multiplier estimates]

recalc_y_feas_tol                      0 <  (      1e-06) <  +inf
    Feasibility threshold for recomputation of multipliers.
      If recalc_y is chosen and the current infeasibility is less than this
      value, then the multipliers are recomputed.

---------- 8< --------------------------

Usually, the default for the "recalc_y" option is "no", but when you use 
the limited-memory BFGS option, it is chosen to be "yes" (unless you 
override the default).  If "recalc_y" is "yes", then the constraint 
multipliers ("y") are recomputed whenever the current constraint 
violation is less than recalc_y_feas_tol.
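
For completeness, here is a minimal sketch of how these options can be 
set through the C++ interface (the same settings can of course go into an 
ipopt.opt file); "mynlp" stands for your TNLP object:

---------- 8< --------------------------

#include "IpIpoptApplication.hpp"

using namespace Ipopt;

// mynlp is assumed to be a SmartPtr<TNLP> implementing the problem.
SmartPtr<IpoptApplication> app = new IpoptApplication();

// Limited-memory BFGS approximation of the Hessian, as in Peter's setup.
app->Options()->SetStringValue("hessian_approximation", "limited-memory");
app->Options()->SetIntegerValue("limited_memory_max_history", 5);

// Explicitly request least-squares multiplier recomputation and set the
// feasibility threshold below which it is triggered.
app->Options()->SetStringValue("recalc_y", "yes");
app->Options()->SetNumericValue("recalc_y_feas_tol", 1e-6);

app->Initialize();
ApplicationReturnStatus status = app->OptimizeTNLP(mynlp);

---------- 8< --------------------------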

In your example, it seems that they are recomputed in every iteration 
(since inf_pr is always less than recalc_y_feas_tol), and here it doesn't 
seem to help :(


I guess your approach of changing the termination criterion is a good one. 
But instead of just relaxing the dual infeasibility tolerance, we might 
want to try a more elaborate heuristic that monitors the progress in the 
objective function, the change in the primal iterates, and maybe the step 
sizes ("alpha").

At the moment, I probably won't have the time to look into actually 
changing the code.  If you want to try something yourself, the file to 
look at would be Ipopt/src/Algorithm/IpOptErrorConvCheck.cpp (the 
CheckConvergence method).
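
Just to sketch the kind of check I have in mind (the names and thresholds 
below are made up for illustration and are not the actual CheckConvergence 
code):

---------- 8< --------------------------

#include <cmath>

// Hypothetical helper illustrating the heuristic; not part of the Ipopt code.
struct IterateInfo {
  double obj;       // current objective value
  double prev_obj;  // objective value at the previous iterate
  double dx_norm;   // norm of the last primal step ||d||
  double alpha;     // last primal step size
  double inf_pr;    // primal infeasibility
  double inf_du;    // dual infeasibility
};

// Accept the iterate if it is primal feasible and either the dual
// infeasibility is small, or progress has essentially stalled (tiny
// relative change in the objective and tiny primal steps).
bool heuristic_converged(const IterateInfo& it,
                         double feas_tol  = 1e-8,
                         double dual_tol  = 1e-4,
                         double stall_tol = 1e-10)
{
  if (it.inf_pr > feas_tol) return false;   // must be primal feasible
  if (it.inf_du <= dual_tol) return true;   // usual dual infeasibility test

  double obj_change = std::fabs(it.obj - it.prev_obj) / (1.0 + std::fabs(it.obj));
  double step       = it.alpha * it.dx_norm;

  return (obj_change <= stall_tol && step <= stall_tol);
}

---------- 8< --------------------------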

I hope this helps at least a little bit...

Cheers,

Andreas

PS: MATLAB interface, cool!


On Mon, 12 Feb 2007, Peter Carbonetto wrote:

> THE SCENARIO: I have an optimization problem with about 60 variables
> (later this will grow), a nonlinear objective, bound constraints (all
> the variables must lie in the positive quadrant) and a sparse set of
> linear constraints.
>
> [My problem consists of solving inference in a Bayesian model using a
> variational approximation.]
>
> THE GOOD NEWS: IPOPT converges to a very reasonable solution after about 80 
> iterations, and does so using a limited-memory quasi-Newton approximation to 
> the Hessian (limited_memory_max_history = 5) because I haven't bothered to 
> compute the second derivatives. I know the solution is reasonable because 
> there is an existing algorithm tailored to my problem that comes up with the 
> same answer.
>
> THE BAD NEWS: IPOPT fails to converge to a point that is dual
> feasible, or takes an extremely long time to do so. The output from
> IPOPT looks something like this:
>
> iter    objective    inf_pr   inf_du lg(mu)  ||d||
>   0  9.4872744e+01 2.22e-16 4.89e+02   0.0 0.00e+00
>   1  5.4935737e+01 2.22e-16 7.25e+00  -4.0 2.80e+01
>   2  5.3405603e+01 1.11e-16 5.21e+00  -1.3 4.36e-02
>   3  4.6288683e+01 0.00e+00 2.84e+00  -0.8 3.29e-01
>
>                        . . .
>
>  85  2.6698305e+01 2.22e-16 5.84e-02  -2.6 3.35e-02
>  86  2.6698300e+01 2.22e-16 2.42e-02  -2.6 5.99e-02
>  87  2.6658051e+01 2.22e-16 1.91e+00  -3.9 8.45e-03
>  88  2.6669530e+01 2.22e-16 2.75e-01  -3.1 1.11e-02
>
> Notice that the dual variables have failed to converge to the feasible
> region. Currently, I work around this problem by setting a high
> tolerance for dual infeasibility (dual_inf_tol = 0.9).
>
> I'm wondering if anyone may have some deeper understanding of
> primal-dual methods than myself and may be able to explain why this is
> going on. I suspect that the problem may have something to do with the 
> quasi-Newton approximation, but I can't be sure since I haven't coded
> the exact Hessian.
>
> [By the way, Andreas, I've done all this using a MATLAB interface I've
> developed. There were a few significant obstacles in getting IPOPT to
> interface with MATLAB, but at this point it is working quite
> nicely. More on that in the not-too-distant future.]
>
> Thank you,
>
> Peter Carbonetto
> Dept. of Computer Science
> University of British Columbia
> _______________________________________________
> Coin-ipopt mailing list
> Coin-ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/coin-ipopt
>


