[Ipopt] Modification of the example hs071_c: case without constraints

tmtuan at laas.fr
Tue Mar 24 18:30:47 EDT 2009


Hi all again,

Thinking about a simple example of unconstrained optimization with Ipopt,
I modified the example hs071_c.c by setting m=0, nele_jac=0, and
nele_hess=0. I also removed the /* set the values of the constraint
bounds */ section and stubbed out the constraint and Hessian callbacks as
follows:

Bool eval_g(Index n, Number *x, Bool new_x,
            Index m, Number *g, UserDataPtr user_data)
{
  return TRUE;   /* m = 0: no constraints to evaluate */
}

Bool eval_jac_g(Index n, Number *x, Bool new_x,
                Index m, Index nele_jac, Index *iRow, Index *jCol,
                Number *values, UserDataPtr user_data)
{
  return TRUE;   /* no Jacobian entries (nele_jac = 0) */
}

Bool eval_h(Index n, Number *x, Bool new_x, Number obj_factor,
            Index m, Number *lambda, Bool new_lambda,
            Index nele_hess, Index *iRow, Index *jCol,
            Number *values, UserDataPtr user_data)
{
  return FALSE;  /* no exact Hessian: the "hessian_approximation"
                    option is set instead */
}
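For completeness, the matching problem creation would then look roughly like the sketch below (based on the standard hs071_c.c setup and the IpStdCInterface.h prototypes; with m = 0 the constraint bounds g_L/g_U can be passed as NULL, and since eval_h returns FALSE the quasi-Newton approximation has to be enabled explicitly):

```c
/* Sketch of the modified problem setup, following hs071_c.c.
   With m = 0 there are no constraints, so the constraint bounds
   are NULL and nele_jac = nele_hess = 0. */
Index n = 4;                              /* number of variables */
Index m = 0;                              /* no constraints */
Number x_L[4] = {1.0, 1.0, 1.0, 1.0};     /* lower bounds */
Number x_U[4] = {5.0, 5.0, 5.0, 5.0};     /* upper bounds */

IpoptProblem nlp = CreateIpoptProblem(n, x_L, x_U, m, NULL, NULL,
                                      0, 0, 0,  /* nele_jac, nele_hess,
                                                   C-style indexing */
                                      &eval_f, &eval_g,
                                      &eval_grad_f, &eval_jac_g, &eval_h);

/* Since eval_h returns FALSE, Ipopt must be told to approximate
   the Hessian itself: */
AddIpoptStrOption(nlp, "hessian_approximation", "limited-memory");
```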

In this case, the obvious optimum is x1 = x2 = x3 = x4 = 1, which gives 4
as the optimal value of the objective function.

However, I got this result:

=====
...
 533  4.0498668e+00 0.00e+00 5.13e+09  -1.7 1.39e+14 -20.0 2.81e-14 2.81e-16w  1
 534  4.0693547e+00 0.00e+00 2.04e+08  -1.7 4.51e+23 -20.0 1.68e-21 4.16e-23f 14
Warning: Cutting back alpha due to evaluation error
...
Restoration phase is called at point that is almost feasible,
  with constraint violation 0.000000e+00. Abort.

Number of Iterations....: 534

                                   (scaled)                 (unscaled)
Objective...............:   4.0693546701813865e+00    4.0693546701813865e+00
Dual infeasibility......:   2.0382053748458204e+08    2.0382053748458204e+08
Constraint violation....:   0.0000000000000000e+00    0.0000000000000000e+00
Complementarity.........:   2.0581959617183902e+06    2.0581959617183902e+06
Overall NLP error.......:   7.9999998816327331e+02    2.0382053748458204e+08


Number of objective function evaluations             = 4138
Number of objective gradient evaluations             = 535
Number of equality constraint evaluations            = 0
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 0
Total CPU secs in IPOPT (w/o function evaluations)   =      0.968
Total CPU secs in NLP function evaluations           =      0.007

EXIT: Restoration Failed!
=====

After 534 iterations, the objective value was close to 4, but the run
aborted in the restoration phase and no explicit optimal values x1..x4
were returned.

When I changed the initial point to x = (1, 2, 2, 1), I got this:

=====
  84  5.3833428e+00 0.00e+00 4.25e+00  -1.3 6.72e+05 -20.0 2.22e-08 2.40e-08f  8
  85  4.9866527e+00 0.00e+00 8.89e-01  -1.3 9.29e-01 -20.0 1.00e+00 6.54e-02f  1
ERROR: Problem in step computation, but emergency mode cannot be activated.

Number of Iterations....: 85

                                   (scaled)                 (unscaled)
Objective...............:   4.9866526748311717e+00    4.9866526748311717e+00
Dual infeasibility......:   8.8935433599997316e-01    8.8935433599997316e-01
Constraint violation....:   0.0000000000000000e+00    0.0000000000000000e+00
Complementarity.........:   4.5557198993150250e-01    4.5557198993150250e-01
Overall NLP error.......:   8.8935433599997316e-01    8.8935433599997316e-01


Number of objective function evaluations             = 394
Number of objective gradient evaluations             = 86
Number of equality constraint evaluations            = 0
Number of inequality constraint evaluations          = 0
Number of equality constraint Jacobian evaluations   = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations             = 0
Total CPU secs in IPOPT (w/o function evaluations)   =      0.198
Total CPU secs in NLP function evaluations           =      0.001

EXIT: Error in step computation (regularization becomes too large?)!
=====

After 85 iterations, the program stopped at an objective value of about 5,
again without converging.

So, could you please explain how to overcome this simple problem? How does
one "design" an unconstrained optimization program that works with Ipopt?

Thanks a lot,
Minh

More information about the Ipopt mailing list