[Ipopt] number of calls to objectives and constraints and/or their gradients
Andreas Waechter
andreasw at watson.ibm.com
Wed Nov 3 16:36:19 EDT 2010
Hi Drosos,
I believe it may happen that the constraint Jacobian is requested
without Ipopt asking for the gradient of the objective function; this
can occur during the restoration phase. I haven't checked this, however,
and you could simply find out yourself by adding print statements in your
eval_grad_f and eval_jac_g methods. If so, there is not really an easy
way to know in advance whether only one set of derivatives will be
requested.
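For example, a line like the following at the top of each method would show
the call order in your output (just a sketch; the signature is the standard
C++ TNLP one, and the rest of the method is your existing code):

    bool eval_grad_f(Index n, const Number* x, bool new_x, Number* grad_f) {
      std::cout << "eval_grad_f called, new_x = " << new_x << std::endl;
      // ... existing gradient computation ...
      return true;
    }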
But if I understand you correctly, your code is set up so that computing
only the objective gradient takes about the same time as computing both
the objective gradient and the constraint Jacobian. In this case, you can
simply compute both whenever either method is called, cache the result for
the other method, and return it without recomputation if that method is
called later. The new_x flag probably helps here, since it tells you when
to invalidate your cache. Note that the new_x flag can also be set to true
for other evaluation methods (e.g., eval_f) if such a method is the first
one called at a new x, so you need to make sure you invalidate your cache
whenever any evaluation method is called with new_x=true, not only
eval_grad_f or eval_jac_g.
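A minimal sketch of such a caching scheme with the C++ TNLP interface (the
class and member names and the helper computeBothDerivatives are made up for
illustration, and the other required TNLP methods are omitted):

    #include "IpTNLP.hpp"
    #include <algorithm>
    #include <vector>

    using namespace Ipopt;

    class MyNlp : public TNLP {
    public:
      // ... get_nlp_info, get_bounds_info, eval_g, etc. omitted ...

      bool eval_f(Index n, const Number* x, bool new_x, Number& obj_value) {
        if (new_x) derivs_valid_ = false;  // a new point invalidates the cache
                                           // (do the same in eval_g)
        // ... compute obj_value ...
        return true;
      }

      bool eval_grad_f(Index n, const Number* x, bool new_x, Number* grad_f) {
        if (new_x || !derivs_valid_) computeBothDerivatives(n, x);
        std::copy(grad_f_cache_.begin(), grad_f_cache_.end(), grad_f);
        return true;
      }

      bool eval_jac_g(Index n, const Number* x, bool new_x, Index m,
                      Index nele_jac, Index* iRow, Index* jCol, Number* values) {
        if (values == NULL) {
          // ... fill iRow/jCol with the sparsity pattern ...
          return true;
        }
        if (new_x || !derivs_valid_) computeBothDerivatives(n, x);
        std::copy(jac_g_cache_.begin(), jac_g_cache_.end(), values);
        return true;
      }

    private:
      // One combined computation (e.g. your PARDISO-based solve) fills
      // both caches at once.
      void computeBothDerivatives(Index n, const Number* x) {
        // ... compute grad_f_cache_ (length n) and jac_g_cache_
        //     (length nele_jac) ...
        derivs_valid_ = true;
      }

      bool derivs_valid_ = false;
      std::vector<Number> grad_f_cache_;
      std::vector<Number> jac_g_cache_;
    };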
I hope this helps,
Andreas
On Wed, 3 Nov 2010, Drosos Kourounis wrote:
> Dear Andreas,
> I would like to know whether, during constrained optimization with a
> large number of nonlinear constraints, IPOPT may request the gradient of
> the objective only, or the gradient of the constraints only. I hope this
> never happens and that IPOPT always requires the gradients of both the
> objective and all constraints simultaneously. I understand that the
> values (eval_f, eval_g) may be requested separately. What about the
> gradients, though?
>
> If it may happen that eval_grad_f is called without eval_jac_g being
> called at the same time, or the other way around, is there any flag to
> prevent that from happening, and what would the implications be if such
> a flag were enabled?
>
> I am asking because, when using PARDISO, I can obtain the gradients of
> all the constraints and of the objective simultaneously at practically
> the same computational cost as the gradient of the objective alone.
>
> Best wishes,
> Drosos.
>