[Ipopt] Hessian, gradient, and objective function have common terms
Filip Jorissen
filip.jorissen at kuleuven.be
Tue Nov 27 17:31:26 EST 2018
This suggestion makes sense to me, although it may cause additional overhead in case Ipopt wants to evaluate, e.g., only the Jacobian. That could, however, be indicated by Ipopt through some flag.
Another option would be to set up your functions so that they internally cache the computed common terms and communicate them to each other somehow. But that sounds cumbersome.
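The caching option described above can be sketched generically. The class and method names below are hypothetical, not part of any Ipopt API; the idea is simply to key the cache on the iterate so the common terms are recomputed once per point rather than once per callback. (In Ipopt's C++ TNLP interface, the `new_x` flag passed to each evaluator tells you when the iterate has changed.) Here `g(x) = c.x` and `h(x) = x.x` stand in for whatever the shared terms actually are.

```python
import numpy as np

class CachedEvaluator:
    """Hypothetical wrapper (not part of the Ipopt API) that caches the
    terms shared by the objective and gradient callbacks for f = g*h."""

    def __init__(self, c):
        self.c = np.asarray(c, dtype=float)
        self._key = None   # byte signature of the iterate the cache belongs to
        self._g = None     # cached common term g(x) = c . x
        self._h = None     # cached common term h(x) = x . x

    def _refresh(self, x):
        # Recompute the common terms only when x has actually changed;
        # with Ipopt's C++ interface the new_x flag gives this directly.
        key = x.tobytes()
        if key != self._key:
            self._g = float(self.c @ x)
            self._h = float(x @ x)
            self._key = key

    def eval_f(self, x):
        x = np.asarray(x, dtype=float)
        self._refresh(x)
        return self._g * self._h                     # f = g(x) h(x)

    def eval_grad_f(self, x):
        x = np.asarray(x, dtype=float)
        self._refresh(x)                             # cache hit if eval_f ran first
        return self.c * self._h + self._g * 2.0 * x  # f' = g' h + g h'
```

With this pattern, calling `eval_f` and then `eval_grad_f` at the same point computes g and h only once; the second call reuses the cached values.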
Filip
Op 27 nov. 2018, om 22:39 heeft R zu <rzu512 at gmail.com<mailto:rzu512 at gmail.com>> het volgende geschreven:
Sorry, I typed that in haste (about a minute). I was taking the derivative of a product. The product rule means terms are repeated.
I wish Ipopt allowed me to provide a single function that computes the Hessian, objective, and gradient together.
That way I would only need to calculate the common term once and plug it into the formulas for all three (Hessian, ...).
Currently, however, I need to provide a separate function for each of the Hessian, objective, and gradient.
Within each of the three functions, I need to calculate the common term once.
That means the common term is probably calculated three times. I worry that compiler optimization might not be good enough to eliminate this, and floating-point operations are not exactly associative.
On Tue, Nov 27, 2018 at 2:15 PM R zu <rzu512 at gmail.com<mailto:rzu512 at gmail.com>> wrote:
The difficulty is in programming. But here is an example.
Example objective function f:

    f(x) = g(x) h(x)

where:

    g(x) = sum_{n=1}^{N} sum_{m=1}^{N} c_n c_m x_n

    h(x) = sum_{n=1}^{N} c_n x_n
Gradient, calculated by the product rule:

    f'(x) = g'(x) h(x) + g(x) h'(x)

The functions g(x) and h(x) are common to f and f'.
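The sharing that the product rule forces can be shown in a combined evaluation routine. This is a sketch, not Ipopt code, and it uses simplified stand-ins for the sums above (g(x) = c.x and h(x) = x.x); the point is only that g and h are computed once and then plugged into both f and f'.

```python
import numpy as np

def f_and_grad(x, c):
    """Evaluate f = g(x) h(x) and its gradient together, computing the
    shared terms g and h only once.  g(x) = c.x and h(x) = x.x are
    simplified stand-ins for the sums in the example above."""
    x = np.asarray(x, dtype=float)
    c = np.asarray(c, dtype=float)
    g = float(c @ x)             # common term, computed once
    h = float(x @ x)             # common term, computed once
    f = g * h
    grad = c * h + g * 2.0 * x   # f' = g' h + g h', reusing g and h
    return f, grad
```

If Ipopt instead calls separate objective and gradient callbacks, g and h would be evaluated once per callback unless they are cached, which is exactly the concern raised here.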
On Tue, Nov 27, 2018 at 1:59 PM Chintan Pathak <cp84 at uw.edu<mailto:cp84 at uw.edu>> wrote:
Dear R Zu,
It might be helpful if you gave a small example demonstrating your use case. For example, are the common terms dependent on 'x'?
Thanks
https://about.me/chintanpathak
On Tue, Nov 27, 2018 at 9:55 AM R zu <rzu512 at gmail.com<mailto:rzu512 at gmail.com>> wrote:
Hi.
- The Hessian, gradient, and objective function have some common terms.
- The common terms depend on the variables of the objective function.
I calculate each common term three times because I need it for the Hessian, the gradient, and the objective function.
Is it possible to calculate each common term only once in each step of the optimization?
Thank you.
_______________________________________________
Ipopt mailing list
Ipopt at list.coin-or.org<mailto:Ipopt at list.coin-or.org>
https://list.coin-or.org/mailman/listinfo/ipopt