[Ipopt] Hessian, gradient, and objective function have common terms
R zu
rzu512 at gmail.com
Tue Nov 27 14:15:19 EST 2018
The difficulty is in programming it. Here is an example.
Example objective function f:

f(x) = g(x) h(x)

where:

g(x) = Sum_{n=1}^{N} Sum_{m=1}^{N} c_n c_m x_n

h(x) = Sum_{n=1}^{N} c_n x_n

Gradient calculated by the product rule:

f'(x) = g'(x) h(x) + g(x) h'(x)

The functions g(x) and h(x) are common between f and f'.
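In Ipopt's C++ interface, each TNLP evaluation callback (eval_f, eval_grad_f, eval_h) receives a new_x flag, which is false when the point x is unchanged since the previous callback. So the shared terms can be cached and recomputed only when new_x is true. Here is a minimal standalone sketch of that pattern for the example above (the CommonTerms class and its member names are hypothetical, not part of Ipopt; it uses that g(x) factorizes as (Sum_m c_m) h(x)):

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Caches the shared terms g(x) and h(x) so that eval_f, eval_grad_f,
// and eval_h can reuse them instead of recomputing three times.
class CommonTerms {
public:
    explicit CommonTerms(std::vector<double> c)
        : c_(std::move(c)), csum_(0.0), g_(0.0), h_(0.0) {
        for (double ci : c_) csum_ += ci;  // Sum_m c_m, constant in x
    }

    // Call at the top of every callback, passing Ipopt's new_x flag.
    // Recomputes the cached terms only when x actually changed.
    void update(const double* x, std::size_t n, bool new_x) {
        if (!new_x) return;               // x unchanged: reuse cache
        h_ = 0.0;
        for (std::size_t i = 0; i < n; ++i) h_ += c_[i] * x[i];
        g_ = csum_ * h_;                  // double sum factorizes
    }

    double g() const { return g_; }
    double h() const { return h_; }
    double f() const { return g_ * h_; }  // f(x) = g(x) h(x)

private:
    std::vector<double> c_;
    double csum_, g_, h_;
};
```

In eval_f you would then write something like `terms.update(x, n, new_x); obj_value = terms.f();`, and eval_grad_f/eval_h would call the same update and read the cached g() and h().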
On Tue, Nov 27, 2018 at 1:59 PM Chintan Pathak <cp84 at uw.edu> wrote:
> Dear R Zu,
>
> Might be helpful if you give a small example demonstrating your usecase.
> For example, are the common terms dependent on 'x', etc.
>
> Thanks
>
> https://about.me/chintanpathak
>
>
> On Tue, Nov 27, 2018 at 9:55 AM R zu <rzu512 at gmail.com> wrote:
>
>> Hi.
>>
>> - The Hessian, gradient, and objective function have some common terms.
>> - The common terms depend on the variables of the objective function.
>>
>> I currently calculate each common term three times because I need it for
>> the Hessian, the gradient, and the objective function.
>>
>> Is it possible to calculate each common term only once in each
>> step of the optimization?
>>
>> Thank you.
>> _______________________________________________
>> Ipopt mailing list
>> Ipopt at list.coin-or.org
>> https://list.coin-or.org/mailman/listinfo/ipopt
>>
>