[ADOL-C] Evaluating Hessian of Lagrangian

Norman Goldstein normvcr at telus.net
Tue Apr 9 05:04:20 EDT 2013


Sorry.  By the "kernel", I mean the kernel of the Jacobian of f(x):

All u in R^n such that   df/dx * u = 0

These u form the tangent space of the constraint "surface".
You could, for example, use SVD to find an orthonormal basis of the kernel.

Cheers,
Norm


On 04/09/2013 01:11 AM, Ingrid Hallen wrote:
> Ok, I think I kind of understand the idea. However, doesn't it require
> that one solve equations of the form f(x) = 0, where f defines
> some of the constraints, for each iterate x in order to identify the kernel?
> It seems related to the idea of eliminating equality constraints (as
> in e.g. p. 132 of Convex Optimization by Boyd and Vandenberghe),
> which, as I did not mention :), I don't want to do.
>
>
> Kind regards,
>
> Ingrid
>
> ------------------------------------------------------------------------
> Date: Mon, 8 Apr 2013 13:14:39 -0700
> From: normvcr at telus.net
> To: ingridhallen at hotmail.com
> CC: adol-c at list.coin-or.org
> Subject: Re: [ADOL-C] Evaluating Hessian of Lagrangian
>
> For the second point, especially since there are almost as many
> constraints as variables, it may be worthwhile to take the Hessian
> with respect to only the directions in the kernel of the constraints.
> For example, if you are working in 20-dimensional Euclidean space,
> with 18 constraints, that leaves an essentially 2-dimensional optimization
> problem, and you only really need a 2x2 Hessian, not a 20x20 Hessian.
> You can choose two orthogonal directions in the kernel of the constraints,
> and construct the 2x2 Hessian of the Lagrangian with respect to them,
> viewing this either as directional derivatives or as a change of coordinates.
>
> I don't know how much software change this would need, but it
> seems it would be generally useful.
>
> Cheers,
> Norm
>
>
> On 04/08/2013 06:38 AM, Ingrid Hallen wrote:
>
>     Thanks for the suggestions!
>
>     Regarding the first, I think it might not be so efficient for my
>     problem, which has almost as many constraints as variables. But I
>     might give it a try. Perhaps one can modify the code for sparse_hess
>     in some clever way ...
>
>     Regarding the second, I am unfortunately clueless as to how one
>     could exploit that fact.
>
>     Kind regards,
>
>     Ingrid
>
>
>     ------------------------------------------------------------------------
>     Date: Sun, 7 Apr 2013 15:31:50 -0700
>     From: normvcr at telus.net <mailto:normvcr at telus.net>
>     To: adol-c at list.coin-or.org <mailto:adol-c at list.coin-or.org>
>     Subject: Re: [ADOL-C] Evaluating Hessian of Lagrangian
>
>     Perhaps you can trace L as a function of both x and lambda.
>     Then, when you need the Hessian, calculate it only with respect to x.
>     I am not sure whether this is better or worse than what you suggested.
>
>     When using the Hessian of the Lagrangian, you will eventually be
>     restricting attention to the kernel of your constraints.  Do you know
>     whether this fact would simplify how the Hessian is computed?
>
>     Norm
>
>
>     On 04/05/2013 04:02 AM, Ingrid Hallen wrote:
>
>         Hi,
>
>         I'm doing non-linear optimization with IPOPT. For this, I'm
>         using ADOL-C
>         to compute the Hessian of the Lagrangian
>
>         L(x,lambda) = f(x) + sum_{i}lambda_{i}h_{i}(x),
>
>         where x are the variables, lambda the Lagrange multipliers, and
>         f(x) and h_{i}(x) the objective and constraint functions.
>
>         What I'm doing in my code is the following (omitting details):
>
>         // **********************
>
>         // Trace the Lagrangian function
>         trace_on(tag);
>
>         for (i = 0; i < n; i++) {
>             xad[i] <<= x[i];    // mark the independent variables
>         }
>
>         Lad = Lagrangian(xad, lambda);
>
>         Lad >>= L;              // mark the dependent variable
>
>         trace_off();
>
>         // Evaluate the Hessian of the Lagrangian
>         repeat = 0;
>         sparse_hess(tag, n, repeat, x, &nnz, &rind, &cind, &values, &options);
>
>         // ***********************
>
>         This works fine, but it is not so efficient. One reason is that,
>         since lambda changes, the Lagrangian function has to be retaped
>         every time the Hessian is needed, so it appears that I cannot set
>         repeat = 1 when calling sparse_hess.
>
>         One way to circumvent this problem could perhaps be to trace
>         the objective
>         and constraint functions individually and then construct the
>         Hessian of
>         the Lagrangian using multiple calls to sparse_hess, but is there a
>         more convenient way to do it?
>
>         Sincerely,
>
>         Ingrid
>
>
>
>         _______________________________________________
>         ADOL-C mailing list
>         ADOL-C at list.coin-or.org  <mailto:ADOL-C at list.coin-or.org>
>         http://list.coin-or.org/mailman/listinfo/adol-c
>
>
>
