[Ipopt] Low # iters, ensuring that solution remains feasible

John Schulman john.d.schulman at gmail.com
Thu Nov 13 01:47:54 EST 2014


Hi Tony,

Funny to get a reply from someone so close by after launching a message out
to the colossal internet.

I wasn't warm-starting at all--that's a great suggestion. Hopefully the
Lagrange multipliers won't change too much between iterations. Otherwise,
there may be some problem-specific ways to guess a reasonable Lagrange
multiplier here.
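
For reference, the warm-start settings can go in an ipopt.opt file in the working directory (a sketch -- the push values and mu_init below are illustrative guesses, not tuned, and the multipliers themselves still have to be passed in through the NLP interface):

```text
# ipopt.opt -- read automatically by Ipopt from the working directory.
# Trust the supplied primal AND dual starting point (the multipliers
# must also be provided through the solver interface).
warm_start_init_point       yes
# Keep the start close to the supplied point instead of pushing it
# far inside the bounds (values here are illustrative).
warm_start_bound_push       1e-8
warm_start_mult_bound_push  1e-8
# Start the barrier parameter small, on the theory that the warm start
# is already near-optimal for the previous subproblem (a guess).
mu_init                     1e-5
```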

Regarding speed and algorithm choice, BFGS seems like the right fit here,
since in this problem it's not really practical to compute the Hessian
(which is dense).
But I'll be sure to check out print_timing_statistics -- I'm new to Ipopt
and haven't found these handy tools yet.
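
For what it's worth, the quasi-Newton and early-termination setup I have in mind looks roughly like this (again as an ipopt.opt sketch; the history length is just something to experiment with, not a recommendation):

```text
# ipopt.opt -- quasi-Newton + early-termination settings (sketch).
# Use Ipopt's limited-memory BFGS approximation instead of an exact Hessian.
hessian_approximation       limited-memory
# Number of stored update pairs; the default is 6 (10 is illustrative).
limited_memory_max_history  10
# Cap the iteration count so each subproblem stays cheap.
max_iter                    50
# Print the per-component timing breakdown Tony mentioned.
print_timing_statistics     yes
```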

John


On Wed, Nov 12, 2014 at 10:19 PM, Tony Kelman <kelman at berkeley.edu> wrote:

>   Hi John, good to see groups next door also using Ipopt.
>
> Generally speaking this is a pretty hard thing to do with an
> interior-point method. Are you warm-starting the dual variables as well, or
> just the primal? That may help, but it depends how closely related the
> subproblems are. I’d also avoid doing quasi-newton hessian approximations
> if you have a speed-critical application, you’ll get better convergence in
> most cases if you are using a modeling tool that can provide exact
> Hessians. Have you looked at the breakdown of computation time from
> print_timing_statistics?
>
> -Tony
>
>
>  *From:* John Schulman <john.d.schulman at gmail.com>
> *Sent:* Wednesday, November 12, 2014 10:14 PM
> *To:* ipopt at list.coin-or.org
> *Subject:* Re: [Ipopt] Low # iters, ensuring that solution remains
> feasible
>
>  Oops, "wildly feasible" in the first paragraph should be "wildly
> infeasible"
>
> On Wed, Nov 12, 2014 at 10:06 PM, John Schulman <john.d.schulman at gmail.com
> > wrote:
>
>>  Short:
>>
>> I am calling Ipopt repeatedly to solve a series of subproblems.
>> For each subproblem, Ipopt is initialized with a feasible solution, and
>> max_iter is set to 50 or so.
>> The optimization terminates early, and often this intermediate solution
>> is wildly infeasible.
>> I'm wondering if there are any settings that will ensure that the result
>> is nearly feasible.
>>
>> Longer:
>>
>> I am using Ipopt to solve a series of subproblems of the form
>> minimize f(x) subject to g(x) < delta.
>> Here g is a distance function of sorts, measuring Distance(x_0, x), where
>> x_0 is the initialization.
>> So the initial point x_0 is feasible.
>> x has dimension 50000 or so, so I am using hessian_approximation with
>> limited memory.
>>
>> I need to keep to a low number of iterations, say 50 or 100, so the
>> overall computation time remains reasonable.
>> It's not essential at all that the solution generated is optimal; I just
>> want to improve the objective as much as possible while remaining feasible.
>>
>> I tried fiddling with the barrier parameters but didn't have any luck.
>> Any suggestions?
>> Thanks in advance for your time.
>>
>> John
>>
>>
>
>
> ------------------------------
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/ipopt
>
>