[Ipopt] Can IPOPT do this

Ned Nedialkov nedialk at mcmaster.ca
Fri Jun 12 16:39:38 EDT 2009


Damien,

On Jun 11, 2009, at 12:09 AM, Damien Hocking wrote:

> Ned,
>
> The basis for IPOPT is a smooth, continuously-differentiable problem
> space.  I'm now going to throw in something here that can give purists
> an aneurysm.  Are your two sets of constraints "similar", in that their
> problem spaces are reasonably close together, or that you're solving
> mostly the same problem with some differences?

The problem spaces can be close together, but not in general. I have to
satisfy constraints in a DAE whose formulation branches. The constraints
are sufficiently differentiable in each sub-domain of interest.
>  Examples might be turbulent-laminar flow correlations, or a
> distillation column model with different sets of specifications but
> similar operating conditions.  The key here is that the problem is
> about the same shape.  In these sorts of cases, you might be able to
> get IPOPT to find an answer that's good enough.  It might not be the
> global optimum, but it might be a better answer with satisfied
> constraints.  The way I think about the transitions between the
> constraint sets is that each set of constraints is generating a new,
> hopefully better initial estimate or starting point for the next set
> of constraints.

This reminds me of doing continuation.
> This isn't very elegant but it can work.

I think so.

> You will have to code the loop that switches the constraint sets
> and drives IPOPT yourself, because you'll be deallocating and rebuilding
> IPOPT instances as it stumbles along, and only preserving the variable
> values.  It will be slow on large systems, because the Jacobian will
> need to be re-factored from scratch each time the constraints change.

Yes, from a software engineering perspective it is quite complicated
to get it to work, but it is feasible. My problems are small, at most a
couple of hundred equations, and typically smaller.
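
For concreteness, here is a rough sketch of the kind of outer loop I have
in mind, using Ipopt's C++ interface. BranchNLP is a hypothetical TNLP
subclass of mine (it does not exist in Ipopt): it would evaluate F_1 or
F_2 depending on the active branch, take the current x as its starting
point, and copy the solution back into x in finalize_solution(). This is
only a sketch under those assumptions, not a worked-out implementation.

    #include "IpIpoptApplication.hpp"
    #include "IpTNLP.hpp"
    #include <vector>
    using namespace Ipopt;

    // Rebuild an IpoptApplication for each constraint set, preserving only
    // the variable values in x between solves.
    void solve_with_branching(std::vector<double>& x)
    {
      for (int branch : {1, 2}) {              // one pass per constraint set
        // BranchNLP (hypothetical): eval_g/eval_jac_g use F_1 if branch == 1
        // and F_2 if branch == 2; get_starting_point reads x;
        // finalize_solution writes the solution back into x.
        SmartPtr<TNLP> nlp = new BranchNLP(branch, x);
        SmartPtr<IpoptApplication> app = IpoptApplicationFactory();
        app->Options()->SetNumericValue("tol", 1e-8);
        if (app->Initialize() != Solve_Succeeded)
          return;
        ApplicationReturnStatus status = app->OptimizeTNLP(nlp);
        if (status != Solve_Succeeded && status != Solved_To_Acceptable_Level)
          return;                              // keep the last successful x
        // Re-check the branching condition on x here and decide whether to
        // switch constraint sets again or stop.
      }
    }

As you note, the Jacobian does get re-factored from scratch on every
solve, but for a couple of hundred equations that should be cheap.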
>
> If you have a significantly different set of variables for each set of
> constraints then you do need to go to the effort of setting the problem
> up in a MINLP solver like Bonmin.

My variables are nearly the same. Doing MINLP is another option,
in addition to doing continuation.
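
For my own notes: one standard way to write such a branch as an MINLP
(my assumption here, not something discussed above) is a big-M form with
a binary variable y and a sufficiently large constant M, e.g.

    minimize     ||x - a||_2^2
    subject to   -M*(1 - y) <= F_1(x) <= M*(1 - y)   (forces F_1(x) = 0 when y = 1)
                 -M*y       <= F_2(x) <= M*y         (forces F_2(x) = 0 when y = 0)
                 y in {0, 1}

with the inequalities read componentwise; how the branching condition on
the x_i's is tied to y depends on its form.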

> As I said, this isn't elegant (it's a hot pink shirt with an orange
> jacket), but it can work.  And there are no guarantees at all, and
> Andreas might kick me off the list forever for suggesting this. ... :-)

I find your posting very useful. Thank you.

Best regards,
Ned
>
>
> Andreas Waechter wrote:
>> Hi Ned,
>>
>> I'm copying your email to the Ipopt mailing list - in general, it might
>> be a good idea to send questions like this one there, since other people
>> might have better ideas than me...
>>
>> No, Ipopt cannot deal with the situation you describe; it requires
>> functions that are at least once continuously differentiable.
>> Otherwise, it might get stuck.
>>
>> You can look at methods for non-smooth optimization, or you can maybe
>> model your problem as an MINLP.  If the resulting problem formulation
>> is convex (unlikely, since you have equality constraints, unless you
>> can relax them), you can use methods like Bonmin or FilMINT; otherwise
>> there are a number of global optimization codes, such as BARON, Lago,
>> or Couenne.  (Bonmin, Couenne, and Lago are available on COIN.)
>>
>> Regards,
>>
>> Andreas
>>
>> On Mon, 8 Jun 2009, Ned Nedialkov wrote:
>>
>>
>>> Hi Andreas,
>>>
>>> I need to minimize a function subject to constraints that are not
>>> differentiable everywhere.
>>> That is, my constraint function is generally of the sort
>>>
>>> 	if (some condition on x_i's)
>>> 	   F_1(x_1, ..., x_n) = 0
>>> 	else
>>> 	   F_2(x_1, ..., x_n)  = 0
>>>
>>> where I can provide Jacobians for F_1 and F_2. My objective is simple:
>>> min ||x-a||_2^2.
>>>
>>> I am wondering if the theory behind IPOPT can deal with this. If not,
>>> do you have any idea what method/software may be applicable?
>>>
>>> Many thanks,
>>> Ned
>>>
>>>
>>>
>>
>
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> http://list.coin-or.org/mailman/listinfo/ipopt


