[Ipopt] C++ Interface and Auxiliary Data, Using BFGS

Tony Kelman kelman at berkeley.edu
Mon Oct 15 06:35:00 EDT 2012


Alireza,

1) The common approach with the C++ interface is to declare your auxiliary 
data as class variables in your NLP class; then all of the 
objective/constraint/gradient/etc. evaluation functions can access those 
variables. There was a discussion last month showing one use case of this 
(http://list.coin-or.org/pipermail/ipopt/2012-September/003036.html). The 
basic C interface uses a UserDataPtr to a structure as the last input 
argument to all the evaluation functions, as you can see in 
Ipopt/examples/hs071_c/hs071_c.c (look for MyUserData and user_data).
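
For illustration, a rough (untested) sketch of the class-variable pattern 
could look like the following - the class name MyNLP and the "weights" 
vector are just made-up placeholders, and the remaining TNLP methods 
(get_nlp_info, get_bounds_info, eval_g, eval_jac_g, ...) still have to be 
implemented as usual:

    #include <vector>
    #include "IpTNLP.hpp"

    using namespace Ipopt;

    class MyNLP : public TNLP
    {
    public:
       // Pass the auxiliary data in through the constructor and keep a copy.
       MyNLP(const std::vector<Number>& weights) : weights_(weights) { }

       // Every evaluation callback can then read the stored data directly.
       virtual bool eval_f(Index n, const Number* x, bool new_x,
                           Number& obj_value)
       {
          obj_value = 0.0;
          for (Index i = 0; i < n; i++)
             obj_value += weights_[i] * x[i] * x[i]; // made-up weighted objective
          return true;
       }

       // ... eval_grad_f, eval_g, eval_jac_g, etc. use weights_ the same way ...

    private:
       std::vector<Number> weights_; // the "auxdata" equivalent
    };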

The auxdata method in the Matlab interface appears to have a memory leak 
(see http://list.coin-or.org/pipermail/ipopt/2011-August/002527.html for a 
workaround), which could cause performance problems if your auxdata 
structure is large (you can check its size with "whos" in Matlab). Next time 
you run your Matlab version, watch Matlab's memory use with "top" and see 
whether you're getting close to running out of memory.

2) According to the previous page in the tutorial (node31), you shouldn't 
need a Hessian function if you use the limited-memory quasi-Newton 
approximation. I'm not sure whether or not you need a blank placeholder - 
try it and find out?
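
If it turns out that a placeholder is needed, my guess (untested) would be 
that a stub eval_h which simply returns false is enough, since with 
hessian_approximation set to limited-memory the Hessian callback shouldn't 
actually be called:

    // Placeholder only - with hessian_approximation=limited-memory this
    // should never be called by Ipopt.
    virtual bool eval_h(Index n, const Number* x, bool new_x,
                        Number obj_factor, Index m, const Number* lambda,
                        bool new_lambda, Index nele_hess, Index* iRow,
                        Index* jCol, Number* values)
    {
       return false; // no Hessian provided
    }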

3) You can, but you probably shouldn't. By default, Ipopt reads options from 
a text file called ipopt.opt located in the run directory. This way you can 
modify options at runtime without needing to recompile your application. See 
http://www.coin-or.org/Ipopt/documentation/node55.html
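
For example, an ipopt.opt file covering the options from your question is 
just one "option_name value" pair per line (a # starts a comment):

    # ipopt.opt, placed in the directory you run from
    linear_solver            ma27
    hessian_approximation    limited-memory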

4) There are over a dozen example problems in 
Ipopt/examples/ScalableProblems; these may be useful for you to look at. I 
don't think those problems hard-code option settings, but I'm pretty sure 
some of them use class-variable auxiliary data for parameters and other 
things.

5) The most important statistics to look at are the "Total CPU secs in 
IPOPT" vs "Total CPU secs in NLP function evaluations." You can get even 
more detailed timing info by setting the option print_timing_statistics to 
yes. As long as you aren't running out of memory or hitting anything else 
strange in Matlab, the IPOPT time is likely to be similar between a Matlab 
implementation and a C++ implementation, assuming the same linear solver, 
Ipopt options, number of iterations, BLAS version, and everything else being 
equal. The NLP function evaluations will almost always be faster in a C++
implementation vs a Matlab implementation, but by how much is difficult to 
predict: it depends very much on how complicated the function evaluations 
are, and your coding proficiency in the two languages. While you're still 
working in Matlab, I highly recommend using the Matlab profiler to find what 
parts of your Matlab code are taking most of the time (or if it's the Ipopt 
time, it will be listed for ipopt.mex). 30 minutes sounds like a fairly 
large and/or complicated problem... how many iterations is Ipopt typically 
taking?
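
In case it's useful, setting print_timing_statistics from the C++ side is 
just another call on the options object (or a line in ipopt.opt, as in 
point 3), e.g.:

    // inside main(), assuming the usual IpoptApplication setup
    SmartPtr<IpoptApplication> app = IpoptApplicationFactory();
    app->Options()->SetStringValue("print_timing_statistics", "yes");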

-Tony

-----Original Message----- 
From: AliReza Khoshgoftar Monfared <khoshgoftar at gmail.com>
Date: Sun, 14 Oct 2012 21:44:31 -0400
To: ipopt at list.coin-or.org
Subject: [Ipopt] C++ Interface and Auxiliary Data, Using BFGS

Hi,

I have a couple of questions regarding the C++ interface of Ipopt:

1) Is there a convenient way to pass auxiliary data to Ipopt when
interfacing through C++?

I have been using the MatlabInterface previously, and as seen in 
"/Ipopt/contrib/MatlabInterface/examples/lasso.m", by assigning a value to 
"auxdata" in the solver options, this parameter can be passed and unpacked 
in any of the callbacks (objective, gradient, etc.) to access extra 
information. What would the equivalent be for the C++ interface?

2) If I use the "limited-memory" mode for the "hessian_approximation" 
option, do I still need to implement all the functions mentioned in the 
tutorial? (http://www.coin-or.org/Ipopt/documentation/node32.html) I assume 
that I won't need "eval_h()". Is that true, or should I make an empty 
template?

3) For setting the solver name and the limited-memory mode, should I still 
use "app->Options()"? I.e., something like:


>   app->Options()->SetStringValue("linear_solver", "ma27");
>   app->Options()->SetStringValue("hessian_approximation", "limited-memory");


4) Out of curiosity, are there any other examples of using the C++ 
interface with auxiliary data and limited-memory or other Ipopt options?

5) Are there any sample statistics about the execution time saved by using 
the C++ interface? I am currently solving my problem using the 
MatlabInterface, and it takes about 30 minutes on a quad core with 8GB RAM. 
I was wondering how much of a saving I can expect if I port the whole thing 
to C++.

Thanks
Alireza


