[Ipopt] Numerical Approximation of Problem Functions - Gradient of objective, Jacobian of constraints and Hessian of Lagrangian

Stefan Vigerske svigerske at gams.com
Sat Dec 28 22:55:01 EST 2019


Hi,

there is an option to enable a quasi-Newton approximation of the Hessian 
(https://coin-or.github.io/Ipopt/SPECIALS.html#QUASI_NEWTON).
There is also a hidden option to enable a finite-difference 
approximation of the Jacobian, but I guess it is hidden because 
Ipopt's performance with it may be terrible.
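
[Editor's note: for reference, both approximations can be switched on from an 
`ipopt.opt` options file; the option names below match the documented 
`hessian_approximation` option and the hidden `jacobian_approximation` option 
in recent Ipopt versions, so check them against your installed release.]

```
# Quasi-Newton (limited-memory BFGS) approximation of the Hessian of the Lagrangian
hessian_approximation limited-memory

# Hidden option: finite-difference approximation of the constraint Jacobian
jacobian_approximation finite-difference-values
```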
It would be more advisable to make use of an automatic 
differentiation package: ADOL-C, CasADi, and CppAD all include 
specialized interfaces for use with Ipopt.
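
[Editor's note: if you do end up rolling your own derivatives, a minimal 
sketch of a central-difference gradient is shown below. It is independent of 
Ipopt; the function name `fd_gradient`, the test function, and the step size 
are illustrative assumptions, not part of any Ipopt API.]

```cpp
#include <cassert>
#include <cmath>
#include <functional>
#include <vector>

// Central-difference approximation of the gradient of f at x.
// Truncation error is O(h^2) per component, at the cost of
// 2*n evaluations of f for an n-dimensional x.
std::vector<double> fd_gradient(
    const std::function<double(const std::vector<double>&)>& f,
    std::vector<double> x,
    double h = 1e-6)
{
    std::vector<double> grad(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        const double xi = x[i];
        x[i] = xi + h;
        const double fp = f(x);   // forward point
        x[i] = xi - h;
        const double fm = f(x);   // backward point
        x[i] = xi;                // restore original coordinate
        grad[i] = (fp - fm) / (2.0 * h);
    }
    return grad;
}
```

Note the 2*n function evaluations per gradient; the same idea applied to the 
Hessian of the Lagrangian scales quadratically in n, which is part of why AD 
or the built-in quasi-Newton option is usually preferable.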

Stefan

On 12/29/19 2:44 AM, Reid Byron wrote:
> Hello All,
> 
> 
> 
> I have implemented a trajectory optimizer using non-linear programming and
> collocation per Hargraves and Paris’s canonical paper on the method linked
> below.
> 
> 
> 
> I have been able to demonstrate the technique on a small-scale toy problem
> using the Sequential Least Squares Programming (SLSQP) solver
> included within scipy's minimize function.
> 
> 
> 
> I am now in the process of selecting and integrating a more capable solver
> to handle problems of greater size and complexity; to this end IPOPT looks
> proven and promising.
> 
> 
> 
> I will be interfacing with IPOPT through C++.  My question pertains to
> Section 3.2, Figure 2, Item 5, regarding the Evaluation of Problem Functions
> in the Introduction to Ipopt document linked below.
> 
> 
> 
> In the hs071 example problem, analytic expressions for the Gradient of the
> objective, Jacobian of the constraints, and Hessian of the Lagrangian are
> derived.  Obtaining analytic expressions for these problem functions is
> prohibitively difficult for the collocation method I am implementing.  I
> can, however, approximate these quantities by finite differences.
> 
> 
> 
> My questions are thus:
> 
> 1. Is it advisable to write a function which obtains the Gradient of
> the objective, Jacobian of the constraints, and Hessian of the Lagrangian by
> finite differences?
> 
> 2. Is there an existing utility in IPOPT which can numerically
> obtain the Gradient of the objective, Jacobian of the constraints, and
> Hessian of the Lagrangian for me?
> 
> 
> 
> Thank you for your time and assistance.
> 
> 
> 
> Reid
> 
> 
> 
> 
> 
> Direct Trajectory Optimization Using Nonlinear Programming and Collocation
> 
> https://www.researchgate.net/publication/230872953_Direct_Trajectory_Optimization_Using_Nonlinear_Programming_and_Collocation
> 
> 
> 
> Introduction to Ipopt
> 
> https://projects.coin-or.org/Ipopt/browser/stable/3.10/Ipopt/doc/documentation.pdf?format=raw
> 
> 
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> https://list.coin-or.org/mailman/listinfo/ipopt
> 


