[Ipopt] Memory / Storage requirements when solving with "exact" vs "limited-memory" hessian approximations

Tony Kelman kelman at berkeley.edu
Sat Jan 26 02:06:00 EST 2013


Mehmet,

The unfortunate answer to your question is that it depends completely on the 
problem sparsity structure and numerical values, and which linear solver 
(factorization algorithm) you use. There's no good way of predicting Ipopt's 
memory consumption in a general problem-independent way, since it uses 
sparse symmetric indefinite matrix factorizations to solve for the search 
direction at each iteration. Dense matrices are not used with exact 
Hessians; however, some linear solvers do try to aggregate small dense 
sub-blocks together in the course of the factorization for computational 
efficiency (when the sparsity structure of the problem allows for this).

Typically Ipopt has memory allocated for the KKT matrix itself (where the 
user provides the Hessian and Jacobian sparsity patterns) which gets 
overwritten at each iteration when the numerical values change, and another 
region of memory allocated for the matrix factorization result from the 
linear solver. These are both sparse matrix data structures, which scale in 
size with the number of nonzero elements (with some differences between 
triplet and CSR formats). The allocation for the factorization varies a bit 
between linear solvers, but generally the memory required may need to be 
reallocated and increased over the course of solving an optimization 
problem, because the pivoting sequence and hence the fill-in can change as 
the matrix values change from iteration to iteration.
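
As a very rough sanity check (not something Ipopt computes itself), here is 
a back-of-envelope sketch that estimates only the storage for the KKT matrix 
in triplet form, using the sizes from your message; the factor on top of 
that depends entirely on fill-in and can't be predicted from these counts:

    #include <cstdio>

    int main() {
        // Rough problem sizes taken from the original question.
        const long n = 1000000;  // variables
        const long m = 50000;    // equality constraints
        const long j = 1000000;  // nonzeros in the constraint Jacobian
        const long h = 500000;   // nonzeros in the Lagrangian Hessian

        // The augmented (KKT) system factorized at each iteration has
        // dimension roughly (n + m); a crude nonzero count for its lower
        // triangle is Hessian + Jacobian nonzeros plus diagonal entries.
        const long kkt_nnz = h + j + n + m;

        // Triplet (coordinate) storage: one 8-byte value plus two 4-byte
        // indices per nonzero (index width depends on the solver build).
        const double mbytes = kkt_nnz * (8.0 + 2 * 4.0) / (1024.0 * 1024.0);
        std::printf("KKT nnz ~ %ld, matrix storage ~ %.0f MB\n",
                    kkt_nnz, mbytes);

        // The factor produced by MA27/MA57 typically needs several times
        // this amount, depending on the fill-in during pivoting.
        return 0;
    }

For your sizes that comes out to roughly 40 MB for the matrix itself; the 
factorization workspace is the part that can grow unpredictably from 
iteration to iteration.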

Out-of-core solvers rearrange the factorization algorithm to change its data 
access patterns. If the KKT matrix has so many nonzeros, or the 
factorization produces so much fill-in, that the factor will not fit in 
memory, the out-of-core algorithm works on one block of the problem at a 
time (each small enough to fit in memory) so that disk accesses happen in 
bulk. The idea is that this should work better than relying on virtual 
memory with a conventional in-core algorithm, where the disk accesses would 
be smaller in size and more frequent.
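
For what it's worth, switching to the out-of-core solver is just an option 
change. Here is a minimal sketch with the C++ IpoptApplication interface, 
assuming your Ipopt build is linked against HSL (my_nlp stands in for your 
TNLP and isn't shown):

    #include "IpIpoptApplication.hpp"

    using namespace Ipopt;

    int main() {
        SmartPtr<IpoptApplication> app = IpoptApplicationFactory();

        // Factorize the KKT system with the out-of-core HSL solver MA77
        // instead of an in-core solver such as MA27 or MA57.
        app->Options()->SetStringValue("linear_solver", "ma77");

        app->Initialize();
        // app->OptimizeTNLP(my_nlp);  // my_nlp: your SmartPtr<TNLP>
        return 0;
    }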

The limited-memory case computes a low-rank approximation to the Hessian 
based on first-order information. This typically won't converge as quickly, 
so in my experience it's really only a good idea if the second-order 
derivatives are extremely expensive to compute, or if they're just 
complicated to code and you care more about implementation time than 
computation time. It might still be reasonable if, for example, you've got 
far more inequality constraints than variables, in which case the Jacobian 
is big and sparse but the limited-memory Hessian approximation is smaller, 
dense, and low rank. I believe the limited-memory Hessian approximation uses 
dense linear algebra (LAPACK) for some pieces, but I'm not positive exactly 
how it's implemented in Ipopt. I don't recall whether there's a section in 
the implementation paper about it, but it's worth checking.
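
If you do want to try the quasi-Newton approximation, it's again just an 
option change; a minimal sketch (the history length below is only an 
illustrative value):

    #include "IpIpoptApplication.hpp"

    using namespace Ipopt;

    int main() {
        SmartPtr<IpoptApplication> app = IpoptApplicationFactory();

        // Replace the exact Hessian with a low-rank quasi-Newton update;
        // Ipopt then never asks the TNLP for eval_h.
        app->Options()->SetStringValue("hessian_approximation",
                                       "limited-memory");

        // Number of update pairs kept for the low-rank approximation
        // (illustrative value; a longer history costs more dense storage).
        app->Options()->SetIntegerValue("limited_memory_max_history", 10);

        app->Initialize();
        // app->OptimizeTNLP(my_nlp);  // my_nlp: your SmartPtr<TNLP>
        return 0;
    }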

-Tony


-----Original Message----- 
From: Mehmet Ersin YUMER <meyumer at gmail.com>
Date: Thu, 24 Jan 2013 14:44:36 -0500
To: ipopt at list.coin-or.org
Subject: [Ipopt] Memory / Storage requirements when solving with
"exact" vs "limited-memory" hessian approximations

Hi,

I am looking for a way to roughly estimate the required memory size when
using an in-core linear solver (ma27, ma57, etc.) with ipopt, and required
disk storage size when using an out-of-core linear solver (ma77).

If:
number of variables = n
number of equality constraints = m
number of inequality constraints = k
number of nonzero in constraint jacobian = j
number of nonzero in lagrangian hessian = h

Is it correct that when using the "exact" hessian, the dominant memory
requirement arises from the search direction matrix, and it is a dense
(n+m)*(n+m) matrix? Or is it more involved than this?

How many of these matrices are stored during the execution?

Is there any difference in size when using an out-of-core linear solver
other than the fact that the matrix is now stored on disk?

What about the "limited-memory case"?


Thanks in advance for your help.

PS: For my problem the values are roughly as follows:
n = 1M
m = 50K
k = 0
j = 1M
h = 500K


-- 
Mehmet Ersin Yumer


