<br><font size=2 face="sans-serif">We are happily using Cbc in a scheduling
and planning application. We run mostly in a Windows environment.</font>
<br>
<br><font size=2 face="sans-serif">Recently, as the size of our problems
increased, we have started running into what we take to be memory problems.
It may turn out that these problems are simply too big, in which
case we'll look for simplifications. But perhaps there is an easier
answer, and we're hoping that the collective wisdom of COIN-Discuss can
come to our rescue.</font>
<br>
<br><font size=2 face="sans-serif">The problems in question have on the
order of 80 K variables, and something like 300 K constraints. Cbc
fails dramatically, throwing an exception and exiting abnormally. (Because
we are running Cbc via JNI inside a thin Java interface layer, it's hard
for us to see the details of the exception, but we are working on that.)</font>
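<br><font size=2 face="sans-serif">On surfacing the exception details: COIN-OR
code conventionally reports failures by throwing CoinError, so catching it at
the JNI boundary should recover the message before the process dies. A minimal
sketch, assuming the failure really is a CoinError (the wrapper function name
is ours):</font>

```cpp
// Sketch: catch CoinError at the JNI boundary so the Java side sees the
// details instead of an opaque abnormal exit. Assumes failures are
// reported via CoinError, the usual COIN-OR convention.
#include <cstdio>
#include "CbcModel.hpp"
#include "CoinError.hpp"

void solveGuarded(CbcModel &model) {
    try {
        model.initialSolve();     // where the reported failure occurs
        model.branchAndBound();
    } catch (CoinError &e) {
        // message(), methodName() and className() are CoinError accessors
        std::fprintf(stderr, "CoinError in %s::%s: %s\n",
                     e.className().c_str(), e.methodName().c_str(),
                     e.message().c_str());
        // translate into a Java exception (ThrowNew) or re-throw here
    }
}
```

<font size=2 face="sans-serif">Note that an out-of-memory std::bad_alloc would
not be caught by this handler, so a second catch clause for std::exception may
also be worth adding.</font>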
<br>
<br><font size=2 face="sans-serif">We've reproduced the failure on artificial
test problems with as few as 20 K variables, about 265 K constraints, and
fewer than 8 M nonzero coefficients.</font>
<br>
<br><font size=2 face="sans-serif">The failure occurs during the call to
CbcModel::initialSolve that precedes initial cut generation; that is, it
happens very early, well before the branch-and-cut phase properly begins.
We narrowed it down to ClpSimplex::dual. In the seconds prior
to failure, we observe memory usage climbing sharply.</font>
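<br><font size=2 face="sans-serif">Since you have narrowed it to
ClpSimplex::dual, one way to take JNI out of the picture entirely is to dump
the failing model to a file from the wrapper (writeMps on the solver) and
replay just that call in a small standalone program. A sketch, with a made-up
file name:</font>

```cpp
// Sketch: reproduce the failure outside the JNI layer by re-running only
// the step that fails. readMps() and dual() are standard ClpSimplex
// calls; "failing_problem.mps" is a placeholder.
#include <cstdio>
#include "ClpSimplex.hpp"

int main() {
    ClpSimplex model;
    if (model.readMps("failing_problem.mps") != 0) {
        std::fprintf(stderr, "could not read model\n");
        return 1;
    }
    int status = model.dual();   // the call the failure was narrowed to
    std::printf("dual() returned %d, objective %g\n",
                status, model.objectiveValue());
    return 0;
}
```

<font size=2 face="sans-serif">Run under a memory profiler (or just watched in
Task Manager on Windows), this should show whether the blow-up is in Clp
itself or in the interface layer.</font>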
<br>
<br><font size=2 face="sans-serif">We hypothesize that Clp is unpacking
the problem in some way; clearly not into a fully dense matrix representation,
but into some intermediate form that is substantially larger than the packed
input.</font>
<br>
<br><font size=2 face="sans-serif">It's possible that many of our constraints
are redundant; we're investigating that ourselves. But perhaps we
don't need to make the effort: maybe there is some way to get COIN to prune
the redundant constraints cautiously, without inflating the matrix? Thanks
in advance for any advice that you might have.</font>
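<br><font size=2 face="sans-serif">For what it's worth, Clp's own presolve does
drop redundant rows before the simplex runs, and it can be requested through
the ClpSolve options class. A sketch of what to try first (ClpSolve and these
setters are real Clp API, but the particular settings are only our guess):</font>

```cpp
// Sketch: ask Clp to presolve (which removes redundant constraints)
// before the initial solve, while keeping the dual simplex.
#include "ClpSimplex.hpp"
#include "ClpSolve.hpp"

void solveWithPresolve(ClpSimplex &model) {
    ClpSolve options;
    options.setSolveType(ClpSolve::useDual);        // keep the dual simplex
    options.setPresolveType(ClpSolve::presolveOn);  // enable presolve passes
    model.initialSolve(options);
}
```

<font size=2 face="sans-serif">When solving through Cbc, the equivalent hook
is (we believe) OsiClpSolverInterface::setSolveOptions, which passes a ClpSolve
object through to the underlying Clp instance.</font>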