[Clp] Solving a lot of small and similar LPs

Carlos Eduardo Knippschild carlos.eduardo at audaces.com.br
Fri Feb 5 12:43:39 EST 2010


Hello.

I'd like to know if anyone has suggestions on how to improve solve
performance for our use case; we're currently using Clp.

We're trying to solve tens to hundreds of thousands of dynamically generated
LPs, ranging from 30 variables by 500 constraints up to 400 variables by
100,000 constraints. Each newly generated LP is a small modification of the
previously solved one: the modifications are always additions or removals of
constraints (the number of variables never changes), made with the "addRow"
and "deleteRows" methods on the existing model.
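
Roughly, our update loop looks like the sketch below (the helper name and
the row data are placeholders, not our real code; addRow, deleteRows and
dual are the actual Clp calls we use):

    #include "ClpSimplex.hpp"

    // Apply one incremental edit and re-solve.  The ClpSimplex object
    // keeps the basis from the previous solve, so the re-solve starts
    // from it rather than from scratch.
    void editAndResolve(ClpSimplex &model,
                        int nDelete, const int *rowsToDelete,
                        int nnz, const int *columns, const double *elements,
                        double rowLower, double rowUpper)
    {
        model.deleteRows(nDelete, rowsToDelete);  // drop obsolete constraints
        model.addRow(nnz, columns, elements,      // append the new constraint
                     rowLower, rowUpper);
        model.dual();                             // re-solve with dual simplex
    }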

I've already experimented with some of the parameters I've found, and so far
I've discovered that disabling presolve and scaling improves our times a
little.
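
Concretely, what we disable amounts to something like this (a sketch
assuming the standard ClpSolve options interface; subsequent re-solves just
call dual() as above):

    #include "ClpSimplex.hpp"
    #include "ClpSolve.hpp"

    // Solve the initial LP with scaling and presolve turned off.
    void initialSolveNoPresolve(ClpSimplex &model)
    {
        model.scaling(0);                                // 0 = scaling off
        ClpSolve options;
        options.setPresolveType(ClpSolve::presolveOff);  // skip presolve
        options.setSolveType(ClpSolve::useDual);         // dual simplex
        model.initialSolve(options);
    }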

So, does anyone have ideas on how to get even better results? Are there
better ways of maintaining this ever-changing model? What other parameters
should I play with?

Or would another COIN-OR project be more suitable for our case? I recently
read about DyLP: would it be an interesting option here?


Thank you for your time! Any help will be much appreciated.

Regards,
--
Carlos E. Knippschild