[Osi] Question re. meaning of OsiSimplex::getReducedGradient
Ted Ralphs
ted at lehigh.edu
Fri Sep 3 22:54:54 EDT 2010
On Fri, Sep 3, 2010 at 10:05 PM, Matthew Saltzman <mjs at clemson.edu> wrote:
> On Fri, 2010-09-03 at 14:29 -0400, Ted Ralphs wrote:
>> Well, I think that "reduced gradient" is mixing two different
>> terminologies. It seems to me that "reduced" really only makes sense as
>> a modifier to "cost." The reduced cost (as I usually define it) _is_ a
>> (sub)gradient of the value function of the linear program (with the
>> variable bounds considered to be parametric). I'm not sure what a
>> "reduced gradient" would be. It should really be just a "subgradient,"
>> technically speaking.
>
> Actually, it's standard terminology from NLP. It's a mapping of the
> gradient onto the nullspace of the structural constraints.
> Projection is one way to do it. Another is to solve Ax = b for a set of
> basic variables, then compute the gradient in the "reduced space" of
> nonbasic variables by substituting out the basic variables in the
> objective function.
Now that you say it, I do recall that. So the "gradient" here is the
gradient of the objective, and the reduced gradient is a generalization
of the reduced cost to the nonlinear case; in the case of linear
programming it is precisely the same as the reduced cost.
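To make the equivalence concrete: for a linear objective, the reduced
gradient with respect to the nonbasic variables (after substituting out
the basic variables, as Matthew describes) is exactly the reduced-cost
vector d_N^T = c_N^T - c_B^T B^{-1} N. A minimal, dependency-free sketch
on a made-up two-constraint LP (all the data below is hypothetical,
chosen so that the basis {x1, x2} happens to be optimal):

```python
# Hypothetical LP, for illustration only:
#   max 3*x1 + 2*x2  s.t.  x1 + x2 <= 4,  2*x1 + x2 <= 6,  x >= 0
# Standard form (minimization, slacks s1 and s2 appended):
c = [-3.0, -2.0, 0.0, 0.0]          # objective coefficients
A = [[1.0, 1.0, 1.0, 0.0],          # columns: x1, x2, s1, s2
     [2.0, 1.0, 0.0, 1.0]]

basic = [0, 1]                      # basis {x1, x2}
nonbasic = [2, 3]                   # the two slacks

# 2x2 basis inverse via Cramer's rule (kept dependency-free on purpose)
B = [[A[i][j] for j in basic] for i in range(2)]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
Binv = [[ B[1][1] / det, -B[0][1] / det],
        [-B[1][0] / det,  B[0][0] / det]]

# Simplex multipliers y^T = c_B^T B^{-1}
cB = [c[j] for j in basic]
y = [cB[0] * Binv[0][k] + cB[1] * Binv[1][k] for k in range(2)]

# Reduced cost of each nonbasic column j: d_j = c_j - y^T a_j
reduced = {j: c[j] - (y[0] * A[0][j] + y[1] * A[1][j]) for j in nonbasic}
print(reduced)   # both entries are 1.0 >= 0, so this basis is optimal
```

The same d_j values are what you would read off the nonbasic entries of
the vector an Osi solver reports as reduced costs; for a nonlinear
objective the c_j would be replaced by the entries of the current
gradient, giving the reduced gradient.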
Cheers,
Ted
--
Dr. Ted Ralphs
Associate Professor, Lehigh University
(610) 628-1280
ted 'at' lehigh 'dot' edu
coral.ie.lehigh.edu/~ted