[Osi] Question re. meaning of OsiSimplex::getReducedGradient

Matthew Saltzman mjs at clemson.edu
Fri Sep 3 22:59:21 EDT 2010


On Fri, 2010-09-03 at 22:54 -0400, Ted Ralphs wrote: 
> On Fri, Sep 3, 2010 at 10:05 PM, Matthew Saltzman <mjs at clemson.edu> wrote:
> > On Fri, 2010-09-03 at 14:29 -0400, Ted Ralphs wrote:
> >> Well, I think that "reduced gradient" is mixing two different
> >> terminologies. It seems to me that "reduced" really only makes sense as
> >> a modifier to "cost." The reduced cost (as I usually define it) _is_ a
> >> (sub)gradient of the value function of the linear program (with the
> >> variable bounds considered to be parametric). I'm not sure what a
> >> "reduced gradient" would be. It should really be just a "subgradient,"
> >> technically speaking.
> >
> > Actually, it's standard terminology from NLP.  It's a mapping of the
> > gradient onto the nullspace of the structural constraints.
> > Projection is one way to do it.  Another is to solve Ax = b for a set of
> > basic variables, then compute the gradient in the "reduced space" of
> > nonbasic variables by substituting out the basic variables in the
> > objective function.
> 
> Now that you say it, I do recall that. So the term "gradient" here is
> actually the gradient of the objective, and the reduced gradient is a
> generalization of the reduced cost to the nonlinear case; in the linear
> programming case it is precisely the same as the reduced cost.

Well, it's the full vector of reduced costs, but yes, essentially.
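
To make that concrete, here is a small sketch of the LP case (my own
illustration; the tiny problem and the variable names are made up, not
taken from Osi).  Partition Ax = b into basic columns B and nonbasic
columns N, solve B^T y = c_B for the duals y, and the reduced gradient on
the nonbasic variables is d_N = c_N - N^T y, i.e. the usual reduced costs:

    #include <cstdio>

    int main() {
        // Hypothetical problem: min c.x s.t. A x = b, x >= 0, with two rows
        // and four columns.  Columns 0 and 3 are basic, columns 1 and 2 are
        // nonbasic.  (Everything here is illustrative, not Osi API.)
        double c[4]    = {-3.0, -2.0, 0.0, 0.0};
        double A[2][4] = {{1.0, 1.0, 1.0, 0.0},
                          {2.0, 1.0, 0.0, 1.0}};
        int basic[2]    = {0, 3};
        int nonbasic[2] = {1, 2};

        // Basis matrix B = [A_0 A_3] = [[1,0],[2,1]] is lower triangular,
        // so B^T y = c_B solves by back substitution.
        double y[2];
        y[1] = c[basic[1]];                 // second row of B^T: y1 = c_B[1]
        y[0] = c[basic[0]] - 2.0 * y[1];    // first row: y0 + 2*y1 = c_B[0]

        // Reduced gradient (== reduced cost in the LP case): d_j = c_j - y.A_j
        for (int k = 0; k < 2; ++k) {
            int j = nonbasic[k];
            double dj = c[j] - (y[0] * A[0][j] + y[1] * A[1][j]);
            std::printf("reduced gradient of x%d = %g\n", j, dj);  // prints 1 and 3
        }
        return 0;
    }

If I remember the header correctly, the simplex interface's
getReducedGradient() hands back essentially these two arrays (the duals
and the column reduced costs) for a caller-supplied objective c, but check
OsiSolverInterface.hpp to be sure.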

> 
> Cheers,
> 
> Ted

-- 
                Matthew Saltzman

Clemson University Math Sciences
mjs AT clemson DOT edu
http://www.math.clemson.edu/~mjs
