[Ipopt] conditional gradient

Stefan Vigerske stefan at math.hu-berlin.de
Tue Nov 28 04:27:03 EST 2017


I don't think I can really help here, and you have probably found this 
already, but a good starting point is to look at the
AlgorithmBuilder::BuildBasicAlgorithm() function and the functions 
called from there. It assembles the algorithm that Ipopt 
executes, and that is where you should probably hook in your own extensions.
Overriding PDSystemSolver does not sound like a good approach.


On 11/27/2017 04:37 PM, Nicolas Essis-Breton wrote:
> Hi,
> I would like to experiment with conditional gradient in Ipopt.
> The normal Ipopt flow is:
> query user Hessian -> find search direction -> filter line search
> I would like to do:
> query user Hessian -> find search direction -> let user modify search
> direction -> filter line search
> I can overload IpoptAlgorithm::ComputeSearchDirection, but is this enough?
> For example, for a second-order correction,
> FilterLSAcceptor::TrySecondOrderCorrection recalculates the search
> direction.
> Would it be better to overload PDSystemSolver::Solve?
> The idea of conditional gradient is to modify any computed search
> direction before it is actually used by the optimization algorithm.
> This is sometimes called the 'projected gradient' or 'Frank-Wolfe' method.
> To be sure I have overloaded all the necessary parts, any help would be
> greatly appreciated.
> Thanks,
> Nicolas
> _______________________________________________
> Ipopt mailing list
> Ipopt at list.coin-or.org
> https://list.coin-or.org/mailman/listinfo/ipopt
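The direction-modification step Nicolas describes can be sketched independently of Ipopt's internals. The snippet below is a minimal, hypothetical illustration (plain Python, not Ipopt API) of a conditional-gradient (Frank-Wolfe) step for box constraints: the linear minimization oracle min over s in [lower, upper] of grad . s is solved coordinate-wise, and the resulting direction d = s - x replaces whatever direction the solver computed. The function name and interface are assumptions for illustration only.

```python
def fw_direction(x, grad, lower, upper):
    """Conditional-gradient (Frank-Wolfe) direction for box constraints.

    Solves the linear minimization oracle
        s = argmin_{lower <= s <= upper}  grad . s
    coordinate-wise (pick the lower bound where the gradient component is
    positive, the upper bound otherwise), then returns d = s - x, which is
    a feasible direction: x + gamma*d stays in the box for gamma in [0, 1].
    """
    s = [lo if g > 0 else up for g, lo, up in zip(grad, lower, upper)]
    return [si - xi for si, xi in zip(s, x)]
```

For example, with x = [0.5, 0.5], grad = [1.0, -2.0] and bounds [0, 1] per coordinate, the oracle picks s = [0.0, 1.0] and the modified direction is d = [-0.5, 0.5]. In the flow sketched in the question, this computation would sit between "find search direction" and "filter line search".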
