[Dip] DipPy Questions

Romain Montagné r.montagne at hotmail.fr
Sun Jul 26 15:18:22 EDT 2015


Hello Ted,

Thank you for the clarity and detail of your answer; it is very much appreciated.

Concerning my questions on the branching aspects, I would have liked to branch as follows: given two vertices u and v of a graph, I would like to impose on one branch the constraint that u and v are in the same color class, and on the other branch the constraint that they are not. I do not think this is possible with only two branches.

I have also seen papers in which one branches on a sum of master problem variables, which is why I was wondering whether it was possible to access these variables in the branching methods, but I understand this is difficult to implement.

Concerning the "ConvexDual" parameter renamed to "target", your explanations are very clear. I am not using the latest version of Dip, so I cannot try it at the moment, but I will look into it.

Thanks again, and I will be happy to send you a copy of my work with Dippy once it is properly finished.

Cheers,

Romain



On 21 Jul 2015, at 12:11, Ted Ralphs wrote:

> 
> 
> On Sun, Jul 12, 2015 at 8:09 PM, Romain Montagné <r.montagne at hotmail.fr> wrote:
> Hello Dippy Community,
> 
> I am a PhD candidate and am currently working with Dippy. I am very impressed by both its performance and by how user-friendly it is. It is a pleasure!
> 
> However, some points remain mysterious:
> 
> 1. Concerning the branch_method option:
> 
> 	- Is it possible to create more than two branches at a given node? 
> 
> At the moment, this is not possible, but it is something I would also very much like to have. The underlying branch-and-bound framework is capable of it; we just need to make some modifications to DIP itself. I'm not sure how difficult this will be, but I would guess it is of medium difficulty.
> 
> 	- When the doPriceCut option is on, what is the default branching strategy that is used? 
> 
> It is the same branching strategy as in branch and cut: it branches on variables in the original compact formulation. At the moment, it selects the branching variable fairly naively, using a most-fractional selection rule. Changing the branching strategy to something like pseudocost branching or reliability branching should not be too difficult. It's also on the TODO list.
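The most-fractional rule mentioned above can be sketched in a few lines of plain Python. The dictionary of LP relaxation values is illustrative data, not DipPy's actual interface:

```python
def most_fractional(relaxed_values, tol=1e-6):
    """Pick the variable whose LP value is farthest from any integer
    (i.e., closest to 0.5 mod 1); return None if already integral."""
    best_var, best_frac = None, tol
    for var, val in relaxed_values.items():
        frac = abs(val - round(val))  # distance to the nearest integer
        if frac > best_frac:
            best_var, best_frac = var, frac
    return best_var

# Example: x3 is closest to 0.5, so it is selected for branching.
print(most_fractional({"x1": 0.0, "x2": 0.7, "x3": 0.45}))  # x3
```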
>  
> 	- If the doPriceCut option is on, is it possible to branch on the variables of the master problem? As these variables are automatically generated by Dippy and not defined by the user, how can one call them?
> 
> This is difficult to allow in general, since fixing a single master variable to zero means you have to disallow the generation of the corresponding solution when solving the subproblem (I guess you are aware of this). This could be done in certain special cases by adding constraints to the subproblem, but that may destroy the structure and would be difficult to do in a generic way (I guess you are aware of this, too). What is it that you want to do, exactly?
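For a binary subproblem, one standard form of the constraint alluded to above is a "no-good" cut: for a forbidden 0-1 solution x*, the inequality sum over i with x*_i = 1 of (1 - x_i), plus sum over i with x*_i = 0 of x_i, must be at least 1. It is violated by x* and by no other 0-1 point. The sketch below (plain Python, not DIP's API) just checks the cut's logic:

```python
def satisfies_no_good(forbidden, candidate):
    """True iff `candidate` (a 0-1 vector) satisfies the no-good cut
    that excludes the 0-1 vector `forbidden`."""
    # The left-hand side counts positions where candidate differs
    # from the forbidden solution; the cut requires at least one.
    lhs = sum((1 - c) if f == 1 else c
              for f, c in zip(forbidden, candidate))
    return lhs >= 1

# The forbidden solution itself is cut off; any other point survives.
print(satisfies_no_good([1, 0, 1], [1, 0, 1]))  # False
print(satisfies_no_good([1, 0, 1], [1, 1, 1]))  # True
```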
>  
> 	- If the doPriceCut option is on, is it possible to branch by adding constraints to the subproblem instead of returning sets of variables?
> 
> Not at the moment. This would probably be difficult to add. If you give details of what you want to do, we can see what's possible.
>  
> 	- If the doPriceCut option is on, is it possible to add customized cuts for the master problem?
> 
> Not at the moment. This would also be pretty difficult to support. The whole design of DIP is premised on having an original compact formulation that corresponds to the current Dantzig-Wolfe reformulation. Adding cuts in the master would destroy this mapping. I have considered adding a mode in DIP where there is no original compact formulation and the algorithm just does standard column generation, in which case what you are asking for (both with respect to branching and adding cuts) might be possible, but this would probably require significant effort.
>  
> 	- If the doPriceCut is on, are customized cuts added by the generate_cuts method taken into account in the subproblem?
> 
> No, they are added to the original compact formulation and to the Dantzig-Wolfe reformulation, i.e., they are treated as complicating constraints. Allowing globally valid cuts to be added to the subproblem is a possibility. It doesn't seem like that would be terribly difficult. Locally valid cuts for the subproblem would probably be more difficult. 
>  
> 2. Concerning the relaxed_solver option: 
> 
> 	- In the documentation available online, examples show that the variable convexDual is an input parameter of the method. However, in the examples of the Dippy distribution, there is no such parameter. I have found links on the web showing that the method was updated and that it is no longer a parameter. So how do we access this variable? Is it hidden somewhere in redCosts?
> 
> Yes, the documentation is a little out of date. When solving the subproblem to optimality, it is not really necessary for the user to know the value of convexDual: as long as an optimal solution is returned, DIP calculates the reduced cost and decides internally whether the variable should be added. The user only has to return the optimal solution to the subproblem without considering the value of convexDual. Therefore, to simplify the interface (since most users do not know what "convexDual" means), I deleted this argument in version 0.9.
> 
> However, I realized that if the user wants to use a heuristic to solve the subproblem, this is problematic. In this case, the user should probably know what the target value of the objective is in order to know whether the heuristic is generating useful columns or not. Therefore, in version 0.91, I added the argument back but called it "target," which I thought would be less confusing. Of course, if you know what "convexDual" is, it may be more confusing :). Does this make sense? What version of Dippy are you using? If you are using the most recent release, you should have access to this extra argument again.
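Under that reading of "target" (the dual value of the convexity constraint, as described above), the pricing test a heuristic would apply looks roughly like the sketch below. The names and data layout are illustrative, not DipPy's actual objects:

```python
def column_improves(column, red_costs, target, tol=1e-9):
    """A proposed column (dict: original variable -> value) is worth
    adding iff its full reduced cost,
        sum(red_costs[v] * x_v) - target,
    is negative, i.e. the column prices out below the target."""
    rc = sum(red_costs[v] * x for v, x in column.items()) - target
    return rc < -tol

# With target = 3.0, a column with reduced objective value 2.0 improves ...
print(column_improves({"x1": 1.0}, {"x1": 2.0}, target=3.0))  # True
# ... while one with reduced objective value 4.0 does not.
print(column_improves({"x1": 2.0}, {"x1": 2.0}, target=3.0))  # False
```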
>  
> 	- In the documentation available online, the example proposed is a dynamic programming algorithm for the subproblem, which is optimal. Does the method implemented in relaxed_solver have to be optimal, or can it be a heuristic (meaning, does DipPy check whether it is optimal or not)?
> 
> This is something we have recently addressed. In the current release version, the user may supply a heuristic solution but there is no way for the user to indicate that this is the case. Before concluding there are no columns with negative reduced costs, DIP therefore does a final check by solving the subproblem as a generic MILP. In the trunk version, we have now added some additional return parameters so the user can indicate the status of the subproblem solve (whether the solution is optimal or not and in the case that no solution is returned, whether this is because no solution of negative reduced cost exists or it is just that the user's heuristic did not find such a solution). This means you can use a quick heuristic until that fails and then let DIP solve the subproblem to optimality as a generic MIP once the heuristic fails to find a solution with negative reduced cost. On the other hand, you can solve the subproblem to optimality yourself and avoid DIP's internal check.
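The workflow described above can be summarized as a two-phase pricing loop. The two solver callables below are hypothetical placeholders for a user heuristic and an exact MILP solve, each returning a (column, reduced_cost) pair or None:

```python
def price(heuristic_solve, exact_solve, tol=1e-9):
    """Try the cheap heuristic first; only if it finds no improving
    column, fall back to the exact solve, whose failure certifies
    that no column with negative reduced cost exists."""
    result = heuristic_solve()
    if result is not None and result[1] < -tol:
        return result, "heuristic"  # cheap improving column found
    result = exact_solve()          # exact solve certifies optimality
    if result is not None and result[1] < -tol:
        return result, "exact"
    return None, "optimal"          # pricing done: stop generating columns

# The heuristic fails here, so the exact solver's column is used.
col, how = price(lambda: None, lambda: (["x1"], -0.5))
print(how)  # exact
```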
>  
> Thank you very much for your help.
> 
> No problem! Please let me know how I can help.
> 
> Cheers,
> 
> Ted 
> -- 
> Dr. Ted Ralphs
> Professor, Lehigh University
> (610) 628-1280
> ted 'at' lehigh 'dot' edu
> coral.ie.lehigh.edu/~ted
