C++ Reference: class LinearProgrammingConstraint

This documentation is automatically generated.

Methods
AddCutGenerator

Return type: void

Arguments: CutGenerator generator

Register a new cut generator with this constraint.

AddLinearConstraint

Return type: void

Arguments: const LinearConstraint& ct

Add a new linear constraint to this LP.

average_degeneracy

Return type: double

Average number of nonbasic variables with zero reduced costs.
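As a hedged illustration of what is being averaged here, the per-solve count of degenerate nonbasic variables could look like the following. The function name and the tolerance are hypothetical, not from the source; the real class averages over successive LP solves internally.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Counts nonbasic variables whose reduced cost is (near) zero, i.e. the
// degenerate ones; average_degeneracy() reports this averaged over solves.
int CountDegenerateNonbasic(const std::vector<double>& nonbasic_reduced_costs,
                            double tolerance = 1e-9) {
  int count = 0;
  for (const double rc : nonbasic_reduced_costs) {
    if (std::abs(rc) <= tolerance) ++count;
  }
  return count;
}
```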

DimensionString

Return type: std::string

GetSolutionReducedCost

Return type: double

Arguments: IntegerVariable variable

GetSolutionValue

Return type: double

Arguments: IntegerVariable variable

HasSolution

Return type: bool

Returns true if an LP solution is available. GetSolutionValue() and GetSolutionReducedCost() return the LP value and reduced cost of a variable in that solution; they should only be called when HasSolution() is true. Note that this solution is always an OPTIMAL solution of an LP above or at the current decision level. We "erase" it when we backtrack over it.

HeuristicLPMostInfeasibleBinary

Return type: std::function<LiteralIndex()>

Arguments: Model* model

Returns a LiteralIndex guided by the underlying LP constraints. This looks at all unassigned 0-1 variables, takes the one with a support value closest to 0.5, and tries to assign it to 1. If all 0-1 variables have an integer support, returns kNoLiteralIndex. Tie-breaking is done using the variable natural order. TODO(user): This fixes to 1, but for some problems fixing to 0 or to the std::round(support value) might work better. When this is the case, change behaviour automatically?
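The selection rule described above can be sketched in a self-contained way: pick the 0-1 variable whose LP support value is closest to 0.5, skip variables with integral support, and break ties by index order. This is a hedged sketch over a plain vector of LP values, not the real Model/IntegerVariable machinery; -1 stands in for kNoLiteralIndex.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Returns the index of the variable whose LP value is closest to 0.5,
// skipping (near-)integral values; returns -1 if every value is integral,
// mirroring the kNoLiteralIndex case. Ties keep the smaller index, which
// corresponds to "variable natural order" tie-breaking.
int PickMostInfeasibleBinary(const std::vector<double>& lp_values,
                             double tolerance = 1e-6) {
  int best = -1;
  double best_distance = 0.5;
  for (int i = 0; i < static_cast<int>(lp_values.size()); ++i) {
    const double nearest_int = std::floor(lp_values[i] + 0.5);
    if (std::abs(lp_values[i] - nearest_int) <= tolerance) continue;
    const double distance = std::abs(lp_values[i] - 0.5);
    if (best == -1 || distance < best_distance) {
      best = i;
      best_distance = distance;
    }
  }
  return best;
}
```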

HeuristicLPPseudoCostBinary

Return type: std::function<LiteralIndex()>

Arguments: Model* model

Returns a LiteralIndex guided by the underlying LP constraints. This computes the mean of the reduced costs over successive calls and tries to fix the variable with the highest reduced cost. Tie-breaking is done using the variable natural order. Only works for 0/1 variables. TODO(user): Try to get better pseudocosts than averaging every time the heuristic is called. MIP solvers initialize this with strong branching, then keep track of the pseudocosts during tree search. Also, this version only branches on var >= 1 and keeps track of reduced costs from var = 1 to var = 0. This works better than the conventional MIP rule, where the chosen variable would be argmax_var min(pseudocost_var(0->1), pseudocost_var(1->0)), probably because we are doing DFS search where MIP does BFS. This might depend on the model; more trials are necessary. We could also do exponential smoothing instead of decaying every N calls, i.e. pseudo = a * pseudo + (1 - a) * reduced.
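The exponential-smoothing alternative mentioned in the TODO is a one-line update. A minimal sketch, where the smoothing factor a = 0.9 is an arbitrary illustration rather than a value from the source:

```cpp
#include <cassert>
#include <cmath>

// Exponential smoothing of a pseudocost, as suggested in the TODO above:
// pseudo = a * pseudo + (1 - a) * reduced. A larger `a` weights the
// accumulated history more heavily than the latest observed reduced cost.
double UpdatePseudoCost(double pseudo, double reduced_cost, double a = 0.9) {
  return a * pseudo + (1.0 - a) * reduced_cost;
}
```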

IncrementalPropagate

Return type: bool

Arguments: const std::vector<int>& watch_indices

integer_variables

Return type: const std::vector<IntegerVariable>&

LinearProgrammingConstraint

Explicit constructor (no return type).

Arguments: Model* model

~LinearProgrammingConstraint

LPReducedCostAverageBranching

Return type: std::function<LiteralIndex()>

Returns a LiteralIndex guided by the underlying LP constraints. This computes the mean of reduced costs over successive calls, and tries to fix the variable which has the highest reduced cost. Tie-breaking is done using the variable natural order.

NumVariables

Return type: int

Propagate

Return type: bool

PropagatorInterface API.

RegisterWith

Return type: void

Arguments: Model* model

SetLevel

Return type: void

Arguments: int level

ReversibleInterface API.

SetMainObjectiveVariable

Return type: void

Arguments: IntegerVariable ivar

The main objective variable should be equal to the linear sum of the arguments passed to SetObjectiveCoefficient().

SetObjectiveCoefficient

Return type: void

Arguments: IntegerVariable ivar, IntegerValue coeff

Sets the coefficient of the given variable in the objective. Calling this twice for the same variable overwrites the previous value.

SolutionIsInteger

Return type: bool

SolutionObjectiveValue

Return type: double