# C++ Reference: class LinearProgrammingConstraint

This documentation is automatically generated.

Method | Description
---|---
`AddCutGenerator` | Registers a new cut generator with this constraint.
`AddLinearConstraint` | Adds a new linear constraint to this LP.
`average_degeneracy` | Average number of nonbasic variables with zero reduced costs.
`DimensionString` |
`GetSolutionReducedCost` | Returns the reduced cost of a variable in the current solution. Should only be called when `HasSolution()` is true.
`GetSolutionValue` | Returns the LP value of a variable in the current solution. Should only be called when `HasSolution()` is true.
`HasSolution` | Returns true when an LP solution is available. Note that this solution is always an OPTIMAL solution of an LP above or at the current decision level; we "erase" it when we backtrack over it.
`HeuristicLPMostInfeasibleBinary` | Returns a LiteralIndex guided by the underlying LP constraints. This looks at all unassigned 0-1 variables, takes the one whose support value is closest to 0.5, and tries to assign it to 1. If all 0-1 variables have an integer support, returns kNoLiteralIndex. Ties are broken using the variables' natural order. TODO(user): This fixes to 1, but for some problems fixing to 0 or to std::round(support value) might work better; when this is the case, change the behavior automatically?
`HeuristicLPPseudoCostBinary` | Returns a LiteralIndex guided by the underlying LP constraints. This computes the mean of reduced costs over successive calls and tries to fix the variable with the highest reduced cost. Ties are broken using the variables' natural order. Only works for 0-1 variables. TODO(user): Try to get better pseudocosts than averaging every time the heuristic is called. MIP solvers initialize this with strong branching, then keep track of the pseudocosts during tree search. Also, this version only branches on var >= 1 and keeps track of reduced costs from var = 1 to var = 0. This works better than the conventional MIP rule, where the chosen variable is argmax_var min(pseudocost_var(0->1), pseudocost_var(1->0)), probably because we are doing DFS search where MIP does BFS. This might depend on the model; more trials are necessary. We could also do exponential smoothing instead of decaying every N calls, i.e. pseudo = a * pseudo + (1 - a) * reduced.
`IncrementalPropagate` |
`integer_variables` |
`LinearProgrammingConstraint` |
`~LinearProgrammingConstraint` |
`LPReducedCostAverageBranching` | Returns a LiteralIndex guided by the underlying LP constraints. This computes the mean of reduced costs over successive calls and tries to fix the variable with the highest reduced cost. Ties are broken using the variables' natural order.
`NumVariables` |
`Propagate` | PropagatorInterface API.
`RegisterWith` |
`SetLevel` | ReversibleInterface API.
`SetMainObjectiveVariable` | The main objective variable should be equal to the linear sum of the terms passed to SetObjectiveCoefficient().
`SetObjectiveCoefficient` | Sets the coefficient of the variable in the objective. Calling it twice overwrites the previous value.
`SolutionIsInteger` |
`SolutionObjectiveValue` |
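The selection rule behind `HeuristicLPMostInfeasibleBinary` can be illustrated outside the class. The sketch below is a minimal standalone approximation, not OR-Tools code: `PickMostInfeasible` is a hypothetical helper, `-1` stands in for `kNoLiteralIndex`, and the integrality tolerance is an assumed value.

```cpp
#include <cassert>
#include <cmath>
#include <limits>
#include <vector>

// Hypothetical sketch of the "most infeasible" rule: among 0-1 variables
// with a fractional LP support value, pick the one closest to 0.5.
// Returns -1 (standing in for kNoLiteralIndex) if every value is integral.
int PickMostInfeasible(const std::vector<double>& lp_values) {
  const double kTolerance = 1e-6;  // assumed integrality tolerance
  int best = -1;
  double best_distance = std::numeric_limits<double>::infinity();
  for (int i = 0; i < static_cast<int>(lp_values.size()); ++i) {
    const double v = lp_values[i];
    // Skip variables whose support is already (numerically) integral.
    if (std::abs(v - std::round(v)) < kTolerance) continue;
    const double distance = std::abs(v - 0.5);
    // Strict '<' keeps the first best index, i.e. ties are broken
    // by the variables' natural order.
    if (distance < best_distance) {
      best = i;
      best_distance = distance;
    }
  }
  return best;
}
```

The real method then tries to assign the chosen variable to 1, which is exactly the behavior the TODO in the table questions.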
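The `HeuristicLPPseudoCostBinary` TODO suggests exponential smoothing, `pseudo = a * pseudo + (1 - a) * reduced`, instead of periodic decay. A minimal sketch of that idea, under the assumption of a hypothetical `PseudoCostTracker` class (not part of the OR-Tools API):

```cpp
#include <cassert>
#include <vector>

// Hypothetical sketch: keep an exponentially smoothed reduced-cost
// estimate per variable and branch on the largest one.
class PseudoCostTracker {
 public:
  PseudoCostTracker(int num_vars, double decay)
      : costs_(num_vars, 0.0), decay_(decay) {}

  // Blend the newly observed reduced cost into the running estimate:
  // pseudo = a * pseudo + (1 - a) * reduced.
  void Update(int var, double reduced_cost) {
    costs_[var] = decay_ * costs_[var] + (1.0 - decay_) * reduced_cost;
  }

  // Returns the variable with the highest smoothed reduced cost.
  // Strict '>' keeps the first best index, i.e. ties are broken
  // by the variables' natural order.
  int BestVariable() const {
    int best = 0;
    for (int i = 1; i < static_cast<int>(costs_.size()); ++i) {
      if (costs_[i] > costs_[best]) best = i;
    }
    return best;
  }

  double cost(int var) const { return costs_[var]; }

 private:
  std::vector<double> costs_;
  double decay_;
};
```

Compared with averaging over all calls, smoothing weights recent observations more heavily, which may track the changing LP basis better during DFS; as the TODO notes, more trials would be needed to confirm this.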
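The contract between `SetMainObjectiveVariable` and `SetObjectiveCoefficient` (the objective variable equals the linear sum of the registered terms, and setting a coefficient twice overwrites it) can be sketched with a hypothetical stand-in class; `ObjectiveSketch` below is illustrative only and uses plain `int` variable ids instead of `IntegerVariable`:

```cpp
#include <cassert>
#include <unordered_map>

// Hypothetical sketch of the objective contract: coefficients are stored
// per variable, a second call for the same variable overwrites the first,
// and the main objective variable must equal sum(coeff[v] * value[v]).
class ObjectiveSketch {
 public:
  // Overwrites any previously set coefficient for this variable.
  void SetCoefficient(int var, double coeff) { coeffs_[var] = coeff; }

  // The value the main objective variable should take for a given
  // assignment of variable values.
  double Evaluate(const std::unordered_map<int, double>& values) const {
    double sum = 0.0;
    for (const auto& [var, coeff] : coeffs_) sum += coeff * values.at(var);
    return sum;
  }

 private:
  std::unordered_map<int, double> coeffs_;
};
```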