# Branch and cut

Branch and cut is a method of combinatorial optimization for solving integer linear programs (ILPs), that is, linear programming (LP) problems where some or all of the unknowns are restricted to integer values. Branch and cut involves running a branch and bound algorithm and using cutting planes to tighten the linear programming relaxations. Note that if cuts are only used to tighten the initial LP relaxation, the algorithm is called cut and branch.

## Algorithm

This description assumes the ILP is a maximization problem.

The method solves the linear program without the integer constraint using the regular simplex algorithm. When an optimal solution is obtained and it has a non-integer value for a variable that is required to be integer, a cutting plane algorithm may be used to find further linear constraints which are satisfied by all feasible integer points but violated by the current fractional solution. These inequalities may be added to the linear program, so that resolving it will yield a different solution which is hopefully "less fractional".
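As a small illustration of such a cut (a hypothetical instance, not taken from a specific problem): suppose the relaxation contains the constraint

$$2x_{1}+2x_{2}\leq 7,\qquad x_{1},x_{2}\geq 0 \text{ integer.}$$

Dividing by 2 gives $x_{1}+x_{2}\leq 3.5$, and since the left-hand side is an integer at every feasible integer point, the rounded inequality $x_{1}+x_{2}\leq 3$ (a Chvátal–Gomory cut) is satisfied by all of them, yet it cuts off fractional LP solutions such as $x_{1}=x_{2}=1.75$.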

At this point, the branch and bound part of the algorithm is started. The problem is split into multiple (usually two) versions. The new linear programs are then solved using the simplex method and the process repeats. During the branch and bound process, non-integral solutions to LP relaxations serve as upper bounds and integral solutions serve as lower bounds. A node can be pruned if an upper bound is lower than an existing lower bound. Further, when solving the LP relaxations, additional cutting planes may be generated, which may be either global cuts, i.e., valid for all feasible integer solutions, or local cuts, meaning that they are satisfied by all solutions fulfilling the side constraints from the currently considered branch and bound subtree.
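The usual two-way split can be sketched as follows; `Bounds` and `branch_on` are illustrative names, not part of any library. Branching on a variable with fractional value $x_{i}'$ produces two children whose integer-feasible regions are disjoint but together cover that of the parent:

```cpp
#include <cmath>
#include <utility>

// Hypothetical sketch: branching on a variable whose LP-relaxation value is
// fractional (e.g. 2.6) yields two subproblems, one with x_i <= floor(2.6) = 2
// and one with x_i >= ceil(2.6) = 3. No integer point is lost in the split.
struct Bounds { double lower, upper; };

std::pair<Bounds, Bounds> branch_on(Bounds b, double fractional_value) {
    return { Bounds{b.lower, std::floor(fractional_value)},   // left child
             Bounds{std::ceil(fractional_value), b.upper} };  // right child
}
```

Each child is then solved (via its own LP relaxation) and may be pruned as soon as its relaxation value fails to beat the incumbent integral solution.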

The algorithm is summarized below.

1. Add the initial ILP to $L$, the list of active problems.
2. Set $x^{*}=\text{null}$ and $v^{*}=-\infty$.
3. While $L$ is not empty:
    1. Select and remove (de-queue) a problem from $L$.
    2. Solve the LP relaxation of the problem.
    3. If the solution is infeasible, go back to 3 (while). Otherwise denote the solution by $x$ with objective value $v$.
    4. If $v\leq v^{*}$, go back to 3.
    5. If $x$ is integer, set $v^{*}\leftarrow v$, $x^{*}\leftarrow x$ and go back to 3.
    6. If desired, search for cutting planes that are violated by $x$. If any are found, add them to the LP relaxation and return to 3.2.
    7. Branch to partition the problem into new problems with restricted feasible regions. Add these problems to $L$ and go back to 3.
4. Return $x^{*}$.

### Pseudocode

In C++-like pseudocode, this could be written:

```cpp
// ILP branch and cut solution pseudocode, assuming the objective is to be maximized
ILP_solution branch_and_cut_ILP(IntegerLinearProgram initial_problem) {
    queue active_list;                                    // L, above
    active_list.enqueue(initial_problem);                 // step 1
    // step 2
    ILP_solution optimal_solution;                        // this will hold x* above
    double best_objective = -std::numeric_limits<double>::infinity(); // will hold v* above
    while (!active_list.empty()) {                        // step 3 above
        IntegerLinearProgram curr_prob = active_list.dequeue(); // step 3.1
        bool cutting_planes_found;     // declared outside the loop so the do-while condition sees it
        do {                                              // steps 3.2-3.7
            cutting_planes_found = false;
            RelaxedLinearProgram relaxed_prob = LP_relax(curr_prob); // step 3.2
            LP_solution curr_relaxed_soln = LP_solve(relaxed_prob);  // this is x above
            if (!curr_relaxed_soln.is_feasible()) {       // step 3.3
                continue;                 // try another problem; continues at step 3
            }
            double current_objective_value = curr_relaxed_soln.value(); // v above
            if (current_objective_value <= best_objective) { // step 3.4
                continue;                 // try another problem; continues at step 3
            }
            if (curr_relaxed_soln.is_integer()) {         // step 3.5
                best_objective = current_objective_value;
                optimal_solution = cast_as_ILP_solution(curr_relaxed_soln);
                continue;                 // continues at step 3
            }
            // current relaxed solution isn't integral
            if (hunting_for_cutting_planes) {             // step 3.6
                auto violated_cutting_planes = search_for_violated_cutting_planes(curr_relaxed_soln);
                if (!violated_cutting_planes.empty()) {   // step 3.6
                    cutting_planes_found = true;          // will continue at step 3.2
                    for (auto&& cutting_plane : violated_cutting_planes) {
                        curr_prob.add_constraint(cutting_plane); // tighten the current problem
                    }
                    continue;             // continues at step 3.2: re-solve with the cuts added
                }
            }
            // step 3.7: either no violated cutting planes were found, or we weren't looking for them
            auto&& branched_problems = branch_partition(curr_prob);
            for (auto&& branch : branched_problems) {
                active_list.enqueue(branch);
            }
            continue;                     // continues at step 3
        } while (hunting_for_cutting_planes /* parameter of the algorithm; see 3.6 */
                 && cutting_planes_found);
        // end step 3.2 do-while loop
    }                                     // end step 3 while loop
    return optimal_solution;              // step 4
}
```


In the above pseudocode, the functions `LP_relax`, `LP_solve` and `branch_partition` called as subroutines must be provided as applicable to the problem. For example, `LP_solve` could call the simplex algorithm. Branching strategies for `branch_partition` are discussed below.
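To make the scheme concrete, here is a minimal, runnable instantiation for one special case, the 0/1 knapsack ILP (maximize $\sum v_{i}x_{i}$ subject to $\sum w_{i}x_{i}\leq W$, $x_{i}\in \{0,1\}$). This is a sketch, not the general algorithm: the cutting-plane step is omitted (so it is plain branch and bound), and `LP_solve` is replaced by the analytic greedy solution of the knapsack's LP relaxation. All names here (`Item`, `relaxation_bound`, `solve`, `best_knapsack_value`) are illustrative:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Item { double value, weight; };

// Value of the LP relaxation over items idx.. with the remaining capacity,
// assuming items are pre-sorted by value/weight in decreasing order; the
// relaxation fills the knapsack greedily, taking one item fractionally.
double relaxation_bound(const std::vector<Item>& items, std::size_t idx,
                        double capacity, double value_so_far) {
    for (std::size_t i = idx; i < items.size(); ++i) {
        if (items[i].weight <= capacity) {
            capacity -= items[i].weight;
            value_so_far += items[i].value;
        } else {  // take a fractional amount of this item and stop
            return value_so_far + items[i].value * (capacity / items[i].weight);
        }
    }
    return value_so_far;
}

// Depth-first branch and bound: branch on item idx (x_idx = 1 or x_idx = 0),
// pruning subtrees whose relaxation bound cannot beat the incumbent (step 3.4).
void solve(const std::vector<Item>& items, std::size_t idx,
           double capacity, double value_so_far, double& best) {
    if (value_so_far > best) best = value_so_far;  // integral incumbent (step 3.5)
    if (idx == items.size()) return;
    if (relaxation_bound(items, idx, capacity, value_so_far) <= best)
        return;                                    // prune this subtree
    if (items[idx].weight <= capacity)             // branch x_idx = 1
        solve(items, idx + 1, capacity - items[idx].weight,
              value_so_far + items[idx].value, best);
    solve(items, idx + 1, capacity, value_so_far, best); // branch x_idx = 0
}

double best_knapsack_value(std::vector<Item> items, double capacity) {
    // sort by value density so the greedy relaxation bound is valid
    std::sort(items.begin(), items.end(), [](const Item& a, const Item& b) {
        return a.value / a.weight > b.value / b.weight;
    });
    double best = 0.0;
    solve(items, 0, capacity, 0.0, best);
    return best;
}
```

On the classic instance with values 60, 100, 120, weights 10, 20, 30 and capacity 50, this returns 220 (taking the second and third items), while the root relaxation bound is 240.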

## Branching strategies

An important step in the branch and cut algorithm is the branching step. At this step, a variety of branching heuristics can be used. The branching strategies described below all involve what is called branching on a variable: choosing a variable, $x_{i}$, with a fractional value, $x_{i}'$, in the optimal solution to the current LP relaxation, and then adding the constraints $x_{i}\leq \left\lfloor x_{i}'\right\rfloor$ and $x_{i}\geq \left\lceil x_{i}'\right\rceil$ to the two resulting subproblems.

**Most infeasible branching**: This branching strategy chooses the variable with the fractional part closest to 0.5.

**Pseudo cost branching**: The basic idea of this strategy is to keep track, for each variable $x_{i}$, of the change in the objective function when that variable was previously chosen as the variable to branch on. The strategy then chooses the variable that is predicted to have the most effect on the objective function, based on past changes when it was chosen as the branching variable. Note that pseudo cost branching is initially uninformative in the search, since few variables have been branched on.

**Strong branching**: Strong branching involves testing which of the candidate variables gives the best improvement to the objective function before actually branching on them. Full strong branching tests all candidate variables and can be computationally expensive. The computational cost can be reduced by considering only a subset of the candidate variables and not solving each of the corresponding LP relaxations to completion.
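The simplest of these rules, most infeasible branching, can be sketched directly; `most_infeasible_variable` is an illustrative name, and the tolerance parameter is an assumption about how "integral" is decided in practice:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch of most infeasible branching: among the fractional
// variables of an LP-relaxation solution x, pick the one whose fractional
// part is closest to 0.5. Returns the chosen index, or x.size() if every
// variable is already integral to within the tolerance.
std::size_t most_infeasible_variable(const std::vector<double>& x,
                                     double tol = 1e-6) {
    std::size_t chosen = x.size();
    double best_score = 0.5;  // distance of the fractional part from 0.5
    for (std::size_t i = 0; i < x.size(); ++i) {
        double frac = x[i] - std::floor(x[i]);
        if (frac < tol || frac > 1.0 - tol) continue;  // effectively integral
        double score = std::fabs(frac - 0.5);
        if (score < best_score) {
            best_score = score;
            chosen = i;
        }
    }
    return chosen;
}
```

For example, on the solution $(2.0, 3.7, 1.5, 0.9)$ it selects the third variable, whose fractional part is exactly 0.5.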

There are also a large number of variations of these branching strategies, such as using strong branching early on when pseudo cost branching is relatively uninformative and then switching to pseudo cost branching later when there is enough branching history for pseudo cost to be informative.