ON A NEW CONSTRAINT REDUCTION HEURISTIC USING IMPROVED BISECTION METHOD FOR MIXED INTEGER LINEAR PROGRAMMING

In this study, we develop a surrogate relaxation-based procedure to reduce the size of mixed-integer linear programming (MILP) problems. The technique starts with one surrogate constraint, a non-negative linear combination of multiple constraints of the problem. At this initial step, we calculate the optimal Lagrangian multipliers from the LP relaxation of the problem and use them as initial surrogate multipliers. We incorporate the improved bisection method (IBM) (B. Gavish, F. Glover, and H. Pirkul, Surrogate Constraints in Integer Programming, J. Inform. Optim. Sci. 12(2) (1991), 219–228) into our algorithm. This simple heuristic iteratively generates a new surrogate cut that is guaranteed to satisfy the two most violated constraints of the corresponding iteration. The performance of the heuristic is tested on both problems from the OR libraries and randomly generated ones.


Introduction
The objective here is to reduce the number of constraints of MILP-type problems. In surrogate relaxation, a subset of constraints is replaced by a linear combination of those constraints; that is, we multiply the constraints by non-negative multipliers and aggregate them into one combined constraint. With the optimal surrogate multipliers, this combined constraint serves as a proxy for the others and captures useful information. In this way, we may obtain infeasible but near-optimal solutions while reducing processing time and complexity relative to the original model. Besides, the use of surrogate constraints is known to provide better bounds than Lagrangian bounds.
Int. J. Anal. Appl. 19 (1) (2021) 66
In the proposed heuristic, as initial surrogate multipliers we simply use the optimal dual values (Lagrangian multipliers) of the LP relaxation to obtain the strongest initial surrogate constraint. We incorporate the IBM [12], an updated version of the bisection algorithm [13], into our heuristic. At each iteration, the procedure evaluates two violated constraints at a time and uses the IBM to find the optimal surrogate multipliers while the upper (lower) bound decreases (increases). Furthermore, we generate not one but several surrogate constraints, and we relax only the inequality constraints. Thus, we solve a sequence of computationally easy relaxed and reduced problems to reach the optimal solution in a small amount of time.
Our proposed heuristic also enables us to examine what proportion of the problem constraints is redundant and which ones are critical. The heuristic is implemented in the MATLAB environment. To test the algorithm, we use problems available for download from Beasley's OR Library [4] and the Mixed Integer Programming Library [23]. We also conduct a set of tests using randomly generated instances. All computational experiments are run on a PC with an Intel Core i5-7400 CPU (3.00 GHz) and 4 GB of RAM.
Consider the following MILP problem:

(P)    z = min { cx : Ax ≥ b, x ∈ X },

where c, x ∈ R^n, b ∈ R^m, A ∈ R^{m×n}, and X is a discrete set that may be defined by some linear equalities and bound constraints on the decision variables.
By moving the constraints into the objective function, we generate the Lagrangian relaxation:

L(λ) = min { cx + λ(b − Ax) : x ∈ X },

where the Lagrange multiplier vector λ ≥ 0.
Analogously, by assembling multiple constraints into a single new surrogate constraint, we generate the surrogate relaxation:

S(µ) = min { cx : µAx ≥ µb, x ∈ X },

where the surrogate multiplier vector µ ≥ 0.
Both techniques enlarge the feasible region and provide a lower bound on the optimal objective value of Problem (P). However, the surrogate lower bound is tighter than the Lagrangian lower bound [14,17].
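The aggregation step is easy to state in code. Below is a minimal sketch for a ≤-form system Ax ≤ b (the form of the knapsack-type test problems used later); the matrix, right-hand side, and multipliers are illustrative values, not data from the paper. Any point feasible for all original constraints remains feasible for the surrogate, which is what makes the surrogate a relaxation.

```python
import numpy as np

def surrogate_constraint(A, b, mu):
    """Aggregate the rows of A x <= b into the single surrogate
    constraint (mu @ A) x <= mu @ b with non-negative multipliers."""
    mu = np.asarray(mu, dtype=float)
    assert (mu >= 0).all(), "surrogate multipliers must be non-negative"
    return mu @ A, mu @ b

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([4.0, 6.0])
s_row, s_rhs = surrogate_constraint(A, b, mu=[1.0, 0.5])

x = np.array([1.0, 1.0])   # feasible for both rows: Ax = [3, 4] <= [4, 6]
assert s_row @ x <= s_rhs  # hence also feasible for the surrogate row
```

The converse fails in general: points cut off by an individual constraint may still satisfy the surrogate, which is exactly why the relaxation enlarges the feasible region.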
Relaxation-based search and dual algorithms, both exact and heuristic, have been used extensively to find bounds for integer programming (IP). Let us now survey the related literature on surrogate relaxation-based heuristics for combinatorial optimization problems. In [15], the author proposed a class of surrogate constraint heuristics that provide a variety of supplementary alternatives and independent solution strategies. Lorena and Narciso [31,39] proposed six heuristics based on both surrogate and Lagrangian relaxations and a subgradient search algorithm for large-scale generalized assignment problems.
The authors of [30] showed that procedures based on surrogate constraint analysis are effective for the satisfiability problem. Applications to a classical combinatorial optimization problem, the set covering problem, were given in [1,11,43]. For a review of approaches that combine metaheuristics with exact IP techniques, see [42].
In [27], a critical event tabu search heuristic was presented to solve multidimensional knapsack problems (MKPs) with generalized upper bound constraints. Osorio et al. [41] considered a combined cutting and surrogate constraint analysis strategy for MKPs; see also [19]. The authors of [21] introduced problem-size reduction heuristics for MKPs. For heuristics and metaheuristics for MKPs and their variants, see the recent works [2,5,6,8,18,20,24,25,28,29,33,35,47–50] and references therein. For graph theory applications, such as graph coloring, weighted maximum clique, and shortest path problems, see [9,16,37,38]. Choi and Choi [7] proposed a redundancy identification method based on surrogate constraints. The relaxation adaptive memory programming approach based on surrogate constraints was proposed for combinatorial optimization problems in [44] and for capacitated minimum spanning tree problems in [45]. For the scatter search method, conceived as an extension of surrogate constraint relaxation, see [34].
The rest of this paper is organized as follows. Section 2 gives a brief description of the IBM, and the proposed heuristic is introduced in Section 3. Section 4 describes computational experiments. Section 5 concludes the paper.

Computing the optimal surrogate multipliers by IBM
Now, we consider problems with only two constraints. We start by determining which constraint is tighter, namely, which one produces a lower (greater) objective value for a maximization (minimization) problem when only that constraint is considered and the other is ignored. Renumber the tighter constraint as Constraint 1 and the other as Constraint 2. The surrogate multiplier of Constraint 1 is constant and equal to 1; the other multiplier, µ, is the one to be updated. The algorithm can be summarized as follows:

Step 1. Choose initial values µ_L and µ_H such that the solution of the problem with multipliers (1, µ_L) satisfies Constraint 1 and the solution with multipliers (1, µ_H) does not.
Step 2. Let µ = (µ_L + µ_H)/2. Solve the problem with the multipliers (1, µ). If
(i) both constraints are satisfied, stop; (1, µ) are the optimal multipliers, and the optimal solution of the problem has been obtained.
(ii) only Constraint 1 is satisfied, let µ_L = f/g, where f is the amount of oversatisfaction of Constraint 1 and g is the amount of undersatisfaction of Constraint 2.
(iii) only Constraint 2 is satisfied, let µ_H = f/g, where f is the amount of undersatisfaction of Constraint 1 and g is the amount of oversatisfaction of Constraint 2.
Step 3. If µ_L < µ_H, go to Step 2; otherwise, stop, and (1, µ) are the optimal multipliers. Note that we can also stop if the objective function does not improve or if the number of iterations has reached an upper limit.
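The steps above can be sketched in code. This is a simplified illustration on an invented three-variable binary maximization toy: the surrogate subproblem is solved by brute force, and the update in Step 2 uses plain midpoint bisection rather than the IBM's f/g rule.

```python
import itertools
import numpy as np

def solve_surrogate(c, A, b, mu):
    """Maximize c @ x over binary x subject to the single surrogate
    constraint (mu @ A) x <= mu @ b (brute force; toy sizes only)."""
    row, rhs = mu @ A, mu @ b
    best = None
    for bits in itertools.product([0, 1], repeat=len(c)):
        x = np.array(bits)
        if row @ x <= rhs and (best is None or c @ x > c @ best):
            best = x
    return best

def bisection_multipliers(c, A, b, mu_lo=0.0, mu_hi=2.0, tol=1e-6, max_iter=10):
    """Search for mu so that the optimum of the surrogate problem with
    multipliers (1, mu) satisfies both original constraints."""
    for _ in range(max_iter):
        mu = 0.5 * (mu_lo + mu_hi)
        x = solve_surrogate(c, A, b, np.array([1.0, mu]))
        v = A @ x - b                  # positive entries = violations
        if (v <= tol).all():
            return mu, x               # both constraints satisfied: stop
        if v[1] > tol:                 # Constraint 2 violated:
            mu_lo = mu                 #   put more weight on it
        else:                          # Constraint 1 violated:
            mu_hi = mu
    return mu, x

c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0], [4.0, 1.0, 2.0]])
b = np.array([3.0, 4.0])
mu, x = bisection_multipliers(c, A, b)
print(mu, x, c @ x)                    # prints: 0.5 [1 0 0] 5.0
```

Here the returned point is feasible for both original rows, so the surrogate bound coincides with the optimal value 5.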

Constraint reduction heuristic
To determine the surrogate multipliers of the two most violated constraints at the current step, we apply the following iterative process, which finds appropriate surrogate constraints.
Initial solution: We start by calculating the optimal Lagrangian multipliers from the LP relaxation of the problem and use them as initial surrogate multipliers. The resulting aggregate constraint is our first surrogate constraint. Then, we solve our problem with this constraint and the integrality restrictions.
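The paper's implementation is in MATLAB; the following is a hypothetical Python/SciPy sketch of this initialization step on a small invented instance. `linprog` minimizes, so the objective is negated, and the signs of the reported marginals are flipped to recover non-negative duals of the ≤ rows.

```python
import numpy as np
from scipy.optimize import linprog

# LP relaxation of a toy instance:  max 5x1 + 4x2 + 3x3
# s.t. 2x1 + 3x2 + x3 <= 3,  4x1 + x2 + 2x3 <= 4,  0 <= x <= 1
c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0], [4.0, 1.0, 2.0]])
b = np.array([3.0, 4.0])

res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, 1)] * 3, method="highs")
lam = -res.ineqlin.marginals   # non-negative duals of the <= rows

# lam weights the rows of A into the initial surrogate constraint:
s_row, s_rhs = lam @ A, lam @ b
print(np.round(lam, 4))        # prints: [1.1 0.7]
```

For this instance the LP optimum is x = (0.4, 0.4, 1) with value 6.6, and the unique duals (1.1, 0.7) give the initial surrogate multipliers.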
Adding a new surrogate constraint (generating a cut): We first determine the two most violated constraints; denote them by g_t(x) ≤ 0 and g_k(x) ≤ 0. Add a new surrogate constraint µ_t g_t(x) + µ_k g_k(x) ≤ 0 with µ_t = µ_k = 1, and solve the problem with this new surrogate constraint added to the prior surrogate constraint(s). If g_t(x) and g_k(x) are non-positive while µ_t = µ_k = 1, add another surrogate constraint. Otherwise, apply the IBM, taking the constraint with positive value as Constraint 1. If both constraints are then satisfied, add a new surrogate constraint. Repeat these steps until all constraints are satisfied (within a tolerance value). Pseudo-code is given in Algorithm 3.1.
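A compact sketch of this cut loop on the same style of toy instance (invented data, brute-force subproblem solver). For brevity the pair multipliers stay fixed at µ_t = µ_k = 1 and the IBM refinement from Section 2 is omitted; the starting multipliers (1.1, 0.7) are the LP-relaxation duals of this toy instance.

```python
import itertools
import numpy as np

def solve_with_rows(c, rows, rhs):
    """Maximize c @ x over binary x s.t. rows @ x <= rhs (brute force)."""
    best = None
    for bits in itertools.product([0, 1], repeat=len(c)):
        x = np.array(bits)
        if (rows @ x <= rhs).all() and (best is None or c @ x > c @ best):
            best = x
    return best

def reduce_constraints(c, A, b, lam, tol=1e-6, max_cuts=20):
    """Start from the Lagrangian-weighted surrogate row, then repeatedly
    aggregate the two most violated original rows into a new surrogate
    cut with unit multipliers."""
    rows, rhs = [lam @ A], [lam @ b]
    for _ in range(max_cuts):
        x = solve_with_rows(c, np.array(rows), np.array(rhs))
        v = A @ x - b                  # positive entries = violations
        if (v <= tol).all():
            return x, len(rows)        # all original rows satisfied
        t, k = np.argsort(-v)[:2]      # indices of two most violated rows
        rows.append(A[t] + A[k])       # mu_t = mu_k = 1
        rhs.append(b[t] + b[k])
    return x, len(rows)

c = np.array([5.0, 4.0, 3.0])
A = np.array([[2.0, 3.0, 1.0], [4.0, 1.0, 2.0]])
b = np.array([3.0, 4.0])
lam = np.array([1.1, 0.7])   # LP-relaxation duals of this toy instance
x, n_rows = reduce_constraints(c, A, b, lam)
print(x, n_rows)             # prints: [1 0 0] 1
```

On this toy problem a single surrogate row already yields a feasible, optimal point, illustrating how the reduced model can replace both original constraints.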

Computational results
This section presents the experimental results obtained with the proposed simple heuristic. Both test problems from the OR libraries [4,23] and randomly generated ones are used in the computational experiments. We only consider MILP instances whose Lagrange multipliers are non-negative with at least one of them non-zero. If this is not the case, we can set all of the initial surrogate multipliers to 1.
We denote the number of constraints by m and the number of variables by n. The right-hand sides are generated in proportion to the row sums, b_i = s_i Σ_j a_ij, where s_i is a slackness ratio drawn from the uniform distribution between 0.65 and 0.95. For each combination of (m, n), we generate 10 problems. Table 1 gives, for each combination m × n, the minimum, average (rounded), and maximum computing times for both the intlinprog solver and the proposed heuristic. Compared by average execution time, the heuristic performs better in all but one case. Table 1 indicates that average execution times are reduced for almost every uncorrelated instance, in which the expected slackness ratio is 0.80.
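The random generation scheme can be sketched as follows. The coefficient ranges for A and c are assumptions for illustration only; the right-hand sides use the slackness ratios s_i described above, whose expected value is 0.80.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 10

# Coefficient ranges below are assumptions for illustration; the paper
# specifies only the slackness ratios s_i ~ U(0.65, 0.95).
A = rng.integers(1, 100, size=(m, n)).astype(float)
c = rng.integers(1, 100, size=n).astype(float)
s = rng.uniform(0.65, 0.95, size=m)    # slackness ratios
b = s * A.sum(axis=1)                  # RHS proportional to row sums
```

With this construction no single constraint is trivially slack or trivially binding, which is what makes the instances useful for testing constraint reduction.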
In addition to the MKP instances, we test the proposed heuristic on a few instances from OR libraries [4,23]. Refer to Table 2 for the results.
Note that, in all our calculations, the tolerance value is fixed at 10^−6, the algorithm terminates if the upper bound does not improve for 30 iterations, and for the IBM the iteration upper limit is set to 10.

Conclusions
In this paper, we provide equivalent formulations with fewer constraints for combinatorial optimization problems. To do so, we present a surrogate cut generation procedure based on the IBM. Experiments performed on problems with many redundant constraints show that the proposed heuristic produces reduced models that can be solved significantly faster than the original models.

[Fragment of the Algorithm 3.1 pseudocode omitted.]

Data Availability: The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest:
The author declares that there are no conflicts of interest regarding the publication of this paper.