Relaxation is a commonly used technique in integer and mixed-integer programming. The way I understand relaxation is that it loosens the requirements of some complicating constraints or variables while preserving part of the original problem; in this way, it yields a bound for the original problem.
Lagrangian relaxation relaxes the complicating constraints by moving them into the objective function as a penalty term. The relaxed problem then has only simple constraints and can be solved quickly, or it can be decomposed into simple sub-problems for which polynomial-time algorithms exist.
Note that Lagrangian relaxation yields a lower bound for the original minimization problem, and its optimal solution is quite possibly infeasible for the original problem.
Therefore, after solving the Lagrangian relaxation problem, we usually still need a heuristic to repair the (possibly infeasible) solution into a feasible one, which gives an upper bound for the original problem.
Assume that we have the following problem:
$$
Z_{IP} = \min \quad c^{T}x \\
Ax \geq b \\
Bx \geq d \\
x \in \mathbb{Z}^{+}
$$
It has two sets of constraints, and we can apply Lagrangian relaxation by moving the constraint $Ax \geq b$ into the objective function. This gives the Lagrangian relaxation problem:
$$
Z_{LR}(\lambda) = \min \quad c^{T}x + \lambda^{T}(b - Ax) \\
Bx \geq d \\
x \in \mathbb{Z}^{+}
$$
$\lambda$ is called the Lagrange multiplier. Note that the direction of the relaxed constraint determines the sign restriction on $\lambda$.
We can view $\lambda^{T}(b - Ax)$ as a penalty term. Here the relaxed constraint is $Ax \geq b$, so $b - Ax \leq 0$ for every feasible $x$. With $\lambda \geq 0$, the term $\lambda^{T}(b - Ax)$ is non-positive, so the relaxed objective never exceeds the original one at feasible points, and $Z_{LR}(\lambda)$ is a lower bound for the original problem.
Similarly, if the relaxed term $b - Ax$ is non-negative for feasible solutions (i.e., the constraint is $Ax \leq b$), then $\lambda$ should be non-positive; if the relaxed constraint is an equality, the penalty term vanishes at feasible points and there is no sign restriction on $\lambda$.
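To summarize the two paragraphs above, with the penalty written as $\lambda^{T}(b - Ax)$ in a minimization problem, the sign restriction on $\lambda$ depends on the direction of the relaxed constraint:

$$
Ax \geq b \;\Rightarrow\; \lambda \geq 0, \qquad
Ax \leq b \;\Rightarrow\; \lambda \leq 0, \qquad
Ax = b \;\Rightarrow\; \lambda \text{ unrestricted}
$$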
Thus, for any $\lambda \geq 0$, we have
$$
Z_{LR}(\lambda) \leq Z_{IP}
$$
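As a quick sanity check of this bound, here is a minimal Python sketch on a made-up toy instance. Everything in it (the data $c$, $A$, $b$, $B$, $d$, the multiplier value, and the enumeration box) is an illustrative assumption, not taken from the text above; both problems are solved by brute-force enumeration just to verify $Z_{LR}(\lambda) \leq Z_{IP}$ numerically.

```python
import itertools

import numpy as np

# Illustrative data (assumed for this sketch, not from the text):
# minimize c^T x  s.t.  Ax >= b (to be relaxed),  Bx >= d,  x integer >= 0
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0]]); b = np.array([4.0])   # "complicating" constraint
B = np.array([[2.0, 1.0]]); d = np.array([3.0])   # "easy" constraint

# Enumerate a small box of non-negative integer points;
# it is large enough to contain the optima of this toy instance.
box = range(0, 6)

def z_ip():
    """Optimal value of the original IP, by brute-force enumeration."""
    best = np.inf
    for x in itertools.product(box, repeat=2):
        x = np.array(x, dtype=float)
        if np.all(A @ x >= b) and np.all(B @ x >= d):
            best = min(best, c @ x)
    return best

def z_lr(lam):
    """Optimal value of the relaxed problem: min c^T x + lam^T (b - Ax) s.t. Bx >= d."""
    best = np.inf
    for x in itertools.product(box, repeat=2):
        x = np.array(x, dtype=float)
        if np.all(B @ x >= d):
            best = min(best, c @ x + lam @ (b - A @ x))
    return best

lam = np.array([1.5])                  # any lambda >= 0 gives a valid lower bound
print("Z_IP      =", z_ip())           # 8.0 on this toy instance
print("Z_LR(lam) =", z_lr(lam))        # 7.5 <= 8.0
assert z_lr(lam) <= z_ip() + 1e-9
```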
Each $\lambda$ gives a different Lagrangian relaxation problem $Z_{LR}(\lambda)$, so we want the $\lambda$ that makes this lower bound as large as possible, which leads to the Lagrangian dual problem $Z_{D} = \max_{\lambda \geq 0} Z_{LR}(\lambda)$.
The subgradient algorithm is usually applied to solve the dual problem and update the Lagrange multipliers.
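As an illustration, here is a minimal sketch of a projected subgradient loop on the same kind of toy instance (again, the data and the brute-force inner solver are assumptions standing in for a real model and solver): at each iteration the relaxed problem is solved for the current $\lambda$, a subgradient $b - Ax^{*}$ is read off from its optimal solution $x^{*}$, and $\lambda$ is updated with a diminishing step size and projected back onto $\lambda \geq 0$.

```python
import itertools

import numpy as np

# Same illustrative toy data as above (assumed, not from the text).
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0]]); b = np.array([4.0])   # relaxed constraint  Ax >= b
B = np.array([[2.0, 1.0]]); d = np.array([3.0])   # kept constraint     Bx >= d
box = range(0, 6)

def solve_relaxed(lam):
    """Return (Z_LR(lam), argmin x), here by brute force over a small integer box;
    in practice this would be a specialized algorithm for the easy sub-problem."""
    best_val, best_x = np.inf, None
    for x in itertools.product(box, repeat=2):
        x = np.array(x, dtype=float)
        if np.all(B @ x >= d):
            val = c @ x + lam @ (b - A @ x)
            if val < best_val:
                best_val, best_x = val, x
    return best_val, best_x

lam = np.zeros(1)                         # start with no penalty
best_bound = -np.inf
for k in range(1, 51):
    z, x = solve_relaxed(lam)
    best_bound = max(best_bound, z)       # keep the best (largest) lower bound seen
    g = b - A @ x                         # a subgradient of Z_LR at the current lambda
    step = 1.0 / k                        # simple diminishing step size
    lam = np.maximum(0.0, lam + step * g) # ascent step, projected onto lambda >= 0
print("best lower bound:", best_bound, "final lambda:", lam)
```

In practice the step size is often chosen using the gap between a known upper bound (e.g., from the repair heuristic mentioned earlier) and the current $Z_{LR}(\lambda)$; the simple $1/k$ rule here is only for illustration.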
Some problems have linking (coupling) constraints, for example:
$$
Z_{IP} = \min \quad c^{T}x \\
Ax \geq b \\
x = y \\
By \geq d \\
x, y \in \mathbb{Z}^{+}
$$
Here the constraint $x = y$ links $x$ and $y$. If we relax this constraint, we get:
$$
Z_{LR}(\lambda) = \min \quad c^{T}x + \lambda^{T}(x - y) \\
Ax \geq b \\
By \geq d \\
x, y \in \mathbb{Z}^{+}
$$
Since $x$ and $y$ no longer appear in a common constraint, the relaxed problem separates into two sub-problems:
$$
Z_{LR1}(\lambda) = \min \quad c^{T}x + \lambda^{T}x \\
Ax \geq b \\
x \in \mathbb{Z}^{+}
$$
$$
Z_{LR2}(\lambda) = \min \quad -\lambda^{T}y \\
By \geq d \\
y \in \mathbb{Z}^{+}
$$
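To make the decomposition concrete, here is a small Python sketch with made-up data (all of $c$, $A$, $b$, $B$, $d$, $\lambda$, and the enumeration box are illustrative assumptions): for a fixed $\lambda$, each sub-problem is solved independently, and their optimal values add up to $Z_{LR}(\lambda)$.

```python
import itertools

import numpy as np

# Illustrative data for the variable-splitting example (assumed, not from the text).
c = np.array([3.0, 2.0])
A = np.array([[1.0, 1.0]]); b = np.array([4.0])   # constraint on x only
B = np.array([[2.0, 1.0]]); d = np.array([3.0])   # constraint on y only
box = range(0, 6)

# x = y is an equality constraint, so lambda is unrestricted in sign;
# this particular choice keeps both toy sub-problems bounded.
lam = np.array([-1.0, -0.5])

def z_lr1(lam):
    """Sub-problem in x only: min (c + lam)^T x  s.t.  Ax >= b, x integer >= 0."""
    return min((c + lam) @ np.array(x, dtype=float)
               for x in itertools.product(box, repeat=2)
               if np.all(A @ np.array(x, dtype=float) >= b))

def z_lr2(lam):
    """Sub-problem in y only: min -lam^T y  s.t.  By >= d, y integer >= 0."""
    return min(-lam @ np.array(y, dtype=float)
               for y in itertools.product(box, repeat=2)
               if np.all(B @ np.array(y, dtype=float) >= d))

# The two sub-problems are solved independently; their values sum to Z_LR(lam).
print("Z_LR1 =", z_lr1(lam), " Z_LR2 =", z_lr2(lam),
      " Z_LR =", z_lr1(lam) + z_lr2(lam))   # 6.0 + 1.5 = 7.5, a lower bound
```

In a real application, each sub-problem would be handed to whatever specialized (ideally polynomial-time) algorithm it admits, and only $\lambda$ couples them, which is exactly what makes this decomposition attractive.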
Lagrangian relaxation is a practical, simple, and widely used method for problems such as the (capacitated) facility location problem. However, as with any method, there is always a large gap between theory and practice.
For beginners who want a more detailed treatment of the theory:
1. Lagrangian Relaxation for Integer Programming (theoretical analysis + Python implementation) - 王源
2. Lagrangian Relaxation - 刘林冬
For code implementations, I strongly recommend the following articles:
1. Solving the Constrained Shortest Path Problem with Lagrangian Relaxation (Matlab) - Mingrui Yu
2. Solving the TSP with Lagrangian Relaxation (Java) - 数据魔术师