C++ Reference: one_tree_lower_bound
Note: This documentation is automatically generated.
An implementation of the Held-Karp symmetric Traveling Salesman Problem (TSP) lower bound algorithm, inspired by "Estimating the Held-Karp lower bound for the geometric TSP" by Christine L. Valenzuela and Antonia J. Jones, European Journal of Operational Research, Volume 102, Issue 1, 1 October 1997, Pages 157-175.

The idea is to compute minimum 1-trees to evaluate a lower bound to the corresponding TSP. A minimum 1-tree is a minimum spanning tree on all nodes but one, to which the two shortest edges from the left-out node to the nodes of the spanning tree are added. The sum of the costs of the edges of the minimum 1-tree is a lower bound on the cost of the TSP.

In order to improve (increase) this lower bound, the idea is to add a weight to each node; these weights are added to the cost function used when computing the 1-tree. If weight[i] is the weight of node i, the cost function becomes weighed_cost(i, j) = cost(i, j) + weight[i] + weight[j]. One can see that w = weighed_cost(minimum 1-tree) - Sum(2 * weight[i]) = cost(minimum 1-tree) + Sum(weight[i] * (degree[i] - 2)) is a valid lower bound to the TSP:

1. Let T be the set of 1-trees on the nodes.
2. Let U be the set of tours on the nodes; U is a subset of T (tours are 1-trees with all degrees equal to 2), therefore min(t in T) Cost(t) <= min(t in U) Cost(t) and min(t in T) WeighedCost(t) <= min(t in U) WeighedCost(t).
3. Since weighed_cost(i, j) = cost(i, j) + weight[i] + weight[j], for all t in T, WeighedCost(t) = Cost(t) + Sum(weight[i] * degree[i]), and for all t in U, WeighedCost(t) = Cost(t) + Sum(weight[i] * 2).
4. Let t* in U be such that WeighedCost(t*) = min(t in U) WeighedCost(t). Then min(t in T) (Cost(t) + Sum(weight[i] * degree[i])) <= Cost(t*) + Sum(weight[i] * 2), hence min(t in T) (Cost(t) + Sum(weight[i] * (degree[i] - 2))) <= Cost(t*), hence cost(minimum 1-tree) + Sum(weight[i] * (degree[i] - 2)) <= Cost(t*), i.e. w <= Cost(t*).
5. Because t* is also the tour minimizing Cost(t) with t in U (weights do not affect the optimality of a tour), Cost(t*) is the cost of the optimal solution to the TSP, so w is a lower bound on this cost.
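To make the bound concrete, here is a minimal sketch (not the library's implementation) of how w can be evaluated from the edges of a 1-tree and the node weights; the function and parameter names are illustrative.

```cpp
// Sketch: given the edges of a minimum 1-tree computed with the weighed cost
// and the node weights, return w = cost(1-tree) + Sum(weight[i] * (degree[i] - 2)).
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

double OneTreeBound(const std::vector<std::pair<int, int>>& one_tree_edges,
                    const std::vector<double>& weight,
                    const std::function<int64_t(int, int)>& cost) {
  std::vector<int> degree(weight.size(), 0);
  double tree_cost = 0;
  for (const auto& [i, j] : one_tree_edges) {
    tree_cost += cost(i, j);  // Un-weighed edge cost.
    ++degree[i];
    ++degree[j];
  }
  double w = tree_cost;
  for (int i = 0; i < static_cast<int>(weight.size()); ++i) {
    w += weight[i] * (degree[i] - 2);
  }
  return w;  // A valid lower bound on the optimal tour cost.
}
```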
The best lower bound is the one for which the weights maximize w. Intuitively, as degrees get closer to 2, the minimum 1-tree gets closer to a tour.
At each iteration m, weights are therefore updated as follows: weight(m+1)[i] = weight(m)[i] + step(m) * (degree(m)[i] - 2), where degree(m)[i] is the degree of node i in the 1-tree at iteration m, and step(m) is a subgradient optimization step.
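A minimal sketch of this update, assuming degree holds the node degrees of the current 1-tree and step the current subgradient step (names are illustrative, not the library's API):

```cpp
// Subgradient weight update: pushes node degrees towards 2.
#include <vector>

void UpdateWeights(const std::vector<int>& degree, double step,
                   std::vector<double>* weight) {
  for (size_t i = 0; i < weight->size(); ++i) {
    (*weight)[i] += step * (degree[i] - 2);
  }
}
```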
This implementation uses two variants of the iterative subgradient optimization approach originally described by Michael Held and Richard M. Karp in "The traveling-salesman problem and minimum spanning trees", Part I, Operations Research Vol. 18, No. 6 (Nov.-Dec. 1970), pp. 1138-1162, and Part II, Mathematical Programming (1971).
The first variant comes from Volgenant, T., and Jonker, R. (1982), "A branch and bound algorithm for the symmetric traveling salesman problem based on the 1-tree relaxation", European Journal of Operational Research 9:83-89. It suggests using

step(m) = (1.0 * (m - 1) * (2 * M - 5) / (2 * (M - 1))) * step1 - (m - 2) * step1 + (0.5 * (m - 1) * (m - 2) / ((M - 1) * (M - 2))) * step1,

where M is the maximum number of iterations and step1 is initially set to L / (2 * number of nodes), where L is the un-weighed cost of the 1-tree; step1 is updated each time a better w is found. The intuition is to have a positive decreasing step which is equal to 0 after M iterations; Volgenant and Jonker suggest that step(m) - 2 * step(m-1) + step(m-2) = constant, step(M) = 0 and step(1) - step(2) = 3 * (step(M-1) - step(M)). The step(m) formula above derives from this recursive formulation. This is the default algorithm used in this implementation (see the sketch below).
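A sketch of this step schedule as a standalone function (illustrative names, not the library's API); note that it evaluates to 0 at m = M, as intended:

```cpp
// Volgenant-Jonker step schedule: positive, decreasing, and zero after M iterations.
double VolgenantJonkerStep(int m, int M, double step1) {
  return (1.0 * (m - 1) * (2 * M - 5) / (2 * (M - 1))) * step1
         - (m - 2) * step1
         + (0.5 * (m - 1) * (m - 2) / ((M - 1) * (M - 2))) * step1;
}
```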
The second variant comes from Held, M., Wolfe, P., and Crowder, H. P. (1974), "Validation of subgradient optimization", Mathematical Programming 6:62-88. It derives from the original Held-Karp formulation: step(m) = lambda(m) * (wlb - w(m)) / Sum((degree[i] - 2)^2), where wlb is a lower bound to max(w(m)) and lambda(m) is in [0, 2]. Held and Karp prove that if w(m') > w(m) and 0 < step < 2 * (w(m') - w(m)) / norm(degree(m) - 2)^2, then weight(m+1) is closer to the weights achieving w(m') than weight(m) is, from which they derive the above formula. Held, Wolfe, and Crowder show that using an overestimate UB is as effective as using the underestimate wlb, while UB is easier to compute. The resulting formula is: step(m) = lambda(m) * (UB - w(m)) / Sum((degree[i] - 2)^2), where UB is an upper bound to the TSP (here computed with the Christofides algorithm), and lambda(m) in [0, 2] is initially set to 2. Held, Wolfe, and Crowder suggest running the algorithm for M = 2 * number of nodes iterations, then dividing lambda and M by 2 until M is small enough (less than 2 in this implementation).
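A sketch of the corresponding step computation, assuming upper_bound is the TSP upper bound (e.g. from Christofides) and degree the node degrees of the current 1-tree (illustrative names, not the library's API):

```cpp
// Held-Wolfe-Crowder step: step(m) = lambda * (UB - w(m)) / Sum((degree[i] - 2)^2).
#include <vector>

double HeldWolfeCrowderStep(double lambda, double upper_bound, double w,
                            const std::vector<int>& degree) {
  double norm = 0;
  for (const int d : degree) norm += (d - 2) * (d - 2);
  // If all degrees are 2 the 1-tree is a tour and no further step is needed.
  return norm > 0 ? lambda * (upper_bound - w) / norm : 0.0;
}
```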
To speed up the computation, minimum spanning trees are actually computed on a graph limited to the nearest neighbors of each node. The Valenzuela-Jones 1997 experiments have shown that this does not significantly harm the lower bound computation. At the end of the algorithm, a last iteration is run on the complete graph to ensure the bound is valid: the cost of a minimum 1-tree on a partial graph is only an upper bound on the cost of a minimum 1-tree on the complete graph, so intermediate values are not guaranteed lower bounds.
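As an illustration of this restriction, one simple way to build such candidate lists is sketched below; NearestNeighbors is a hypothetical helper, not part of the library.

```cpp
// For each node, keep only its k nearest neighbors as candidate edges
// for the spanning tree computations.
#include <algorithm>
#include <cstdint>
#include <functional>
#include <vector>

std::vector<std::vector<int>> NearestNeighbors(
    int number_of_nodes, int k, const std::function<int64_t(int, int)>& cost) {
  std::vector<std::vector<int>> neighbors(number_of_nodes);
  for (int i = 0; i < number_of_nodes; ++i) {
    std::vector<int>& candidates = neighbors[i];
    for (int j = 0; j < number_of_nodes; ++j) {
      if (j != i) candidates.push_back(j);
    }
    const int kept = std::min<int>(k, candidates.size());
    std::partial_sort(candidates.begin(), candidates.begin() + kept,
                      candidates.end(),
                      [&](int a, int b) { return cost(i, a) < cost(i, b); });
    candidates.resize(kept);
  }
  return neighbors;
}
```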
Usage: `std::function<int64_t(int, int)> cost_function = ...;` `const double lower_bound = ComputeOneTreeLowerBound(number_of_nodes, cost_function);` where number_of_nodes is the number of nodes in the TSP and cost_function is a function returning the cost between two nodes.
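A slightly fuller usage sketch on a small symmetric instance; the distance matrix is illustrative, and the header path and namespace of ComputeOneTreeLowerBound depend on the OR-Tools version.

```cpp
// Illustrative usage; include the header that declares ComputeOneTreeLowerBound.
#include <cstdint>
#include <functional>
#include <iostream>
#include <vector>

int main() {
  const int number_of_nodes = 4;
  // Symmetric distance matrix, purely illustrative.
  const std::vector<std::vector<int64_t>> distance = {{0, 10, 15, 20},
                                                      {10, 0, 35, 25},
                                                      {15, 35, 0, 30},
                                                      {20, 25, 30, 0}};
  std::function<int64_t(int, int)> cost_function = [&](int from, int to) {
    return distance[from][to];
  };
  const double lower_bound =
      ComputeOneTreeLowerBound(number_of_nodes, cost_function);
  std::cout << "Held-Karp lower bound: " << lower_bound << std::endl;
  return 0;
}
```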
| Classes |
| --- |
| `HeldWolfeCrowderEvaluator` |
| `VolgenantJonkerEvaluator` |