Analysis of the Traveling Salesman Problem and current approaches for solving it.
Rishi B. Jethwa and Mayank Agarwal. CSE Department. University of Texas at Arlington.

Table of Contents: Introduction. Naïve and simple approaches. Approaches using Dynamic Programming. Approaches using Neural Networks and Genetics. Approaches using the Parallel Branch and Bound Technique.

Traveling Salesman Problem.
Given N points, find the shortest circular path that links all the points together. The constraints on the TSP are that the first and last point of the tour must be the same and that edges cannot be repeated; vertices can be repeated. No polynomial-time solution is known for this type of problem, whose number of candidate tours grows combinatorially, so we are mainly interested in good solutions, not exact ones.

A naïve approach.
Suppose you are given n points and start at the first one. For the second point you are left with n-1 options, then n-2, and so on. Hence there are altogether (n-1)!/2 different tours. For n = 36 this is already about 5 * 10^39 different tours.

Other simple approaches: Nearest Neighbour.
Start at an arbitrary point and successively visit the nearest unvisited point. After all the points have been visited, return to the start point. (A short code sketch of this heuristic appears at the end of this section.)

Other simple approaches: Minimum Spanning Tree.
Construct the minimum spanning tree of the point set and duplicate all the links of the tree. Sequence the points as they would appear in a traversal of the doubled tree, then pass through the sequence and remove every occurrence of a point after its first appearance.

Other simple approaches: Strip.
Partition the square into sqrt(N/3) vertical strips. Sequence the points in each strip by vertical position, alternately top-to-bottom and bottom-to-top, and visit the strips from left to right. Finally, return to the starting point.

Spacefilling Approach.
The approach goes like this. Take a square that contains all the points and bisect it into two triangles, named 0 and 1. Bisect each of these again: triangle 0 into 00 and 01, and triangle 1 into 10 and 11. Continue this partition, naming each piece accordingly. Finally, traverse the small triangles in label order and visit the points that lie inside them.

Comparisons.
                                  NN        MST           STRIP       Spacefilling
Ease of coding                    Good      Poor          Good        Good
Memory                            O(N)      O(N)          O(N)        O(N)
Parallelizability                 Poor      Poor          Good        Good
To solve                          O(N^2)    O(N^2 log N)  O(N log N)  O(N log N)
To modify                         Re-solve  Re-solve      O(log N)    O(log N)
Performance on non-uniform data   Good      Good          Poor        Good

Diamond Method to solve TSP problems.
Consider a big diamond (the APL diamond symbol) and divide it into four parts, so that approximately one quarter of the points is located in each of the four quadrants. Apply the TSP to each of the four quadrants. A concatenation of these four sub-paths produces a path with no loop, which is what we wanted.

Dynamic Programming treatment.
In addition to the vertices, define d(i, j) to be the distance between the i-th and j-th vertices, and let f(i; j1, j2, ..., jk) be the length of a shortest path that starts at vertex i, passes through j1, ..., jk, and ends at the starting vertex of the tour. Then

  f(i; j1, j2, ..., jk) = min over 1 <= m <= k of { d(i, jm) + f(jm; j1, ..., jm-1, jm+1, ..., jk) }.

First we get f(i; j1, j2), then f(i; j1, j2, j3), and so on until we get f(i; j1, j2, ..., jn). The running time is reduced to n^2 * 2^(n-1), and the space requirement is about six times the quantity given by Stirling's formula, i.e. 2^(2m) / sqrt(pi * m).

Dynamic Programming treatment (continued).
The GTSP (generalized TSP) is linked to the fact that it is a combination of two problems: once a vertex is chosen, we must choose which cluster to visit first and then which vertex within it to visit first. Once the cluster path is determined, we are confronted with the minimum cycle path problem.
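To make the nearest-neighbour heuristic above concrete, here is a minimal sketch in Python; the function names, the use of plain Euclidean distances, and the example points are our own illustration rather than anything from the slides.

import math

def nearest_neighbour_tour(points):
    """Greedy nearest-neighbour tour: start at the first point, repeatedly
    move to the closest unvisited point; the tour closes back at the start."""
    n = len(points)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        # pick the unvisited point closest to the last point on the tour
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    """Total length of the closed tour, including the edge back to the start."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

# Example: a few points in the unit square.
pts = [(0.1, 0.2), (0.9, 0.1), (0.5, 0.8), (0.2, 0.9), (0.8, 0.7)]
t = nearest_neighbour_tour(pts)
print(t, tour_length(pts, t))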
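The dynamic-programming recurrence above can likewise be sketched in a few lines of Python. This is the standard Held-Karp style formulation; we use a bitmask `visited` for the cities already placed on the path, so the cities outside `visited` play the role of j1, ..., jk, and the function names and example matrix are our own.

from functools import lru_cache
import math

def held_karp(dist):
    """Exact TSP by dynamic programming over subsets (O(n^2 * 2^n) time).
    dist[i][j] is the distance from city i to city j; the tour starts and
    ends at city 0."""
    n = len(dist)
    full = (1 << n) - 1

    @lru_cache(maxsize=None)
    def f(i, visited):
        # f(i, visited): shortest path from city i through every city not yet
        # in `visited`, ending back at city 0 (mirrors f(i; j1, ..., jk)).
        if visited == full:
            return dist[i][0]
        best = math.inf
        for j in range(n):
            if not visited & (1 << j):
                best = min(best, dist[i][j] + f(j, visited | (1 << j)))
        return best

    return f(0, 1)  # start at city 0, with city 0 already marked visited

# Example on a small symmetric instance.
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]
print(held_karp(d))  # prints 23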
This paper proposes a genetic algorithm for choosing a path; here a chromosome is a sequence of vertices that represents a path. (A small illustration of this encoding appears at the end of this section.)

The Self-Organizing Neural Network (SONN).
The SONN is arranged with M neurons, each of which has an (N-1)-dimensional weight vector. The original SONN can exhibit non-convergence when M = N. This paper proposes a new density function that guarantees convergence even when M = N. Also, for a problem with N cities the original SONN requires 2N neurons, whereas this approach requires only N neurons.

Parallel Branch and Bound Technique.
Uses parallel computers to solve large, randomly generated instances of the asymmetric TSP (ATSP). The principal components of the algorithm are as follows:
i) Lower bounding technique: find lower bounds for the parallel ATSP algorithm by solving the assignment problem.
ii) Upper bounding heuristic: use the solution of the assignment problem to construct a feasible solution to the ATSP.
iii) Branching rules: create two or more new sub-problems based on an assignment problem solution.
This algorithm is capable of using tens to hundreds of processors, depending on the problem size and difficulty.

Parallel Branch and Bound Technique (continued).
Edward discusses the issues surrounding the implementation of a particular branch and bound algorithm for the TSP on a hypercube multicomputer. This paper uses the best-first search technique for the branching implementation. The TSP is one of the very few fully asynchronous applications that have been written on the hypercube thus far.

THANKS.
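As a rough, serial illustration of the best-first branch and bound summarized above: the sketch below keeps the same overall shape (a lower bound per sub-problem, an incumbent upper bound, and branching on partial paths), but for brevity it replaces the assignment-problem relaxation with a much cruder bound (cost so far plus each remaining city's cheapest outgoing edge). The names and structure are our own, not the cited parallel implementation.

import heapq, math

def tsp_branch_and_bound(dist):
    """Serial best-first branch and bound for small (A)TSP instances.
    Nodes are partial paths starting at city 0; the lower bound is the path
    cost so far plus, for every city still to be left, its cheapest outgoing
    edge. (This crude bound stands in for the assignment-problem relaxation.)"""
    n = len(dist)
    cheapest_out = [min(dist[i][j] for j in range(n) if j != i) for i in range(n)]

    def lower_bound(cost, path, remaining):
        # the current end of the path and every unvisited city must each
        # still be left once, at a cost of at least their cheapest outgoing edge
        return cost + cheapest_out[path[-1]] + sum(cheapest_out[c] for c in remaining)

    best_cost, best_tour = math.inf, None
    rest = frozenset(range(1, n))
    # priority queue of (bound, cost so far, path, unvisited cities)
    heap = [(lower_bound(0, [0], rest), 0, [0], rest)]
    while heap:
        bound, cost, path, remaining = heapq.heappop(heap)
        if bound >= best_cost:
            continue  # this subtree cannot beat the incumbent tour
        if not remaining:
            total = cost + dist[path[-1]][0]  # close the tour
            if total < best_cost:
                best_cost, best_tour = total, path + [0]
            continue
        for city in remaining:  # branch: extend the partial path by one city
            new_cost = cost + dist[path[-1]][city]
            new_remaining = remaining - {city}
            b = lower_bound(new_cost, path + [city], new_remaining)
            if b < best_cost:
                heapq.heappush(heap, (b, new_cost, path + [city], new_remaining))
    return best_cost, best_tour

# Example on a small asymmetric instance.
d = [[0, 2, 9, 10],
     [1, 0, 6, 4],
     [15, 7, 0, 8],
     [6, 3, 12, 0]]
print(tsp_branch_and_bound(d))  # prints (21, [0, 2, 3, 1, 0])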
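Finally, returning to the genetic-algorithm idea mentioned earlier, where a chromosome is a sequence of vertices encoding a path: the sketch below shows that encoding with an order crossover and a swap mutation. The operators, parameters, and function names are our own illustration, not the specific design of the paper the slides summarize.

import random

def tour_cost(dist, tour):
    """Cost of the closed tour encoded by the chromosome (a permutation of cities)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def order_crossover(parent_a, parent_b):
    """Order crossover (OX): copy a slice of parent_a, then fill the remaining
    positions with the cities of parent_b in their original order."""
    n = len(parent_a)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j] = parent_a[i:j]
    fill = [c for c in parent_b if c not in child[i:j]]
    k = 0
    for pos in range(n):
        if child[pos] is None:
            child[pos] = fill[k]
            k += 1
    return child

def swap_mutation(tour, rate=0.1):
    """With probability `rate`, swap two random cities in the chromosome."""
    tour = tour[:]
    if random.random() < rate:
        a, b = random.sample(range(len(tour)), 2)
        tour[a], tour[b] = tour[b], tour[a]
    return tour

def genetic_tsp(dist, pop_size=50, generations=200):
    """Tiny GA for the TSP: tournament selection, order crossover, swap mutation."""
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            return min(random.sample(pop, 3), key=lambda t: tour_cost(dist, t))
        pop = [swap_mutation(order_crossover(tournament(), tournament()))
               for _ in range(pop_size)]
    return min(pop, key=lambda t: tour_cost(dist, t))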