Eigenvectors and Decision Making

• Analytic Hierarchy Process (AHP)
• Positive Symmetrically Reciprocal Matrices (PSRMs) and Transitive PSRMs
• Estimation methods -- How close? Close to what? Close in what sense?

Analytic Hierarchy Process

"In the 1970's the Egyptian government asked Tom Saaty, a pioneering mathematician with a fistful of awards, to help clarify the Middle East conflict. The Egyptians needed a coherent, analytical way to assess the pros and cons of their less than cozy strategic relationship with the Soviet Union. Saaty, a Wharton professor with a background in arms-control research, tackled the question with game theory."

"The Egyptian government was pleased with his work (and eventually did ask the Russians to leave), but Saaty himself wasn't satisfied with the process. He felt his conclusion was incomplete -- that important but intangible information was left out of the final equation because game theory was too rigid. 'I couldn't use it to solve a real-life problem.'"

Users of AHP include Xerox, Boeing, IBM, Northrop Grumman, US Steel, Corning, and the governments of the U.S., South Africa, Canada, and Indonesia. (Software: Expert Choice)

Decision: Choose between a Toyota, Honda, or Citation (Chevrolet).

One set of criteria: cost, dependability, size, and aesthetics. First we assign numbers w1, w2, w3, w4 to these four criteria representing their relative importance, starting from the pairwise-comparison matrix

                Cost   Depen  Size   Aesth
Cost             1      2      3      3
Dependability    1/2    1      3      3
Size             1/3    1/3    1      1/2
Aesthetics       1/3    1/3    2      1

This matrix has maximum eigenvalue approximately 4.08, with eigenvector (.44, .31, .10, .15).

Recall the eigenvalue problem: for an n x n matrix A, an n x 1 vector x, and a scalar λ,

    Ax = λx,   (λI - A)x = 0,   det(λI - A) = 0.

For the criteria matrix above:

    [ 1    2    3    3   ] [.44]          [.44]
    [ 1/2  1    3    3   ] [.31]  ≈ 4.08  [.31]
    [ 1/3  1/3  1    1/2 ] [.10]          [.10]
    [ 1/3  1/3  2    1   ] [.15]          [.15]

Next, generate a reciprocal matrix for each criterion, comparing the three cars pairwise on that criterion:

For Cost:                    For Dependability:
       Toy   Hon   Cit              Toy   Hon   Cit
Toy     1     1     2        Toy     1     1     3
Hon     1     1     1        Hon     1     1     3
Cit    1/2    1     1        Cit    1/3   1/3    1
Weights: .40  .35  .25       Weights: .43  .43  .14

For Size:                    For Aesthetics:
       Toy   Hon   Cit              Toy   Hon   Cit
Toy     1     3     1        Toy     1     4     3
Hon    1/3    1    1/3       Hon    1/4    1     2
Cit     1     3     1        Cit    1/3   1/2    1
Weights: .43  .14  .43       Weights: .63  .22  .15

The weight for each criterion is then multiplied by the weight for each car within that criterion, and added across criteria:

       Cost   Depen  Size   Aesth    Composite
       (.44)  (.31)  (.10)  (.15)    Priorities
Toy     .40    .43    .43    .63       .45
Hon     .35    .43    .14    .22       .33
Cit     .25    .14    .43    .15       .22

Positive Symmetrically Reciprocal Matrices

An n x n matrix A is a PSRM if aij > 0 and aji = 1/aij for all i, j. It is transitive (or consistent) if aik = aij ajk for all i, j, k; when the entries aij represent ratios wi/wj, this is just the identity wi/wj = (wi/wk)(wk/wj). For a consistent matrix built from the criterion weights w1, w2, w3, w4:

    [ w1/w1  w1/w2  w1/w3  w1/w4 ] [w1]       [w1]
    [ w2/w1  w2/w2  w2/w3  w2/w4 ] [w2]  =  4 [w2]
    [ w3/w1  w3/w2  w3/w3  w3/w4 ] [w3]       [w3]
    [ w4/w1  w4/w2  w4/w3  w4/w4 ] [w4]       [w4]

Theorem (Perron-Frobenius): Let A be non-negative and irreducible. (1) A has a real positive simple (not multiple) eigenvalue λmax (the Perron root) which is not exceeded in modulus by any other eigenvalue of A. (2) The eigenvector of A corresponding to λmax has positive components and is essentially unique (to within multiplication by a constant).

Theorem: A positive, reciprocal matrix is consistent if and only if λmax = n. (Saaty, J. of Mathematical Psychology, 1977.)
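The whole car example can be reproduced numerically. The following is a minimal NumPy sketch, not part of the original slides: the matrices are the ones given in the text, while the variable names and the helper `principal_weights` are mine. The slides report λmax ≈ 4.08 and weights (.44, .31, .10, .15) for the criteria matrix.

```python
import numpy as np

def principal_weights(A):
    """Return (lambda_max, normalized principal eigenvector) of a positive matrix."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)             # Perron root: the dominant real eigenvalue
    w = np.abs(vecs[:, k].real)
    return vals[k].real, w / w.sum()     # normalize the weights to sum to 1

# Criteria comparison matrix (Cost, Dependability, Size, Aesthetics)
C = np.array([[1,   2,   3, 3  ],
              [1/2, 1,   3, 3  ],
              [1/3, 1/3, 1, 1/2],
              [1/3, 1/3, 2, 1  ]])

lam, w = principal_weights(C)            # slides report lam ≈ 4.08, w ≈ (.44, .31, .10, .15)

# Pairwise comparisons of the cars (rows/cols: Toyota, Honda, Citation)
cost = np.array([[1,   1,   2  ], [1,   1,   1  ], [1/2, 1,   1]])
dep  = np.array([[1,   1,   3  ], [1,   1,   3  ], [1/3, 1/3, 1]])
size = np.array([[1,   3,   1  ], [1/3, 1,   1/3], [1,   3,   1]])
aes  = np.array([[1,   4,   3  ], [1/4, 1,   2  ], [1/3, 1/2, 1]])

# Column j of V holds the cars' weights under criterion j; the composite
# priorities are the criterion-weighted sums V @ w
V = np.column_stack([principal_weights(M)[1] for M in (cost, dep, size, aes)])
composite = V @ w
print(np.round(w, 2), np.round(composite, 2))
```

As in the slides, the composite priorities rank Toyota first, then Honda, then Citation.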
Positive Symmetrically Reciprocal Matrices

Theorem: Let R be an n x n positive reciprocal matrix with entries 1/S ≤ rij ≤ S, 1 ≤ i, j ≤ n, for some S > 1, and let λmax denote the largest eigenvalue of R in modulus, which is known to be real and positive. Then n ≤ λmax ≤ 1 + ½(n-1)(S + 1/S), the lower and upper bounds being reached if and only if R is supertransitive or maximally intransitive, respectively. (Aupetit and Genest, European Journal of Operational Research, 1993.)

Theorem: A transitive matrix is SR and has rank one. (Farkas, Rozsa, Stubnya, Linear Algebra Appl., 1999.)

Close?

How close is the criteria matrix above to consistency, and in what sense is its eigenvector (.44, .31, .10, .15) a good set of weights?

• Saaty: take the limit of the normalized row sums of powers of the matrix,
      w = lim_{k→∞} A^k e / (e^T A^k e),   where e = (1, ..., 1)^T.
• Consistency index: μ = (λmax - n)/(n - 1).
• If we consider inconsistencies in the powers of A, generated by cycles of length 1, 2, 3, ..., then the right eigenvector represents the dominance of one alternative over all others through these cycles. (Mathematics Magazine, 1987.)

• Logarithmic least squares method: minimize Σ_{i,j} (ln aij - ln wi + ln wj)².
• Least squares: minimize Σ_{i,j} (aij - wi/wj)².
• Another kind of eigenvector in another metric, the "max eigenvector," minimizes relative error. (Elsner & van den Driessche, Linear Algebra and Its Applications, 2004.)

Finally

Why do I care? What am I doing with these matrices and eigenvectors?
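The estimation methods above can also be sketched in a few lines of NumPy. This is my own illustration, not from the slides: it applies Saaty's power limit and consistency index μ to the criteria matrix of the car example, and uses the standard closed form for the logarithmic least squares weights (normalized row geometric means).

```python
import numpy as np

# The 4x4 criteria matrix from the car example
A = np.array([[1,   2,   3, 3  ],
              [1/2, 1,   3, 3  ],
              [1/3, 1/3, 1, 1/2],
              [1/3, 1/3, 2, 1  ]])
n = A.shape[0]

# Saaty's limit of normalized row sums of powers: w = lim_k A^k e / (e^T A^k e),
# computed as power iteration starting from e = (1, ..., 1)^T
v = np.ones(n)
for _ in range(100):
    v = A @ v
    v /= v.sum()                 # renormalize each step so the iterates stay bounded
w_power = v

# lambda_max recovered from the converged vector, then the consistency index
# mu = (lambda_max - n)/(n - 1); mu = 0 exactly when A is consistent
lam = float(np.mean(A @ w_power / w_power))
mu = (lam - n) / (n - 1)

# Logarithmic least squares: minimizing sum_ij (ln a_ij - ln w_i + ln w_j)^2
# is solved in closed form by the normalized row geometric means
g = np.prod(A, axis=1) ** (1.0 / n)
w_llsm = g / g.sum()

print(np.round(w_power, 3), round(mu, 3), np.round(w_llsm, 3))
```

For this mildly inconsistent matrix, μ is small and the eigenvector and LLSM weights nearly coincide; the different "closeness" criteria only diverge noticeably for strongly intransitive matrices.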