Ramanujan Graphs from Finite Free Convolutions
Nikhil Srivastava, UC Berkeley
joint with Adam Marcus (Princeton) and Daniel Spielman (Yale)

Expander Graphs
An expander is a sparse, well-connected graph:
• Every set of vertices has many neighbors.
• Random walks mix quickly.
• Large spectral gap.
• No bi-Lipschitz embedding into normed spaces.
• Etc.
[Figure: examples of expanders and non-expanders.]

Spectral Expanders
Let G be a graph on n vertices and let A be its adjacency matrix, with eigenvalues λ1 ≥ λ2 ≥ ⋯ ≥ λn.
[Example: a small graph on vertices a, b, c, d, e with its 0/1 adjacency matrix.]
If G is d-regular, then A𝟙 = d𝟙 (𝟙 the all-ones vector), so λ1 = d. If G is bipartite, the eigenvalues are symmetric about zero, so λn = −d. These eigenvalues are "trivial"; the remaining eigenvalues are the interesting ones. For example, G is connected iff λ2 < d.

Definition: G is a good expander if all non-trivial eigenvalues are much smaller than d in absolute value.
(This implies all of the other connectivity and pseudorandomness properties.)
E.g., the non-trivial eigenvalues of K_{d+1} are all −1, and those of K_{d,d} are all 0.

Goal: construct infinite families with d fixed and n → ∞.
Question: what are the best expanders, for fixed d?
Alon–Boppana '86: the non-trivial spectrum cannot beat [−2√(d−1), 2√(d−1)].

The meaning of 2√(d−1)
The infinite d-regular tree T_d (the Cayley graph of the free group on d/2 generators) has spectrum
    σ(A_{T_d}) = [−2√(d−1), 2√(d−1)].
Alon–Boppana '86: this is the best possible spectral expander.

Ramanujan Graphs
Definition: G is Ramanujan if all non-trivial eigenvalues have absolute value at most 2√(d−1).
Margulis, Lubotzky–Phillips–Sarnak '88: infinite sequences of Ramanujan graphs exist for d = p + 1 (Cayley graphs; the analysis is based on number theory).
Friedman '08: a random d-regular graph is almost Ramanujan: λ2 ≤ 2√(d−1) + o(1).
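To make the definition concrete, here is a minimal numerical sketch (my own, not from the talk) that checks the Ramanujan condition for a given d-regular graph: after discarding the trivial eigenvalue d (and −d in the bipartite case), all remaining eigenvalues should lie in [−2√(d−1), 2√(d−1)]. The function name is_ramanujan and the K_{d,d} example are illustrative choices, not anything from the slides.

    # Checks the Ramanujan condition numerically on an example graph.
    import numpy as np

    def is_ramanujan(A, d, bipartite=False, tol=1e-9):
        """A: adjacency matrix of a connected d-regular graph."""
        eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
        # drop the trivial eigenvalue d (and -d if bipartite)
        nontrivial = eigs[1:-1] if bipartite else eigs[1:]
        return bool(np.all(np.abs(nontrivial) <= 2 * np.sqrt(d - 1) + tol))

    d = 4
    # K_{d,d}: complete bipartite graph, adjacency [[0, J], [J, 0]] with J the all-ones block.
    J = np.ones((d, d))
    A = np.block([[np.zeros((d, d)), J], [J, np.zeros((d, d))]])
    print(is_ramanujan(A, d, bipartite=True))   # True: non-trivial eigenvalues are all 0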
New Theorems
[MSS'13]: infinite families of bipartite Ramanujan graphs exist for every d ≥ 3.
[MSS'15]: for every even n and every d ≥ 3, there is a d-regular bipartite Ramanujan (multi)graph.

This Talk
Suppose G is a union of d random perfect matchings on n vertices. Then, with nonzero probability, λ2(A_G) ≤ 2√(d−1).

Random Graph Model
Let
    A = P1 M P1^T + P2 M P2^T + ⋯ + Pd M Pd^T
for random permutations P1, …, Pd ∈ Sym(n), where M is the adjacency matrix of a fixed perfect matching on n vertices.
Theorem: with nonzero probability, λ2(A) ≤ 2√(d−1).

[Figures: eigenvalues of a random 5-regular graph.]

Limiting Spectral Distribution: the Kesten–McKay Law
[McKay'81]: Let A_n be a sequence of random d-regular graphs. Then the spectral distributions of the A_n converge weakly to the spectral measure of the infinite tree ("A_∞"):
    ρ_KM(x) = d √(4(d−1) − x^2) / (2π(d^2 − x^2))   on [−2√(d−1), 2√(d−1)].
But this gives no control on λ2: convergence in distribution is oblivious to the extreme eigenvalues. The bridge between the infinite "A_∞" and the finite A_n will instead be the expected characteristic polynomial
    𝔼 p_{A_n}(x) = 𝔼 det(xI − A_n).

Traditional approaches:
1. Moments of eigenvalues, 𝔼 tr(A^k).
2. The quadratic form 𝔼 sup_x x^T A x.
Here instead, let p_A(x) = det(xI − A) be the characteristic polynomial; we are interested in λ2(A) = λ2(p_A). Consider
    𝔼 p_A(x) = 𝔼 det(xI − A) = Σ_k (−1)^k x^{n−k} 𝔼 e_k(A),
where e_k(A) is the sum of the k×k principal minors of A.

Outline of the Proof
1. Prove that 𝔼p(A) has real roots.
2. Prove that the roots are Ramanujan: λ2(𝔼p_A) ≤ 2√(d−1).
3. Prove λ2(A) ≤ λ2(𝔼p_A) with nonzero probability.
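As a sanity check on the model, the following sketch (mine; names like random_matching_sum are illustrative, not from the talk) samples A = P1 M P1^T + ⋯ + Pd M Pd^T for a fixed perfect matching M and random permutation matrices, and compares λ2 with 2√(d−1). It only illustrates the statement empirically; the theorem asserts the bound with nonzero probability.

    # Samples the union-of-d-random-matchings model and inspects its spectrum.
    import numpy as np

    def random_matching_sum(n, d, rng):
        """Adjacency matrix of a union of d random perfect matchings on n vertices (n even).
        Note: matchings may overlap, so this is in general a multigraph."""
        M = np.zeros((n, n))
        for i in range(0, n, 2):                 # fixed matching pairing 2i with 2i+1
            M[i, i + 1] = M[i + 1, i] = 1.0
        A = np.zeros((n, n))
        for _ in range(d):
            P = np.eye(n)[rng.permutation(n)]    # random permutation matrix
            A += P @ M @ P.T                     # conjugated copy of the matching
        return A

    rng = np.random.default_rng(0)
    n, d = 500, 5
    A = random_matching_sum(n, d, rng)
    eigs = np.sort(np.linalg.eigvalsh(A))[::-1]
    print("lambda_1 =", eigs[0])                 # trivial eigenvalue, equals d
    print("lambda_2 =", eigs[1], " vs  2*sqrt(d-1) =", 2 * np.sqrt(d - 1))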
Step 1: 𝔼p(A) has real roots.

Quadrature Theorem. For a polynomial p of degree n (here p = det(xI − ·)),
    𝔼_P p( Σ_i P_i M P_i^T ) = 𝔼_Q p( Σ_i Q_i M Q_i^T ),
where P1, …, Pd are random permutation matrices and Q1, …, Qd are independent random (Haar) orthogonal matrices conditioned on Q_i 𝟙 = 𝟙.
(The determinant is a polynomial of low degree in the entries of each Q_i.)
The trivial root factors out: both sides can be written as (x − d) times the corresponding expectation restricted to the space orthogonal to 𝟙. For the rest of the talk I will ignore the trivial eigenspace 𝟙 and the trivial root/eigenvalue d, so λmax now denotes the largest remaining root.

The right-hand side only depends on p(M)! <definition on board>

Linearization Formula:
    𝔼_Q p( Σ_i Q_i M Q_i^T ) = p_M ⊞_n p_M ⊞_n ⋯ ⊞_n p_M   (d factors),
where ⊞_n is the finite free convolution and
    p_M(x) = (x − 1)^{n/2} (x + 1)^{n/2}.
[Walsh 1922]: the operation ⊞_n preserves real-rootedness. ✓

Outline of the Proof (so far)
1. 𝔼p(A) has real roots ✓; moreover 𝔼_P p( Σ_i P_i M P_i^T ) = p_M ⊞_n ⋯ ⊞_n p_M.
2. Prove that the roots are Ramanujan: λmax(p_M ⊞_n ⋯ ⊞_n p_M) ≤ 2√(d−1).
3. Prove λ2(A) ≤ λ2(𝔼p_A) with positive probability.

Step 2: the roots are Ramanujan.
Easy bound: λmax(p ⊞_n q) ≤ λmax(p) + λmax(q). <proof on board>
This only gives λmax(p_M ⊞_n ⋯ ⊞_n p_M) ≤ d, which is trivial.

Free Probability Detour: sums of independent random matrices
For independent Hermitian random matrices A, B: what is the distribution of eig(A + B)? There is no general method; it depends on the eigenvectors of A and B.
Now suppose A and B are independent and orthogonally invariant (A and QAQ^T have the same distribution). Intuition: the eigenvectors are in "generic position", so maybe one can say something about A + B...
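For readers who want to experiment with ⊞_n, here is a small sketch (my own, not from the slides) using an explicit coefficient formula for the finite free additive convolution, stated here from memory and therefore to be treated as an assumption: writing p(x) = Σ_i (−1)^i a_i x^{n−i} and q(x) = Σ_j (−1)^j b_j x^{n−j}, the coefficient of (−1)^k x^{n−k} in p ⊞_n q is Σ_{i+j=k} (n−i)!(n−j)!/(n!(n−k)!) · a_i b_j. The sketch convolves d copies of p_M and checks real-rootedness (Walsh 1922) and the largest root against 2√(d−1).

    # Finite free additive convolution via the (assumed) coefficient formula.
    import numpy as np
    from math import factorial

    def finite_free_conv(p, q):
        """p, q: coefficient arrays (highest degree first) of monic degree-n polynomials."""
        n = len(p) - 1
        a = [(-1) ** i * p[i] for i in range(n + 1)]   # a_i with the sign convention above
        b = [(-1) ** j * q[j] for j in range(n + 1)]
        out = np.zeros(n + 1)
        for k in range(n + 1):
            s = sum(factorial(n - i) * factorial(n - (k - i))
                    / (factorial(n) * factorial(n - k)) * a[i] * b[k - i]
                    for i in range(k + 1))
            out[k] = (-1) ** k * s                      # coefficient of x^(n-k)
        return out

    n, d = 10, 4                                        # p_M(x) = (x-1)^(n/2) (x+1)^(n/2)
    pM = np.poly([1.0] * (n // 2) + [-1.0] * (n // 2))
    conv = pM
    for _ in range(d - 1):
        conv = finite_free_conv(conv, pM)               # p_M convolved with itself d times
    roots = np.roots(conv)
    print("max |imag part| of roots:", np.abs(roots.imag).max())   # ~0: real-rooted
    print("largest root:", roots.real.max(), " vs 2*sqrt(d-1) =", 2 * np.sqrt(d - 1))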
Moment generating function (a.k.a. Cauchy / Stieltjes transform):
    α := G_A(z) = (1/n) 𝔼 tr (zI − A)^{−1} = (1/n) Σ_k 𝔼 tr(A^k) / z^{k+1}.
Cumulant generating function: the functional inverse, z = G_A^{(−1)}(α).
[Voiculescu'91] As n → ∞:
    G_{A+B}^{(−1)}(α) = G_A^{(−1)}(α) + G_B^{(−1)}(α) − 1/α.
Summary: a precise description of the limiting spectral distribution of sums of random matrices in generic position, which beats the triangle inequality. For example, it can be used to show that T_d + T_d is T_{2d}: the free sum of two copies of the infinite d-regular tree has the spectrum of T_{2d}. But we want extreme eigenvalues in finite dimensions...

Back to the proof: Finite R-Transform Inequalities

Free Convolution for Polynomials
Main tool: the Stieltjes/Cauchy transform of a polynomial. For a real-rooted polynomial p of degree n with roots λ1, …, λn, consider
    G_p(x) = (1/n) p'(x)/p(x) = (1/n) Σ_i 1/(x − λ_i),
and define, for α > 0,
    αmax(p) = max{ x : G_p(x) = α }.
Notice that αmax(p) > λ1(p), the largest root. [Figure: a picture of αmax.]

For a single matching, p_M(x) = (x + 1)^{n/2}(x − 1)^{n/2}, so
    G_{p_M}(x) = x/(x^2 − 1)   and   αmax(p_M) = (1 + √(1 + 4α^2)) / (2α).

For d matchings
Main inequality:
    αmax(p ⊞_n q) ≤ αmax(p) + αmax(q) − 1/α.
Meaning: when α = ∞ this is the same as the triangle inequality; when α < ∞ it can beat the triangle inequality when the roots are spread out. The proof characterizes the extremizers and relies on convexity.
Cf. Voiculescu: αmax(p ⊞ q) = αmax(p) + αmax(q) − 1/α (asymptotically, as n → ∞).
Applying the inequality d − 1 times:
    αmax(p_M ⊞_n ⋯ ⊞_n p_M) ≤ d (1 + √(1 + 4α^2))/(2α) − (d − 1)/α = 2√(d−1)   for α = √(d−1)/(d−2).
So the expected characteristic polynomial 𝔼p_{A_n} = 𝔼 det(xI − A_n) acts as a finite analogue of the infinite limit "A_∞": a finite analogue of free probability.
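A quick self-contained check (mine, not from the talk) of the last computation: the bound f(α) = d(1 + √(1 + 4α^2))/(2α) − (d − 1)/α should be minimized near α = √(d−1)/(d−2), where its value is the Ramanujan bound 2√(d−1).

    # Numerical check of the step-2 optimization over alpha.
    import numpy as np

    def bound(alpha, d):
        return d * (1 + np.sqrt(1 + 4 * alpha**2)) / (2 * alpha) - (d - 1) / alpha

    for d in (3, 5, 10):
        alphas = np.linspace(1e-3, 20, 200000)          # crude grid search over alpha > 0
        alpha_star = np.sqrt(d - 1) / (d - 2)
        print("d =", d,
              " grid min:", bound(alphas, d).min(),
              " value at sqrt(d-1)/(d-2):", bound(alpha_star, d),
              " 2*sqrt(d-1):", 2 * np.sqrt(d - 1))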
Outline of the Proof
1. 𝔼p(A) has real roots ✓
2. The roots are Ramanujan: λ2(𝔼p_A) ≤ 2√(d−1) ✓
3. Prove λ2(A) ≤ λ2(𝔼p_A) with positive probability.
Step 3 restated: there is some setting of the permutations P1, …, Pd such that
    λ2( p( Σ_i P_i M P_i^T ) ) ≤ λ2( 𝔼_P p( Σ_i P_i M P_i^T ) ).

Averaging Polynomials
Basic question: given an average of polynomials, when are the roots of the average related to the roots of the individual polynomials? Answer: certainly not always... but sometimes roots(avg) behaves like avg(roots).

A Sufficient Condition
Answer: when the polynomials have a common interlacing.
Definition: a degree-(n−1) polynomial q with roots μ1 ≥ ⋯ ≥ μ_{n−1} interlaces a degree-n polynomial p with roots λ1 ≥ ⋯ ≥ λn if λn ≤ μ_{n−1} ≤ λ_{n−1} ≤ ⋯ ≤ μ1 ≤ λ1. Monic polynomials p1, …, pm have a common interlacing if a single q interlaces all of them.
Theorem: if monic real-rooted p1, …, pm have a common interlacing, then some p_i has its largest root at most the largest root of the average (1/m) Σ_j p_j. <proof on board>

Many Swaps
Generate the random permutations using swaps (random transpositions). Conditioning one swap at a time produces polynomials with common interlacings, so the theorem applies inductively across the nested expectations 𝔼_{swap 1} 𝔼_{swap 2} ⋯ 𝔼_{swap k} p( Σ_i P_i M P_i^T ).
Conclusion: there is some setting of P1, …, Pd such that
    λ2( p( Σ_i P_i M P_i^T ) ) ≤ λ2( 𝔼_P p( Σ_i P_i M P_i^T ) ) ≤ 2√(d−1).

Outline of the Proof
1. 𝔼p(A) has real roots; 𝔼_P p( Σ_i P_i M P_i^T ) = p_M ⊞_n ⋯ ⊞_n p_M. ✓  (Quadrature + Walsh 1922)
2. The roots are Ramanujan: λ2(𝔼p_A) ≤ 2√(d−1). ✓  (Finite analogue of Voiculescu'91)
3. λ2(A) ≤ λ2(𝔼p_A) with positive probability. ✓  (Interlacing families)
[Diagram: the infinite "A_∞" with the Kesten–McKay density d√(4(d−1) − x^2)/(2π(d^2 − x^2)) on one side, the finite A_n on the other, linked by the expected characteristic polynomial 𝔼 det(xI − A_n) (finite free probability) and interlacing families.]

Is this a coincidence?
[Marcus'15]: If μ and ν are discrete measures, each uniform on n real points, with characteristic polynomials p and q, then
    lim_{n→∞} G^{(−1)}_{p ⊞_n q}(α) = G^{(−1)}_{μ ⊞ ν}(α)
for all sufficiently large real α, where the left side uses the finite convolution and the right side Voiculescu's free convolution. (The exact statement is a bit more technical.) Similar results hold for the S-transform; there is also a free CLT.

High-level question
Why do expected characteristic polynomials give sharp bounds? (Also in: 2-covers, Kadison–Singer, thin trees [Anari–Oveis Gharan'15].) What are expected characteristic polynomials? Heuristic observation + Marcus'15: they are finite approximations of the asymptotic limits of finite random matrices.
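Here is a toy numerical illustration (my own construction; the specific roots are arbitrary) of the averaging phenomenon used in step 3: the two cubics below share the common interlacing (x − 1)(x − 3), so by the theorem at least one of them has its largest root bounded by the largest root of their average.

    # Toy demo: common interlacing implies some polynomial's max root <= max root of the average.
    import numpy as np

    p1 = np.poly([0.5, 2.0, 3.5])        # roots interlaced by 1 and 3
    p2 = np.poly([0.9, 2.8, 4.5])        # roots also interlaced by 1 and 3
    avg = (p1 + p2) / 2                  # average of the two monic cubics

    def max_root(c):
        return np.roots(c).real.max()

    print("max roots of p1, p2:", max_root(p1), max_root(p2))
    print("max root of average:", max_root(avg))
    print("min_i maxroot(p_i) <= maxroot(avg):",
          min(max_root(p1), max_root(p2)) <= max_root(avg))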
Open questions
• More finite free probability notions: entropy
• Nonbipartite Ramanujan graphs
• Other random graph models
• Algorithms
• Bounds on the probability (conjectured to be 52% by Novikov, Miller, Sabelli '06)
• More connections between the infinite and the finite
• Applications to free probability / von Neumann algebras?