Abstract
A family of random matrices $\boldsymbol{X}^N=(X_1^N,\ldots,X_d^N)$ is said to converge strongly to a family of bounded operators $\boldsymbol{x}=(x_1,\ldots,x_d)$ when $\|P(\boldsymbol{X}^N,\boldsymbol{X}^{N*})\|\to\|P(\boldsymbol{x}, \boldsymbol{x}^*)\|$ for every noncommutative polynomial $P$. This phenomenon plays a key role in several recent breakthroughs on random graphs, geometry, and operator algebras. However, proofs of strong convergence are notoriously delicate and have relied largely on problem-specific methods.
In this paper, we develop a new approach to strong convergence that uses only soft arguments. Our method exploits the fact that for many natural models, the expected trace of $P(\boldsymbol{X}^N,\boldsymbol{X}^{N*})$ is a rational function of $\frac{1}{N}$ whose lowest order asymptotics are easily understood. We develop a
general technique to deduce strong convergence directly from these inputs using the inequality of A. and V. Markov for univariate polynomials and elementary Fourier analysis.
To illustrate the method, we present the following applications.
- We give a short proof of the result of Friedman that random regular graphs have a near-optimal spectral gap, and obtain a sharp understanding of the large deviation probabilities of the second eigenvalue.
- We prove a strong quantitative form of the strong convergence property of random permutation matrices due to Bordenave and Collins.
- We extend the above to any stable representation of the symmetric group, providing many new examples of the strong convergence phenomenon.