Overview

This post examines Slepian’s Lemma (1962), a fundamental comparison principle in the theory of Gaussian processes. It formalizes the intuition that the more independent a set of Gaussian variables is, the larger their expected maximum will be. We explore the theorem’s rigorous statement, its generalization via the Sudakov-Fernique Inequality, and provide a detailed proof using the Gaussian Interpolation Method.

🏷️ The Intuition: Correlation and Supremum

Consider a collection of centered Gaussian random variables $X_1, \dots, X_n$. We are interested in the behavior of the supremum $\max_{1 \le i \le n} X_i$ and, in particular, its expectation $\mathbb{E} \max_{1 \le i \le n} X_i$.

The Correlation Effect

  • i.i.d. Case: If the variables are independent with $X_i \sim \mathcal{N}(0, 1)$, the expected maximum scales as $\mathbb{E} \max_i X_i \sim \sqrt{2 \log n}$.
  • Perfect Correlation: If $X_1 = X_2 = \dots = X_n$, they move as a single unit, and $\mathbb{E} \max_i X_i = \mathbb{E} X_1 = 0$.
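
These two regimes are easy to see numerically. The following Monte Carlo sketch (sample sizes chosen arbitrarily for illustration) estimates the expected maximum in both cases:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 1000, 2000

# i.i.d. case: the expected maximum grows like sqrt(2 log n).
iid_max = rng.standard_normal((trials, n)).max(axis=1).mean()

# Perfectly correlated case: X_1 = ... = X_n, so max = X_1 and E[max] = 0.
corr_max = rng.standard_normal(trials).mean()

print(f"i.i.d.     E[max] ~ {iid_max:.3f}  (sqrt(2 log n) = {np.sqrt(2 * np.log(n)):.3f})")
print(f"correlated E[max] ~ {corr_max:.3f}")
```

The i.i.d. estimate sits just below $\sqrt{2 \log n}$ (the asymptotic rate has lower-order corrections), while the fully correlated estimate hovers near zero.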

Slepian’s insight was that positive correlation between variables “pulls” them together, reducing the effective volume they cover and thus shrinking the expected maximum.

🏷️ Slepian’s Inequality

Slepian’s Lemma (Slepian, 1962) provides a formal comparison between two Gaussian processes based on their covariance structures.

Slepian’s Lemma (1962)

Let $X = (X_1, \dots, X_n)$ and $Y = (Y_1, \dots, Y_n)$ be centered Gaussian random vectors in $\mathbb{R}^n$ such that for all $i$:

$$\mathbb{E} X_i^2 = \mathbb{E} Y_i^2,$$

and for all $i \neq j$:

$$\mathbb{E} X_i X_j \leq \mathbb{E} Y_i Y_j.$$

Then for any real numbers $\lambda_1, \dots, \lambda_n$:

$$\mathbb{P}\left( \bigcup_{i=1}^{n} \{ X_i > \lambda_i \} \right) \geq \mathbb{P}\left( \bigcup_{i=1}^{n} \{ Y_i > \lambda_i \} \right).$$

In particular, this implies $\mathbb{E} \max_i X_i \geq \mathbb{E} \max_i Y_i$.
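
A quick numerical illustration of the comparison. The dimension, the two correlation levels (0.1 for $X$, 0.8 for $Y$), and the sample size are arbitrary choices for this sketch; both vectors have unit variances, as the lemma requires:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 3, 50_000

# Equal variances; Y has uniformly larger off-diagonal covariances than X.
cov_x = np.full((n, n), 0.1); np.fill_diagonal(cov_x, 1.0)
cov_y = np.full((n, n), 0.8); np.fill_diagonal(cov_y, 1.0)

x = rng.multivariate_normal(np.zeros(n), cov_x, size=trials)
y = rng.multivariate_normal(np.zeros(n), cov_y, size=trials)

emax_x = x.max(axis=1).mean()
emax_y = y.max(axis=1).mean()
print(emax_x, emax_y)  # the less correlated process has the larger expected max
```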

🏷️ Detailed Proof: The Interpolation Method

The most powerful proof of Slepian-type inequalities relies on Gaussian Interpolation, a technique that allows us to continuously deform one process into another while tracking the evolution of the expectation.
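
The central computation can be sketched as follows; this is the standard interpolation identity, stated for independent copies $X \perp Y$ and a smooth test function $f$:

```latex
% Gaussian interpolation between X and Y (taken independent):
\[
  Z(t) = \sqrt{t}\, X + \sqrt{1-t}\, Y, \qquad Z(0) = Y, \quad Z(1) = X.
\]
% Gaussian integration by parts gives, for smooth f of moderate growth,
\[
  \frac{d}{dt}\, \mathbb{E} f(Z(t))
    = \frac{1}{2} \sum_{i \neq j}
      \left( \mathbb{E} X_i X_j - \mathbb{E} Y_i Y_j \right)
      \mathbb{E}\, \partial_i \partial_j f(Z(t)),
\]
% where the diagonal terms vanish because the variances are equal.
% For a smooth approximation of f(x) = max_i x_i, the mixed partials
% satisfy \partial_i \partial_j f \le 0 for i \ne j, while the covariance
% gaps satisfy E X_i X_j - E Y_i Y_j \le 0, so the derivative is >= 0:
\[
  \mathbb{E} \max_i Y_i = \mathbb{E} f(Z(0))
    \;\le\; \mathbb{E} f(Z(1)) = \mathbb{E} \max_i X_i.
\]
```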

🏷️ Generalization: Sudakov-Fernique Inequality

A significant limitation of Slepian’s original result is the requirement of equal variances. The Sudakov-Fernique Inequality (Fernique, 1975) generalizes this to compare increments directly.

Sudakov-Fernique Inequality

Let $(X_t)_{t \in T}$ and $(Y_t)_{t \in T}$ be centered Gaussian processes such that for all $s, t \in T$:

$$\mathbb{E}(X_s - X_t)^2 \leq \mathbb{E}(Y_s - Y_t)^2.$$

Then $\mathbb{E} \sup_{t \in T} X_t \leq \mathbb{E} \sup_{t \in T} Y_t$.
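
This is strictly more flexible than Slepian’s Lemma: the variances may differ. A minimal numerical sketch, assuming two i.i.d. families at different scales (so the equal-variance hypothesis fails, but the increment condition holds):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 10, 50_000

# X: i.i.d. N(0, 0.25); Y: i.i.d. N(0, 1).  Variances differ, so Slepian
# does not apply, but every increment satisfies
#   E(X_s - X_t)^2 = 0.5 <= 2.0 = E(Y_s - Y_t)^2.
x = 0.5 * rng.standard_normal((trials, n))
y = rng.standard_normal((trials, n))

emax_x = x.max(axis=1).mean()
emax_y = y.max(axis=1).mean()
print(emax_x, emax_y)  # Sudakov-Fernique predicts emax_x <= emax_y
```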

Proof Insight: Variational Reformulation

The proof of Sudakov-Fernique follows the same interpolation logic but handles the (possibly unequal) variance terms by recognizing that covariances can be expressed through variances and incremental variances:

$$\mathbb{E} X_i X_j = \frac{1}{2} \left( \mathbb{E} X_i^2 + \mathbb{E} X_j^2 - \mathbb{E}(X_i - X_j)^2 \right).$$

Substituting this into the interpolation formula reveals that the increase in incremental variance compensates for any individual variance growth, maintaining the correct sign of the derivative for the supremum functional.
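
Concretely, the cancellation can be sketched as follows (a standard step in the interpolation proof; $\Sigma^X$ and $\Sigma^Y$ denote the two covariance matrices):

```latex
% Write each covariance gap via the polarization identity:
\[
  \Sigma^X_{ij} - \Sigma^Y_{ij}
    = \tfrac{1}{2}\big(\Sigma^X_{ii} - \Sigma^Y_{ii}\big)
    + \tfrac{1}{2}\big(\Sigma^X_{jj} - \Sigma^Y_{jj}\big)
    - \tfrac{1}{2}\big(\mathbb{E}(X_i - X_j)^2 - \mathbb{E}(Y_i - Y_j)^2\big).
\]
% For (a smooth approximation of) f(x) = max_i x_i, the gradient is a
% probability vector, so the rows of the Hessian sum to zero:
\[
  \sum_j \partial_j f \equiv 1
  \quad\Longrightarrow\quad
  \sum_j \partial_i \partial_j f \equiv 0.
\]
% Hence the pure-variance terms cancel in the interpolation derivative,
% and only the incremental-variance gaps E(X_i - X_j)^2 - E(Y_i - Y_j)^2
% survive, each multiplied by a nonpositive mixed partial.
```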

🌊 Advanced Application: Random Matrix Theory

The Sudakov-Fernique inequality provides a remarkably simple proof for bounding the expected operator norm of a Gaussian random matrix.

Expected Operator Norm of a Gaussian Matrix

Let $A$ be an $m \times n$ matrix with i.i.d. $\mathcal{N}(0, 1)$ entries. The operator norm is:

$$\|A\|_{\mathrm{op}} = \sup_{u \in S^{n-1},\, v \in S^{m-1}} \langle A u, v \rangle.$$

1. The $X$-process: Define $X_{u,v} = \langle A u, v \rangle$ for $(u, v) \in S^{n-1} \times S^{m-1}$. The increments satisfy $\mathbb{E}(X_{u,v} - X_{u',v'})^2 = \|u v^\top - u' v'^\top\|_F^2 \leq \|u - u'\|_2^2 + \|v - v'\|_2^2$.

2. The $Y$-process: Let $g \sim \mathcal{N}(0, I_n)$ and $h \sim \mathcal{N}(0, I_m)$ be independent Gaussian vectors. Define $Y_{u,v} = \langle g, u \rangle + \langle h, v \rangle$. The increments are $\mathbb{E}(Y_{u,v} - Y_{u',v'})^2 = \|u - u'\|_2^2 + \|v - v'\|_2^2$.

3. Comparison: By Sudakov-Fernique, $\mathbb{E} \|A\|_{\mathrm{op}} \leq \mathbb{E} \sup_{u,v} Y_{u,v} = \mathbb{E} \|g\|_2 + \mathbb{E} \|h\|_2 \leq \sqrt{n} + \sqrt{m}$.
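
A Monte Carlo sanity check of the resulting bound $\mathbb{E}\|A\|_{\mathrm{op}} \leq \sqrt{m} + \sqrt{n}$; the dimensions and trial count below are arbitrary choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, trials = 50, 30, 200

# Largest singular value of each Gaussian sample matrix.
norms = [np.linalg.norm(rng.standard_normal((m, n)), ord=2) for _ in range(trials)]
bound = np.sqrt(m) + np.sqrt(n)
print(np.mean(norms), bound)  # empirical E||A|| vs the sqrt(m) + sqrt(n) bound
```

The empirical mean falls slightly below the bound, which is known to be tight up to lower-order corrections.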

📉 Advanced Application: Sudakov Minoration

While Dudley’s Theorem provides an upper bound via metric entropy, Slepian’s logic allows us to establish a lower bound.

Sudakov Minoration

If $T$ contains $N$ points that are at least $\varepsilon$-separated in the Gaussian metric $d(s, t) = \sqrt{\mathbb{E}(X_s - X_t)^2}$, we can compare the process to $N$ independent Gaussians with variance $\varepsilon^2/2$, which yields

$$\mathbb{E} \sup_{t \in T} X_t \geq c\, \varepsilon \sqrt{\log N}$$

for a universal constant $c > 0$.

Significance: This result (Sudakov, 1971) is the dual to Dudley’s integral. It proves that if a set has high metric entropy at a single scale, the process MUST fluctuate significantly. This is refined by Talagrand’s $\gamma_2$ functional, which “interpolates” between Sudakov and Dudley to achieve sharpness.
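
The comparison step can be illustrated numerically. The sketch below uses an equicorrelated process (an assumption made purely for this example), for which every pair of points is exactly $\varepsilon$-separated, and compares it against the i.i.d. process with variance $\varepsilon^2/2$ from the minoration argument:

```python
import numpy as np

rng = np.random.default_rng(4)
n, rho, trials = 20, 0.5, 50_000

# Equicorrelated process: d(s, t)^2 = 2(1 - rho) for every pair s != t.
z = rng.standard_normal((trials, 1))   # shared component
w = rng.standard_normal((trials, n))   # idiosyncratic components
x = np.sqrt(rho) * z + np.sqrt(1 - rho) * w

eps = np.sqrt(2 * (1 - rho))
# Comparison process from the minoration argument: i.i.d. N(0, eps^2 / 2),
# whose increments match E(X_s - X_t)^2 = eps^2 exactly.
y = (eps / np.sqrt(2)) * rng.standard_normal((trials, n))

emax_x = x.max(axis=1).mean()
emax_y = y.max(axis=1).mean()
print(emax_x, emax_y)  # Sudakov-Fernique gives emax_x >= emax_y
```

For the equicorrelated process the two expectations coincide, which shows this comparison is tight in at least one case.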

📝 Notes

  • Gordon’s Inequality: A further refinement by Yehoram Gordon (Gordon, 1985) compares expected min-max functionals of the form $\mathbb{E} \min_i \max_j X_{ij}$, which is essential for studying the smallest singular values of random matrices.
  • The Max Functional: The key to Slepian’s Lemma is that the “max” function is sub-modular (its off-diagonal second derivatives satisfy $\partial_i \partial_j f \leq 0$ for $i \neq j$, in the smoothed sense). This ensures that increasing correlations (moving variables closer together) always reduces the expected maximum.

📚 References

🐻  Fernique, X. 1975. Régularité des trajectoires des fonctions aléatoires gaussiennes. École d’Été de Probabilités de Saint-Flour IV-1974, Lecture Notes in Mathematics 480, 1–96.
🐻  Gordon, Y. 1985. Some inequalities for Gaussian processes and applications. Israel Journal of Mathematics 50(4), 265–289.
🐻  Slepian, D.S. 1962. The one-sided barrier problem for Gaussian noise. Bell System Technical Journal 41(2), 463–501.
🐻  Sudakov, V.N. 1971. Gaussian random processes and measures of solid angles in Hilbert space. Doklady Akademii Nauk SSSR 197(1), 43–45.