Computer Science/Discrete Mathematics Seminar II

Concentration on HDX: Derandomization Beyond Chernoff

Chernoff's bound states that for any $A \subset [N]$, the probability that a random $k$-tuple $s \in {[N] \choose k}$ fails to correctly `sample' $A$ (i.e. that the density of $A$ in $s$ deviates significantly from its mean) decays exponentially in the dimension $k$. In 1987, Ajtai, Komlós, and Szemerédi proved the "Expander-Chernoff Theorem", a powerful derandomization of Chernoff's bound stating that one can replace ${[N] \choose k}$ with the significantly sparser family $X_G(k) \subsetneq {[N] \choose k}$ of length-$k$ walks on an expander graph $G$ while maintaining essentially the same concentration. Their result, which allows amplification without significant blow-up in size or randomness, has since become a mainstay in theoretical computer science, with breakthrough applications in derandomization, coding, pseudorandomness, cryptography, and complexity.
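
For orientation, one standard way to state both bounds (a sketch only: $c > 0$ is an unspecified absolute constant, and in the walk version the constant depends on the spectral expansion of $G$) is that for any $A \subseteq [N]$ of density $\mu$ and any $\varepsilon > 0$,
\[
\Pr_{s \sim {[N] \choose k}}\left[\,\left|\frac{|s \cap A|}{k} - \mu\right| \ge \varepsilon\,\right] \le 2e^{-c\varepsilon^2 k},
\]
and the Expander-Chernoff Theorem asserts the same exponential decay when $s$ is drawn instead from the much sparser family $X_G(k)$ of length-$k$ walks on $G$.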

One natural way to view the AKS theorem is that expander walks are pseudorandom with respect to functions of their vertices, that is, against "degree-1" functions. In modern complexity, especially in the context of hardness amplification and PCPs, we often need concentration against higher-degree functions, e.g. functions of edges or triples. Unfortunately, due to their inherent low-dimensionality, walks on expanders are not pseudorandom even at degree 2, and the construction of such a derandomized object has remained largely open.
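
To make the notion of degree concrete (one standard reading; the talk's exact formalism may differ), degree-1 concentration asks that for every fixed $f : [N] \to [0,1]$, with high probability over $s$,
\[
\frac{1}{k}\sum_{v \in s} f(v) \approx \mathbb{E}_{v \in [N]}[f(v)],
\]
while degree-2 concentration asks the analogous statement for averages of $f : {[N] \choose 2} \to [0,1]$ over the pairs contained in $s$. For expander walks the latter fails badly: taking $f$ to be the indicator of pairs at distance at most $k$ in $G$, every pair inside a walk has value $1$, while almost no pair of a uniformly random $k$-set does when $G$ is sparse and $N$ is large.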

In 2017, Dinur and Kaufman offered a partial resolution to this question with the introduction of (spectral) high dimensional expanders (HDX), a derandomized family satisfying Chebyshev-type (inverse-polynomial) concentration for higher-degree functions. Their work led to breakthrough applications in agreement testing and PCPs (Bafna and Minzer 2024), but left an exponential gap between their concentration and the bounds known for the complete hypergraph ${[N] \choose k}$ that are needed for other applications. In this talk, we close this gap and prove that (strong enough) HDX indeed match the concentration of ${[N] \choose k}$. Time permitting, we will additionally discuss the relation of these bounds to agreement testing and a powerful analytic inequality called reverse hypercontractivity.
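
Quantitatively, the gap in question is (very roughly, as a sketch) between tails of the form
\[
\Pr\bigl[\,|\overline{f}(s) - \mu| \ge \varepsilon\,\bigr] \le \frac{C}{\varepsilon^{2}k}
\quad\text{(Chebyshev-type)}
\qquad\text{versus}\qquad
\Pr\bigl[\,|\overline{f}(s) - \mu| \ge \varepsilon\,\bigr] \le 2e^{-c\varepsilon^{2}k}
\quad\text{(Chernoff-type)},
\]
where $\overline{f}(s)$ denotes the average of a bounded higher-degree function $f$ over the relevant sub-tuples of $s$, $\mu$ its true mean, and $c, C > 0$ constants that may depend on the degree and the quality of expansion.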

Based on joint work with Yotam Dikstein.