Explicit SoS Lower Bounds from High Dimensional Expanders
Where are the hard problems? In the absence of a proof of P ≠ NP, researchers have spent years proving unconditional lower bounds for constrained models of computation. In time, a distinct theme arose: random problems (in particular, random combinatorial optimization problems) are typically hard, fooling even our most powerful algorithmic frameworks such as the Sum-of-Squares hierarchy. Despite this, finding such a hard problem explicitly has remained a major challenge: for over 20 years, no better method was known than brute-force search. In this talk, I will describe recent joint work with Ting-Chun Lin resolving this problem: a construction of explicit and near-optimally hard ‘constraint satisfaction problems’ for Sum-of-Squares, based on topological high dimensional expanders.
In more detail, in the first part of this talk I will give a brief ‘motivational’ introduction to Sum-of-Squares, review Grigoriev’s 1999 result proving hardness of random XOR instances for SoS refutation, and discuss its close connection with the ubiquitous notion of graph expansion. In the second part of the talk, building on work of Dinur, Filmus, Harsha, and Tulsiani (ITCS 2021), I will show how high dimensional expansion arises as a natural method of de-randomizing Grigoriev’s proof. Finally, I will discuss the existence of explicit constructions of high dimensional expanders, completing the proof.
No background in Sum-of-Squares or high dimensional expansion is required!