A common way to lower bound the expansion of a graph is to
look at the second smallest eigenvalue of its Laplacian matrix.
Also known as the easy direction of Cheeger's inequality, this
bound becomes too weak when the expansion is o(1). In 2004...
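The spectral quantity in question can be sketched as follows: form the Laplacian L = D − A and read off its second smallest eigenvalue. The cycle graph (an assumed example, not from the abstract) shows why the bound degrades: its expansion is Θ(1/n) but λ₂ is Θ(1/n²), so a spectral lower bound of order λ₂ badly undershoots the truth.

```python
import numpy as np

def laplacian_second_eigenvalue(adj):
    """Second smallest eigenvalue of the unnormalized Laplacian L = D - A."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    eigs = np.sort(np.linalg.eigvalsh(lap))
    return eigs[1]

# Cycle graph on n vertices: lambda_2 = 2(1 - cos(2*pi/n)) = Theta(1/n^2),
# while the edge expansion is Theta(1/n) -- so the spectral lower bound
# is far from tight once the expansion is o(1).
n = 100
adj = np.zeros((n, n))
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1

lam2 = laplacian_second_eigenvalue(adj)
print(lam2)  # ≈ 0.0039
```

This is only an illustrative computation; the exact form of the Cheeger-type bound depends on the normalization of the Laplacian and the notion of expansion used.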
Deep learning, a modern incarnation of neural networks, is
increasingly seen as a promising way to carry out AI tasks such as
speech recognition and image recognition. Most current algorithms are
heuristic and have no provable guarantees. This talk will...
Introduction, Nathan Seiberg, Professor in the
School of Natural Sciences, Institute for Advanced Study
Remarks, James Wolfensohn, Trustee Chairman
Emeritus, Institute for Advanced Study