In 1975 Goppa suggested a general method for constructing
error-correcting codes based on algebraic geometry. In a long line of
research such codes were constructed, constituting a rare
example of a construction that beats the probabilistic...
I will continue last week's discussion of this model, this time
giving sketches of proofs of some of the theorems. No technical
background is necessary for this lecture, but for motivation and
background it is good to consult the Tue Feb 11 talk...
The goal of computationally sound symbolic security is to create
formal systems of cryptography which have a sound interpretation
with respect to complexity-based notions of security. While there
has been much progress in the development of such...
The discrete Fourier transform is a widely used tool in the
analysis of Boolean functions. One can view the Fourier transform
of a Boolean function $f:\{0,1\}^n \to \{-1,1\}$ as a distribution
over sets $S \subseteq [n]$, since by Parseval's identity the squared
Fourier coefficients sum to 1. The Fourier-tail at level $k...
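As a small illustration of this viewpoint (not part of the abstract's results), the sketch below computes the Fourier coefficients of the 3-bit majority function, a standard example, and checks that their squares sum to 1, so they form a distribution over subsets:

```python
from itertools import product

n = 3

def f(x):  # majority on 3 bits, as a ±1-valued Boolean function
    return 1 if sum(x) >= 2 else -1

def chi(S, x):  # parity character chi_S(x) = (-1)^{sum_{i in S} x_i}
    return (-1) ** sum(x[i] for i in S)

def fourier_coeff(S):
    # hat f(S) = E_x [ f(x) * chi_S(x) ], the average over all 2^n inputs
    return sum(f(x) * chi(S, x) for x in product((0, 1), repeat=n)) / 2**n

subsets = [tuple(i for i in range(n) if (mask >> i) & 1) for mask in range(2**n)]
coeffs = {S: fourier_coeff(S) for S in subsets}
total = sum(c * c for c in coeffs.values())
print(total)  # → 1.0 by Parseval, so {hat f(S)^2} is a distribution over sets
```

Since $f$ is $\pm 1$-valued, Parseval gives $\sum_S \hat{f}(S)^2 = \mathbb{E}[f^2] = 1$ regardless of which function is chosen.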
Reed-Muller codes encode an $m$-variate polynomial of degree $r$ by
evaluating it on all points in $\{0,1\}^m$. Their distance is
$2^{m-r}$, and so they cannot correct more than that many
errors/erasures in the worst case. For random errors one may
hope...
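To make the encoding concrete, here is a minimal sketch of binary Reed-Muller evaluation; the function name `rm_encode` and the monomial representation are illustrative choices, not notation from the talk:

```python
from itertools import product

def rm_encode(monomials, m):
    """Encode the F_2 polynomial given as a list of monomials (each a tuple
    of variable indices) by evaluating it at every point of {0,1}^m."""
    return [sum(all(x[i] for i in mono) for mono in monomials) % 2
            for x in product((0, 1), repeat=m)]

m, r = 4, 2
# The single monomial x0*x1 has degree r; its codeword has weight
# 2^(m-r), which matches the minimum distance of RM(r, m).
cw = rm_encode([(0, 1)], m)
print(sum(cw))  # → 4 == 2**(m-r)
```

The codeword length is $2^m$ (here 16), and the minimum-weight codeword above shows why the distance bound $2^{m-r}$ is tight.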
We characterize all complex-valued (Lebesgue) integrable functions
$f$ on $[0,1]^m$ such that $f$ vanishes when integrated over the
product of $m$ measurable sets which partition $[0,1]$ and have
prescribed Lebesgue measures $\alpha_1,\ldots,\alpha...
We prove an average-case depth hierarchy theorem for Boolean
circuits over the standard basis of AND, OR, and NOT gates. Our
hierarchy theorem says that for every $d \geq 2$, there is an
explicit $n$-variable Boolean function $f$, computed by a...
What is the distribution of the number of triangles in the random
graph $G(n, 1/2)$? It was known for a long time that this
distribution obeys a central limit theorem: from the point of view
of large intervals (~ standard-deviation length), the...
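As a quick sanity check of the first moment (the talk concerns the finer distributional behavior), one can sample triangle counts in $G(n,1/2)$ and compare their empirical mean to the expectation $\binom{n}{3}/8$; the parameters below are illustrative:

```python
import random
from itertools import combinations

def triangle_count(n, rng):
    # Sample G(n, 1/2) as a set of edges, then count its triangles.
    adj = {frozenset(e) for e in combinations(range(n), 2) if rng.random() < 0.5}
    return sum(1 for t in combinations(range(n), 3)
               if all(frozenset(p) in adj for p in combinations(t, 2)))

rng = random.Random(0)
n, trials = 20, 200
counts = [triangle_count(n, rng) for _ in range(trials)]
mean = sum(counts) / trials
expected = (n * (n - 1) * (n - 2) / 6) / 8  # E[#triangles] = C(n,3) * (1/2)^3
print(mean, expected)  # empirical mean should be near 142.5 for n = 20
```

Each of the $\binom{n}{3}$ vertex triples forms a triangle with probability $(1/2)^3$, which gives the expectation by linearity.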
The Resolution proof system is perhaps the simplest and most
widely used system in verification and automated theorem
proving. It was introduced by Davis and Putnam in 1960. The study
of its efficiency, both in terms of proof length of natural...
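To illustrate the rule itself, here is a minimal sketch of resolution on clauses encoded as frozensets of signed integer literals; the three-clause refutation is a toy example, not one from the talk:

```python
def resolve(c1, c2, var):
    """Resolution rule: from a clause containing var and one containing
    -var, derive their union minus the clashing pair of literals."""
    assert var in c1 and -var in c2
    return (c1 | c2) - {var, -var}

# Refute the unsatisfiable CNF {x}, {-x or y}, {-y}
# (1 encodes the variable x, 2 encodes y; negative means negated).
c1, c2, c3 = frozenset({1}), frozenset({-1, 2}), frozenset({-2})
d1 = resolve(c1, c2, 1)     # derives {y}
empty = resolve(d1, c3, 2)  # derives the empty clause: refutation complete
print(d1, empty)  # → frozenset({2}) frozenset()
```

Deriving the empty clause certifies that the original CNF is unsatisfiable, which is exactly how Resolution is used as a refutation system.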
Tensor decompositions have been the key algorithmic component in
provable learning of a wide range of hidden-variable models, such as
topic models, Gaussian mixture models, independent component
analysis, and dictionary learning. Despite their success...
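As one concrete instance of such an algorithmic component, the sketch below uses Jennrich-style simultaneous diagonalization to recover the factors of a random rank-3 tensor; the dimensions and variable names are illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 6, 3
# Random factor matrices; the rank-k tensor is T = sum_i a_i ⊗ b_i ⊗ c_i
A, B, C = (rng.standard_normal((d, k)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Jennrich's algorithm: take two randomly weighted slices of T
w1, w2 = rng.standard_normal(d), rng.standard_normal(d)
M1 = np.einsum('ijk,k->ij', T, w1)  # equals A diag(C^T w1) B^T
M2 = np.einsum('ijk,k->ij', T, w2)  # equals A diag(C^T w2) B^T
# M1 pinv(M2) = A diag(ratios) pinv(A), so the eigenvectors belonging to
# the k nonzero eigenvalues recover the columns of A up to sign/scale.
vals, vecs = np.linalg.eig(M1 @ np.linalg.pinv(M2))
top = np.argsort(-np.abs(vals))[:k]
recovered = vecs[:, top]

# Check alignment: every true column of A matches some eigenvector.
align = []
for i in range(k):
    a = A[:, i] / np.linalg.norm(A[:, i])
    cos = np.abs(recovered.conj().T @ a) / np.linalg.norm(recovered, axis=0)
    align.append(float(cos.max()))
print(align)  # each entry should be close to 1.0
```

Uniqueness of the decomposition (and hence of the recovered factors) holds generically because the random columns of $A$, $B$, and $C$ are linearly independent with probability 1.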