Let K be a convex body in Rn. In some cases (say when K is a
cube), we can tile Rn using translates of K. However, in general
(say when K is a ball) this is impossible. Nevertheless, we show
that one can always cover space "smoothly" using translates...
Graph Crossing Number is a fundamental and extensively studied
problem with wide-ranging applications. In this problem, the goal
is to draw an input graph G in the plane so as to minimize the
number of crossings between the images of its edges. The...
Determining the complexity of matrix multiplication is a
fundamental problem of theoretical computer science. It is
popularly conjectured that matrices can be multiplied in
nearly-linear time. If true, this conjecture would yield
similarly-fast...
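The known upper bounds are already well below the naive cubic count: Strassen's classical algorithm multiplies two n x n matrices with only 7 recursive block products instead of 8, giving O(n^{log2 7}) ~ O(n^{2.81}) arithmetic operations. A minimal sketch of that recursion (assuming n is a power of two; the function name is illustrative, not from the abstract):

```python
import numpy as np

def strassen(A, B):
    """Strassen's recursive matrix multiplication for n a power of two.
    Seven recursive products instead of eight yields O(n^log2(7))."""
    n = A.shape[0]
    if n <= 2:          # base case: ordinary multiplication
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```

The conjecture in the abstract (exponent 2 + o(1), i.e. nearly linear in the input size n^2) would improve far beyond this classical bound.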
A Locally Decodable Code (LDC) is an error-correcting code
which allows the decoding of a single message symbol from a few
queries to a possibly corrupted encoding. LDCs can be
constructed, for example, from low degree polynomials over a
finite...
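One classical (though rate-inefficient) instance is the Hadamard code, the degree-1 case of the low-degree-polynomial construction: a message x in F_2^n is encoded by its inner products with all vectors a, and x_i is recovered from two queries because <x, a> xor <x, a xor e_i> = x_i. A sketch (function names are illustrative):

```python
import random

def hadamard_encode(x_bits):
    """Encode x in F_2^n as the list of inner products <x, a> over all a."""
    n = len(x_bits)
    return [sum(x_bits[j] & ((a >> j) & 1) for j in range(n)) % 2
            for a in range(2 ** n)]

def local_decode(codeword, i, n):
    """Recover x_i from two queries at random positions a and a xor e_i.
    By linearity, <x, a> xor <x, a xor e_i> = <x, e_i> = x_i; with a random
    a this succeeds with high probability even if a small fraction of the
    codeword is corrupted."""
    a = random.randrange(2 ** n)
    return codeword[a] ^ codeword[a ^ (1 << i)]
```

Note the exponential length 2^n; much of the LDC literature is about trading query count against this blow-up.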
Over a high-dimensional vector space F_p^n over a fixed finite
field F_p, the inverse theorem for the Gowers norm asserts that a
bounded function f on this space that has a large Gowers U^{k+1} norm
must necessarily correlate with a phase polynomial e(P)...
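For concreteness, the simplest case (k = 1, n = 1, i.e. the U^2 norm over Z_p) can be computed by brute force from the standard defining average ||f||_{U^2}^4 = E_{x,h1,h2} f(x) conj(f(x+h1)) conj(f(x+h2)) f(x+h1+h2); a linear phase has the maximal norm 1, while a quadratic phase already has small U^2 norm (function names are illustrative):

```python
import cmath

def e(t):
    """The standard additive character e(t) = exp(2*pi*i*t)."""
    return cmath.exp(2j * cmath.pi * t)

def gowers_u2(f, p):
    """Brute-force U^2 norm of f: Z_p -> C from its defining average
    over all parallelepipeds (x, x+h1, x+h2, x+h1+h2)."""
    total = 0
    for x in range(p):
        for h1 in range(p):
            for h2 in range(p):
                total += (f(x)
                          * f((x + h1) % p).conjugate()
                          * f((x + h2) % p).conjugate()
                          * f((x + h1 + h2) % p))
    return (total / p ** 3).real ** 0.25
```

The inverse theorem says this picture persists in general: large U^{k+1} norm forces correlation with some degree-k phase polynomial.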
A seminal result in learning theory characterizes the PAC
learnability of binary classes through the VC dimension. Extending
this characterization to the general multiclass setting has been
open since the late 1980s.
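In the binary case the characterization is purely combinatorial: a class shatters a finite set if it realizes all 2^d labelings of that set, and the VC dimension is the largest size of a shattered set. A brute-force illustration for finite classes (names are illustrative, and this is the binary notion only, not the multiclass extension the abstract concerns):

```python
from itertools import combinations

def shatters(hypotheses, points):
    """True if the class realizes every +/- labeling of the given points."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return len(realized) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    """Largest d such that some d-point subset of domain is shattered."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
    return d
```

For example, one-sided thresholds on a line have VC dimension 1 (the labeling "+ then -" on an increasing pair is unrealizable), while intervals have VC dimension 2.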
Our goal is to develop a generic framework for converting modern
gradient-descent based optimization algorithms into mechanisms
where inputs come from self-interested agents.
We focus on aggregating preferences from n players in a
context...
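The primitive being converted is ordinary gradient descent; a minimal sketch of that base algorithm (the mechanism-design layer for self-interested agents is the contribution of the work and is not attempted here; names are illustrative):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Vanilla gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x
```

For instance, minimizing (x - 3)^2 via its gradient 2(x - 3) converges to x = 3; the framework's question is what happens when the gradients are reported by strategic agents rather than computed honestly.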
The online list labeling problem is a basic primitive in data
structures. The goal is to store a dynamically-changing set of n
items in an array of m slots, while keeping the elements in sorted
order. To do so, some items may need to be moved over...
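A toy version of the insertion step keeps the array sorted by shifting a minimal run of items into the nearest empty slot (illustrative only; the interesting algorithmic question, which this sketch ignores, is how to place gaps so the amortized number of moves stays small):

```python
def insert(slots, x):
    """Insert x into slots (sorted items, None = empty slot), shifting the
    run of items between the insertion point and the nearest gap.
    Assumes at least one empty slot; returns the number of items moved."""
    occ = [i for i, v in enumerate(slots) if v is not None]
    # Insertion point: first occupied slot holding a value >= x.
    pos = next((i for i in occ if slots[i] >= x),
               (occ[-1] + 1) if occ else 0)
    # Prefer the nearest gap to the right; otherwise use one on the left.
    right = next((g for g in range(pos, len(slots)) if slots[g] is None),
                 None)
    if right is not None:
        slots[pos + 1:right + 1] = slots[pos:right]   # shift run right
        slots[pos] = x
        return right - pos
    left = next(g for g in range(pos - 1, -1, -1) if slots[g] is None)
    slots[left:pos - 1] = slots[left + 1:pos]          # shift run left
    slots[pos - 1] = x
    return pos - 1 - left
```

When insertions cluster, a region of the array fills up and single insertions force long shifts; real list-labeling algorithms periodically redistribute items to bound this cost.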
The classic algorithm AdaBoost allows one to convert a weak learner,
that is, an algorithm that produces a hypothesis which is slightly
better than chance, into a strong learner, achieving arbitrarily
high accuracy when given enough training data. We...
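The loop AdaBoost runs is short: train a weak hypothesis on reweighted data, weight it by 0.5*ln((1 - err)/err), and upweight the examples it misclassified. A self-contained sketch using exhaustive decision stumps as the weak learner (names are illustrative, not code from the paper):

```python
import math

def stump_predict(stump, x):
    feat, thresh, sign = stump
    return sign if x[feat] > thresh else -sign

def best_stump(X, y, w):
    """Weak learner: the threshold stump with the smallest weighted error."""
    best, best_err = None, float('inf')
    for feat in range(len(X[0])):
        for thresh in sorted({row[feat] for row in X}):
            for sign in (+1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict((feat, thresh, sign), xi) != yi)
                if err < best_err:
                    best, best_err = (feat, thresh, sign), err
    return best, best_err

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        stump, err = best_stump(X, y, w)
        err = max(err, 1e-12)                 # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for xi, yi, wi in zip(X, y, w)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(alpha * stump_predict(stump, x) for alpha, stump in ensemble)
    return 1 if score >= 0 else -1
```

Even on a target no single stump can express, such as an interval on the line, a few boosting rounds drive the training error to zero, which is the "weak to strong" conversion the abstract describes.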