Computer Science/Discrete Mathematics Seminar II

Fast learning requires good memory: a time-space lower bound for parity learning

We prove that any algorithm for learning parities requires either a memory of quadratic size or an exponential number of samples. This proves a recent conjecture of Steinhardt, Valiant and Wager, and shows that for some learning problems a large storage space is crucial.

More formally, in the problem of parity learning, an unknown string x ∈ {0,1}^n is chosen uniformly at random. A learner tries to learn x from a stream of samples (a_1, b_1), (a_2, b_2), ..., where each a_t is uniformly distributed over {0,1}^n and b_t is the inner product of a_t and x, modulo 2. We show that any algorithm for parity learning that uses less than n^2/25 bits of memory requires an exponential number of samples. Previously, there was no non-trivial lower bound on the number of samples needed for any learning problem, even if the allowed memory size is O(n) (where n is the space needed to store one sample).

We also give an application of our result in the field of bounded-storage cryptography. We show an encryption scheme that requires a private key of length n, as well as time complexity of n per encryption/decryption of each bit, and is provably and unconditionally secure as long as the attacker uses less than n^2/25 memory bits and the scheme is used at most an exponential number of times. Previous works on bounded-storage cryptography assumed that the memory size used by the attacker is at most linear in the time needed for encryption/decryption.
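To make the setup concrete, here is a minimal sketch (not from the talk; all names are illustrative) of the parity-learning stream, together with the trivial learner that the lower bound rules out in small memory: it stores about n samples of n+1 bits each, roughly n^2 bits in total, and recovers x by Gaussian elimination over GF(2).

```python
import random

def parity_sample_stream(x, rng):
    """Yield samples (a_t, b_t) where b_t = <a_t, x> mod 2."""
    n = len(x)
    while True:
        a = [rng.randrange(2) for _ in range(n)]
        b = sum(ai * xi for ai, xi in zip(a, x)) % 2
        yield a, b

def try_solve(m, n):
    """Reduced row echelon form over GF(2) on augmented rows a + [b].
    Returns x if the stored rows span GF(2)^n, else None."""
    row = 0
    for col in range(n):
        piv = next((r for r in range(row, len(m)) if m[r][col]), None)
        if piv is None:
            return None  # not full rank yet; need more samples
        m[row], m[piv] = m[piv], m[row]
        for r in range(len(m)):
            if r != row and m[r][col]:
                m[r] = [u ^ v for u, v in zip(m[r], m[row])]
        row += 1
    return [m[i][n] for i in range(n)]  # solution bits x_1..x_n

def learn_with_quadratic_memory(stream, n):
    """Store samples until they span GF(2)^n, then solve.
    ~n rows of n+1 bits each: roughly n^2 bits of memory."""
    rows = []
    for a, b in stream:
        rows.append(a + [b])
        if len(rows) >= n:
            x = try_solve([r[:] for r in rows], n)
            if x is not None:
                return x

rng = random.Random(0)
n = 8
secret = [rng.randrange(2) for _ in range(n)]
recovered = learn_with_quadratic_memory(parity_sample_stream(secret, rng), n)
print(recovered == secret)
```

With quadratic memory this succeeds after only about n samples; the theorem says that with substantially less memory (under n^2/25 bits), no strategy can do better than exponentially many samples.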

Date & Time

March 08, 2016 | 10:30am – 12:30pm

Location

S-101

Affiliation

Weizmann Institute of Science; Visiting Professor, School of Mathematics