Previous Special Year Seminar

Mar
10
2020

Theoretical Machine Learning Seminar

Your Brain on Energy-Based Models: Applying and Scaling EBMs to Problems of Interest to the Machine Learning Community Today
Will Grathwohl
12:00pm|Dilworth Room

In this talk, I will discuss two of my recent works on Energy-Based Models. In the first, I show how standard classification architectures can be reinterpreted as class-conditional energy-based models and trained using recently proposed...
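
The reinterpretation mentioned in the abstract can be sketched in a few lines. This is an illustrative sketch only (the logits below are made up, and the code is not from the talk): a classifier's logits f(x)[y] define a joint energy E(x, y) = -f(x)[y], a marginal energy E(x) = -log Σ_y exp(f(x)[y]), and the usual softmax classifier p(y|x) is recovered exactly.

```python
import numpy as np

def energies_from_logits(logits):
    """Reinterpret classifier logits f(x)[y] as energies.

    Joint energy:    E(x, y) = -f(x)[y]
    Marginal energy: E(x)    = -log sum_y exp(f(x)[y])
    The ordinary softmax p(y|x) falls out unchanged.
    """
    joint_energy = -logits
    marginal_energy = -np.log(np.sum(np.exp(logits), axis=-1))
    p_y_given_x = np.exp(logits) / np.sum(np.exp(logits), axis=-1, keepdims=True)
    return joint_energy, marginal_energy, p_y_given_x

logits = np.array([[2.0, 0.5, -1.0]])   # hypothetical logits for one input
E_xy, E_x, p = energies_from_logits(logits)
```

Note that E(x, y) - E(x) = -log p(y|x), which is exactly why training the marginal as a generative model can coexist with standard discriminative training.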

Mar
05
2020

Theoretical Machine Learning Seminar

Understanding Deep Neural Networks: From Generalization to Interpretability
Gitta Kutyniok
12:00pm|Dilworth Room

Deep neural networks have recently seen an impressive comeback with applications both in the public sector and the sciences. However, despite their outstanding success, a comprehensive theoretical foundation of deep neural networks is still missing...

Mar
03
2020

Theoretical Machine Learning Seminar

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|White-Levy

I’ll discuss the Noisy Quadratic Model (NQM), the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...
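
The one-dimensional version of the toy problem is easy to simulate and to compare against its closed-form dynamics. The sketch below (parameter values are assumed, not from the talk) runs SGD on f(θ) = ½hθ² with Gaussian gradient noise and checks the mean and stationary variance against their known formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

h, alpha, sigma = 2.0, 0.1, 0.5        # curvature, step size, gradient-noise std (assumed)
theta0, steps, runs = 5.0, 50, 2000

# SGD on f(theta) = 0.5 * h * theta**2 with noisy gradients g = h*theta + noise.
theta = np.full(runs, theta0)
for _ in range(steps):
    grad = h * theta + sigma * rng.standard_normal(runs)
    theta = theta - alpha * grad

# Closed-form dynamics of the mean and the stationary variance:
#   E[theta_t]  = (1 - alpha*h)**t * theta0
#   Var[theta]  -> alpha**2 * sigma**2 / (1 - (1 - alpha*h)**2)
predicted_mean = (1 - alpha * h) ** steps * theta0
predicted_var = alpha**2 * sigma**2 / (1 - (1 - alpha * h) ** 2)
empirical_mean, empirical_var = theta.mean(), theta.var()
```

The mean contracts geometrically while the noise keeps the iterates hovering at a step-size-dependent variance floor, which is the basic tension the NQM makes precise.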

Feb
27
2020

Theoretical Machine Learning Seminar

Preference Modeling with Context-Dependent Salient Features
12:00pm|Dilworth Room

This talk considers the preference modeling problem and addresses the fact that pairwise comparison data often reflects irrational choices, e.g., intransitivity. Our key observation is that two items compared in isolation from other items may be...
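
Intransitivity is easy to produce once comparisons depend on features rather than on a single utility. The toy example below (a majority-of-features rule with made-up feature values, not the authors' model) shows three items whose pairwise preferences form a cycle a > b > c > a.

```python
import numpy as np

# Three hypothetical items described by three features (values are made up).
items = {
    "a": np.array([3, 1, 2]),
    "b": np.array([2, 3, 1]),
    "c": np.array([1, 2, 3]),
}

def prefers(x, y):
    """True if x beats y on a majority of features.
    A deliberately simple comparison rule used only to
    demonstrate how feature-wise judgments break transitivity."""
    return np.sum(x > y) > np.sum(y > x)

# Each pairwise comparison looks locally reasonable, yet together they cycle.
cycle = (prefers(items["a"], items["b"]),
         prefers(items["b"], items["c"]),
         prefers(items["c"], items["a"]))
```

No single scalar score over the three items can reproduce all three comparisons, which is why models that condition on the comparison context are needed.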

Feb
25
2020

Theoretical Machine Learning Seminar

Learning from Multiple Biased Sources
Clayton Scott
12:00pm|Dilworth Room

When high-quality labeled training data are unavailable, an alternative is to learn from training sources that are biased in some way. This talk will cover my group’s recent work on three problems where a learner has access to multiple biased...

Feb
20
2020

Theoretical Machine Learning Seminar

Geometric Deep Learning for Functional Protein Design
Michael Bronstein
12:00pm|Dilworth Room

Protein-based drugs are becoming some of the most important drugs of the 21st century. The typical mechanism of action of these drugs is a strong protein-protein interaction (PPI) between surfaces with complementary geometry and chemistry. Over the...

Feb
13
2020

Theoretical Machine Learning Seminar

The Lottery Ticket Hypothesis: On Sparse, Trainable Neural Networks
Jonathan Frankle
12:00pm|Dilworth Room

We recently proposed the "Lottery Ticket Hypothesis," which conjectures that the dense neural networks we typically train have much smaller subnetworks capable of training in isolation to the same accuracy starting from the original initialization...
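
The core mechanical step behind the hypothesis, iterative magnitude pruning with rewinding, can be sketched briefly. This is an illustrative sketch with random matrices standing in for a trained network, not the authors' code: prune the smallest-magnitude weights after training, then restart the surviving subnetwork from the original initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude `fraction` of weights,
    returning a binary mask (the candidate 'winning ticket')."""
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    threshold = np.partition(flat, k)[k] if k > 0 else -np.inf
    return (np.abs(weights) >= threshold).astype(weights.dtype)

init_weights = rng.standard_normal((100, 100))                            # theta_0
trained_weights = init_weights + 0.1 * rng.standard_normal((100, 100))    # stand-in for training

mask = magnitude_prune(trained_weights, fraction=0.8)
# Rewind: the sparse subnetwork restarts from the ORIGINAL initialization,
# which is the step the hypothesis identifies as essential.
ticket = mask * init_weights
```

In the actual procedure this prune-and-rewind loop is repeated over several rounds of training rather than applied once.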

Feb
11
2020

Theoretical Machine Learning Seminar

Geometric Insights into the Convergence of Non-linear TD Learning
12:00pm|Dilworth Room

While there are convergence guarantees for temporal difference (TD) learning when using linear function approximators, the situation for nonlinear models is far less understood, and divergent examples are known. We take a first step towards...
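
For concreteness, here is what a single semi-gradient TD(0) step looks like with a nonlinear approximator. The value function below is an arbitrary illustrative choice, not one from the talk; the point is only that the update follows the gradient of V at the current state scaled by the TD error.

```python
import numpy as np

def td0_update(w, s, r, s_next, alpha=0.1, gamma=0.9):
    """One semi-gradient TD(0) step with a simple nonlinear
    approximator V_w(s) = tanh(w[0]*s + w[1]) (illustrative choice)."""
    def V(w, s):
        return np.tanh(w[0] * s + w[1])
    def grad_V(w, s):
        sech2 = 1 - np.tanh(w[0] * s + w[1]) ** 2
        return np.array([sech2 * s, sech2])
    delta = r + gamma * V(w, s_next) - V(w, s)   # TD error
    return w + alpha * delta * grad_V(w, s)      # bootstrapped, NOT a true gradient

w = np.array([0.5, 0.0])
w_new = td0_update(w, s=1.0, r=1.0, s_next=0.5)
```

Because the bootstrapped target r + γV(s') itself depends on w, this update is not gradient descent on any fixed objective, which is exactly what opens the door to the known divergence examples in the nonlinear case.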

Feb
06
2020

PCTS Seminar Series: Deep Learning for Physics

Topic #1: Understanding Machine Learning via Exactly Solvable Statistical Physics Models; Topic #2: Dynamics of Generalization in Overparameterized Neural Networks
Speaker #1: Lenka Zdeborova; Speaker #2: Andrew Saxe
11:45am|Jadwin Hall, PCTS Seminar Room 407, 4th Floor

Please Note: The seminars are not open to the general public, but only to active researchers. Register here for this event: https://docs.google.com/forms/d/e/1FAIpQLScJ-BUVgJod6NGrreI26pedg8wGEyP… Abstract for talk #1: The affinity between...

Feb
04
2020

Theoretical Machine Learning Seminar

Algorithm and Hardness for Kernel Matrices in Numerical Linear Algebra and Machine Learning
12:00pm|Dilworth Room

For a function K : R^d x R^d -> R and a set P = {x_1, ..., x_n} of n points in R^d, the K graph G_P of P is the complete graph on n nodes where the weight between nodes i and j is given by K(x_i, x_j). In this paper, we initiate the study of when...
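
The object defined in the abstract is straightforward to construct explicitly. The sketch below builds the weighted adjacency matrix of the K graph for a Gaussian kernel (one common choice of K; the points are random placeholders), which is the dense n x n matrix whose fast manipulation the talk is about.

```python
import numpy as np

def kernel_graph(P, K):
    """Weighted adjacency matrix of the K graph G_P: the complete
    graph on the n points of P with edge weight K(x_i, x_j)."""
    n = len(P)
    W = np.array([[K(P[i], P[j]) for j in range(n)] for i in range(n)])
    np.fill_diagonal(W, 0.0)   # no self-loops in the complete graph
    return W

gaussian = lambda x, y: np.exp(-np.sum((x - y) ** 2))   # one common choice of K
P = np.random.default_rng(0).standard_normal((5, 3))    # n = 5 points in R^3
W = kernel_graph(P, gaussian)
```

The naive construction above costs O(n^2 d) time and O(n^2) space, which is precisely why the algorithmic question of when G_P admits faster implicit algorithms is interesting.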