Seminars Sorted by Series

Theoretical Machine Learning Seminar

Oct 01, 2018

Structured Learning with Parsimony in Measurements and Computations: Theory, Algorithms, and Applications
Xingguo Li
12:15pm|White Levy Room

In modern “Big Data” applications, structured learning is the most widely employed methodology. Within this paradigm, the fundamental challenge lies in developing practical, effective algorithmic inference methods. Often (e.g., deep learning)...

Oct 15, 2018

On the Dynamics of Gradient Descent for Training Deep Neural Networks
Wei Hu
12:15pm|White Levy Room

Deep learning builds upon the mysterious ability of gradient-based methods to solve related non-convex optimization problems. However, a complete theoretical understanding is missing even in the simpler setting of training a deep linear neural...

Oct 22, 2018

Learning in Non-convex Games with an Optimization Oracle
Alon Gonen
12:15pm|Princeton University, CS 302

We consider adversarial online learning in a non-convex setting under the assumption that the learner has access to an offline optimization oracle. In the most general unstructured setting of prediction with expert advice, Hazan and Koren (2016)...

Nov 05, 2018

Scalable natural gradient training of neural networks
12:15pm|Princeton University, CS 302

Natural gradient descent holds the potential to speed up training of neural networks by correcting for the problem geometry and achieving desirable invariance properties. I’ll present Kronecker-Factored Approximate Curvature (K-FAC), a scalable...
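
The preconditioning idea behind natural gradient can be sketched on a toy linear-Gaussian model (my illustration, not K-FAC itself, which approximates the Fisher by Kronecker factors to scale to neural networks). With a well-conditioned Fisher, one natural-gradient step undoes bad feature scaling that would cripple plain gradient descent:

```python
import numpy as np

# Sketch of a plain natural-gradient step, theta <- theta - lr * F^{-1} grad.
# For a linear-Gaussian model y ~ N(X theta, I), the Fisher is F = X^T X / n
# and the loss gradient is X^T (X theta - y) / n.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) * np.array([1.0, 10.0, 0.1])  # badly scaled features
theta_true = np.array([1.0, -2.0, 3.0])
y = X @ theta_true

def natural_gradient_step(theta, lr=1.0, damping=1e-8):
    n = X.shape[0]
    grad = X.T @ (X @ theta - y) / n
    F = X.T @ X / n + damping * np.eye(3)       # (damped) Fisher information
    return theta - lr * np.linalg.solve(F, grad)

theta = np.zeros(3)
for _ in range(5):
    theta = natural_gradient_step(theta)
print(np.allclose(theta, theta_true, atol=1e-4))  # True: scaling is corrected
```

The invariance property mentioned in the abstract shows up here: the solution is recovered regardless of how the columns of X are scaled, whereas vanilla gradient descent would need a very different step size per coordinate.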

Nov 12, 2018

Generalized Framework for Nonlinear Acceleration
Damien Scieur
12:15pm|White Levy Room

Nonlinear acceleration algorithms, such as BFGS, are widely used in optimization due to their impressive performance even on large-scale problems. However, these methods present a non-negligible number of drawbacks, such as a strong lack of...

Nov 19, 2018

Prediction with a Short Memory
Sham Kakade
12:15pm|White Levy Room

We consider the problem of predicting the next observation given a sequence of past observations, and consider the extent to which accurate prediction requires complex algorithms that explicitly leverage long-range dependencies. Perhaps surprisingly...

Nov 26, 2018

A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors
Nikunj Saunshi
12:15pm|Princeton University, CS 302

Motivations like domain adaptation, transfer learning, and feature learning have fueled interest in inducing embeddings for rare or unseen words, n-grams, synsets, and other textual features. This paper introduces à la carte embedding, a simple and...

Dec 10, 2018

On Expressiveness and Optimization in Deep Learning
12:15pm|White Levy Room

Understanding deep learning calls for addressing three fundamental questions: expressiveness, optimization and generalization. Expressiveness refers to the ability of compactly sized deep neural networks to represent functions capable of solving...

Feb 11, 2019

Online Control with Adversarial Disturbances
Naman Agarwal
12:15pm|White Levy Room

We study the control of a linear dynamical system with adversarial disturbances (as opposed to statistical noise). The objective we consider is one of regret: we desire an online control procedure that can do nearly as well as a procedure...

Feb 13, 2019

Rahul Kidambi
12:15pm|Princeton University, CS 302

The current era of large scale machine learning powered by Deep Learning methods has brought about tremendous advances, driven by the lightweight Stochastic Gradient Descent (SGD) method. Despite relying on a simple algorithmic primitive, this era...

Feb 18, 2019

Curiosity, Intrinsic Motivation, and Provably Efficient Maximum Entropy Exploration
Karan Singh
12:15pm|Princeton University, CS 302

Suppose an agent is in an unknown Markov environment in the absence of a reward signal: what might we hope that the agent can efficiently learn to do? One natural, intrinsically defined objective is for the agent to learn a policy which...

Mar 04, 2019

FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
Will Grathwohl
12:15pm|Princeton University, CS 302

A promising class of generative models maps points from a simple distribution to a complex distribution through an invertible neural network. Likelihood-based training of these models requires restricting their architectures to allow cheap...

Mar 06, 2019

Exponentiated Gradient Meets Gradient Descent
Will Grathwohl
1:30pm|Princeton University, CS 302

(Stochastic) gradient descent and the multiplicative update method are probably the most popular algorithms in machine learning. We introduce and study a new regularization which provides a unification of the additive and multiplicative updates...
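
The two classic updates being unified can be sketched side by side (a minimal illustration of the standard rules; the talk's new regularization is not reproduced here):

```python
import numpy as np

# Additive update (gradient descent):         w <- w - lr * g
# Multiplicative update (exponentiated
# gradient, on the probability simplex):      w_i <- w_i * exp(-lr * g_i) / Z

def gd_step(w, g, lr=0.1):
    return w - lr * g

def eg_step(w, g, lr=0.1):
    w = w * np.exp(-lr * g)     # multiplicative weight update
    return w / w.sum()          # renormalize onto the simplex

w = np.ones(3) / 3              # uniform starting point
g = np.array([1.0, 0.0, -1.0])  # a gradient favoring the third coordinate
w = eg_step(w, g)
print(w.sum())                  # stays exactly on the simplex
```

The multiplicative update keeps iterates strictly positive and normalized, which is why it is the natural choice on the simplex, while the additive update is unconstrained.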

Mar 11, 2019

A Theoretical Analysis of Contrastive Unsupervised Representation Learning
Orestis Plevrakis
1:15pm|White Levy Room

Recent empirical works have successfully used unlabeled data to learn feature representations that are broadly useful in downstream classification tasks. Several of these methods are reminiscent of the well-known word2vec embedding algorithm...

Apr 08, 2019

Fast minimization of structured convex quartics
Brian Bullins
12:15pm|White Levy Room

Recent progress in optimization theory has shown how we may harness second-order, i.e. Hessian, information, for achieving faster rates for both convex and non-convex optimization problems. Given this observation, it is then natural to ask what sort...

Oct 02, 2019

Rethinking Control
Elad Hazan
12:00pm|Dilworth Room

Linear dynamical systems are a continuous subclass of reinforcement learning models that are widely used in robotics, finance, engineering, and meteorology. Classical control, since the works of Kalman, has focused on dynamics with Gaussian i.i.d...

Oct 08, 2019

Unsupervised Ensemble Learning
12:00pm|White-Levy

In various applications, one is given the advice or predictions of several classifiers of unknown reliability, over multiple questions or queries. This scenario is different from standard supervised learning where classifier accuracy can be assessed...

Oct 09, 2019

Designing Fast and Robust Learning Algorithms
12:00pm|Dilworth Room

Most people interact with machine learning systems on a daily basis. Such interactions often happen in strategic environments where people have incentives to manipulate the learning algorithms. As machine learning plays a more prominent role in our...

Oct 23, 2019

Optimization Landscape and Two-Layer Neural Networks
12:00pm|Dilworth Room

Modern machine learning often optimizes a nonconvex objective using simple algorithms such as gradient descent. One way of explaining the success of such simple algorithms is to analyze the optimization landscape and show that all local minima are...

Nov 12, 2019

Fast IRLS Algorithms for p-norm regression
12:00pm|White-Levy

Linear regression in L_p-norm is a canonical optimization problem that arises in several applications, including sparse recovery, semi-supervised learning, and signal processing. Standard linear regression corresponds to p=2, and p=1 or infinity is...
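
The basic IRLS idea named in the title can be sketched as follows (a textbook version, under my own toy setup; the talk's fast variants differ): repeatedly solve a weighted least-squares problem with weights derived from the current residuals.

```python
import numpy as np

# Classic IRLS for min_x ||Ax - b||_p with 1 < p < 2: reweight each row by
# |r_i|^{p-2} and re-solve the weighted least-squares normal equations.

def irls_pnorm(A, b, p, iters=50, eps=1e-8):
    x = np.linalg.lstsq(A, b, rcond=None)[0]        # p = 2 warm start
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)   # IRLS weights |r_i|^{p-2}
        AW = A * w[:, None]                         # diag(w) applied to rows of A
        x = np.linalg.solve(A.T @ AW, AW.T @ b)     # A^T W A x = A^T W b
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
b = rng.normal(size=20)
obj = lambda x: np.sum(np.abs(A @ x - b) ** 1.5)
x2 = np.linalg.lstsq(A, b, rcond=None)[0]           # p = 2 solution
x15 = irls_pnorm(A, b, p=1.5)
print(obj(x15) <= obj(x2) + 1e-9)                   # True: p-norm objective decreases
```

Each reweighted solve majorizes the p-norm objective, so the iteration is monotonically non-increasing for p in (1, 2); the clipping by `eps` guards against zero residuals blowing up the weights.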

Nov 13, 2019

Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity
12:00pm|Dilworth Room

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality and sparsity. The first one attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the...

Nov 20, 2019

Nonconvex Minimax Optimization
12:00pm|Dilworth Room

Minimax optimization, especially in its general nonconvex formulation, has found extensive applications in modern machine learning, in settings such as generative adversarial networks (GANs) and adversarial training. It brings a series of unique...

Nov 26, 2019

A Fourier Analysis Perspective of Training Dynamics of Deep Neural Networks
11:30am|White-Levy

This talk focuses on a general "Frequency Principle" phenomenon: DNNs often fit target functions from low to high frequencies during training. I will present empirical evidence on real datasets and deep networks in different settings as...

Dec 04, 2019

Uncoupled isotonic regression
12:00pm|Dilworth Room

The classical regression problem seeks to estimate a function f on the basis of independent pairs $(x_i,y_i)$ where $\mathbb E[y_i]=f(x_i)$, $i=1,\dotsc,n$. In this talk, we consider statistical and computational aspects of the "uncoupled" version...
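
The classical (coupled) isotonic regression that this abstract starts from can be solved exactly by the pool-adjacent-violators algorithm; the sketch below is my own minimal version of that standard algorithm, not the talk's harder "uncoupled" setting in which the pairing of the $x_i$ and $y_i$ is lost.

```python
# Pool-adjacent-violators (PAVA): the nondecreasing fit f minimizing
# sum_i (f_i - y_i)^2, computed by merging adjacent blocks whose means
# violate monotonicity and replacing them with their pooled mean.

def pava(y):
    out = []                                  # stack of [block mean, block size]
    for v in y:
        out.append([v, 1])
        # merge while the monotonicity constraint is violated
        while len(out) > 1 and out[-2][0] > out[-1][0]:
            m2, s2 = out.pop()
            m1, s1 = out.pop()
            out.append([(m1 * s1 + m2 * s2) / (s1 + s2), s1 + s2])
    return [m for m, s in out for _ in range(s)]

print(pava([1, 3, 2, 4]))  # [1, 2.5, 2.5, 4]: the 3 and 2 are pooled
```

With the pairing observed, this runs in linear time; the uncoupled version considered in the talk must instead match the empirical distributions of the two samples, which changes both the statistics and the computation.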

Dec 17, 2019

How will we do mathematics in 2030?
Michael R. Douglas
12:00pm|White-Levy

We make the case that over the coming decade, computer assisted reasoning will become far more widely used in the mathematical sciences. This includes interactive and automatic theorem verification, symbolic algebra, and emerging technologies such...

Dec 18, 2019

Online Learning in Reactive Environments
12:00pm|Dilworth Room

Online learning is a popular framework for sequential prediction problems. The standard approach to analyzing an algorithm's (learner's) performance in online learning is in terms of its empirical regret defined to be the excess loss suffered by the...

Jan 16, 2020

Foundations of Intelligent Systems with (Deep) Function Approximators
Simon Du
12:00pm|Dilworth Room

Function approximators, like deep neural networks, play a crucial role in building machine-learning based intelligent systems. This talk covers three core problems of function approximators: understanding function approximators, designing new...

Jan 21, 2020

The Blessings of Multiple Causes
David M. Blei
12:00pm|Dilworth Room

Causal inference from observational data is a vital problem, but it comes with strong assumptions. Most methods require that we observe all confounders, variables that affect both the causal variables and the outcome variables. But whether we have...

Jan 28, 2020

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|Dilworth Room

I’ll discuss the Noisy Quadratic Model, the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...

Feb 04, 2020

Algorithm and Hardness for Kernel Matrices in Numerical Linear Algebra and Machine Learning
12:00pm|Dilworth Room

For a function K : R^d x R^d -> R, and a set P = {x_1, ..., x_n} of points in R^d, the K graph G_P of P is the complete graph on n nodes where the weight between nodes i and j is given by K(x_i, x_j). In this paper, we initiate the study of when...
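
The definition above translates directly into code; the sketch below builds the weight matrix of G_P for a Gaussian kernel (the choice of K here is mine, for illustration only).

```python
import numpy as np

# The K graph of P = {x_1, ..., x_n}: a complete graph whose edge (i, j)
# carries weight K(x_i, x_j). Materialized here as an n x n weight matrix.

def k_graph(P, K):
    n = len(P)
    return np.array([[K(P[i], P[j]) for j in range(n)] for i in range(n)])

gauss = lambda x, y: np.exp(-np.sum((x - y) ** 2))   # example kernel
P = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
W = k_graph(P, gauss)
print(W.shape, W[0, 1])   # (3, 3), weight exp(-1) between x_1 and x_2
```

Materializing W costs O(n^2) kernel evaluations, which is exactly why the question of when linear-algebraic operations on such matrices admit subquadratic algorithms is interesting.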

Feb 11, 2020

Geometric Insights into the Convergence of Non-linear TD Learning
12:00pm|Dilworth Room

While there are convergence guarantees for temporal difference (TD) learning when using linear function approximators, the situation for nonlinear models is far less understood, and divergent examples are known. We take a first step towards...
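
The well-understood linear case mentioned above can be sketched in a few lines (the chain, rewards, and step size below are my own toy example, not the talk's): TD(0) with linear features converges to the true value function under on-policy sampling.

```python
import numpy as np

# Linear TD(0) on a 2-state chain with uniform random transitions and
# reward 1 in state 0. Update: w <- w + lr * delta * phi(s), where
# delta = r + gamma * phi(s') @ w - phi(s) @ w is the TD error.

rng = np.random.default_rng(0)
phi = np.eye(2)                  # tabular features, one indicator per state
gamma, lr = 0.9, 0.05
w = np.zeros(2)

s = 0
for _ in range(20000):
    s_next = int(rng.integers(2))          # uniform transition
    r = 1.0 if s == 0 else 0.0
    delta = r + gamma * (phi[s_next] @ w) - phi[s] @ w
    w += lr * delta * phi[s]
    s = s_next

print(w)   # approaches the true values V = (5.5, 4.5)
```

For this chain the fixed point solves V(s) = r(s) + 0.9 * (V(0) + V(1)) / 2, giving V(0) = 5.5 and V(1) = 4.5; with nonlinear function approximation no such clean fixed-point argument is available, which is the gap the talk addresses.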

Feb 13, 2020

The Lottery Ticket Hypothesis: On Sparse, Trainable Neural Networks
Jonathan Frankle
12:00pm|Dilworth Room

We recently proposed the "Lottery Ticket Hypothesis," which conjectures that the dense neural networks we typically train have much smaller subnetworks capable of training in isolation to the same accuracy starting from the original initialization...
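
The pruning primitive underlying these experiments can be sketched as global magnitude pruning (a hedged simplification: the full lottery-ticket procedure also retrains and rewinds weights to their original initialization, which is not shown here).

```python
import numpy as np

# Global magnitude pruning: keep the largest-magnitude fraction of weights,
# zero out the rest, and return the binary mask defining the subnetwork.

def magnitude_prune(W, keep_frac):
    thresh = np.quantile(np.abs(W), 1.0 - keep_frac)
    mask = (np.abs(W) >= thresh).astype(W.dtype)
    return W * mask, mask

rng = np.random.default_rng(0)
W = rng.normal(size=(100, 100))          # a stand-in weight matrix
Wp, mask = magnitude_prune(W, keep_frac=0.2)
print(mask.mean())                       # ~0.2 of the weights survive
```

In the lottery-ticket experiments, the mask found this way is applied to the *original initialization*, and the claim is that the resulting sparse subnetwork trains to comparable accuracy in isolation.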

Feb 20, 2020

Geometric deep learning for functional protein design
Michael Bronstein
12:00pm|Dilworth Room

Protein-based drugs are becoming some of the most important drugs of the 21st century. The typical mechanism of action of these drugs is a strong protein-protein interaction (PPI) between surfaces with complementary geometry and chemistry. Over the...

Feb 25, 2020

Learning from Multiple Biased Sources
Clayton Scott
12:00pm|Dilworth Room

When high-quality labeled training data are unavailable, an alternative is to learn from training sources that are biased in some way. This talk will cover my group’s recent work on three problems where a learner has access to multiple biased...