Seminars Sorted by Series

Theoretical Machine Learning Seminar

Mar 11, 2019

A Theoretical Analysis of Contrastive Unsupervised Representation Learning
Orestis Plevrakis
1:15pm|White-Levy Room

Recent empirical works have successfully used unlabeled data to learn feature representations that are broadly useful in downstream classification tasks. Several of these methods are reminiscent of the well-known word2vec embedding algorithm...
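
A minimal sketch of the kind of contrastive objective this theory analyzes, assuming learned representations and a logistic loss that pulls an anchor toward a "similar" positive sample and away from a random negative (illustrative only, not the authors' exact formulation):

```python
import numpy as np

def contrastive_loss(f_anchor, f_pos, f_neg):
    """Logistic contrastive loss for one (anchor, positive, negative)
    triple of representation vectors: small when the anchor is more
    aligned with the positive than with the negative."""
    return np.log1p(np.exp(np.dot(f_anchor, f_neg) - np.dot(f_anchor, f_pos)))
```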

Apr 8, 2019

Fast minimization of structured convex quartics
Brian Bullins
12:15pm|White-Levy Room

Recent progress in optimization theory has shown how we may harness second-order (i.e., Hessian) information to achieve faster rates for both convex and non-convex optimization problems. Given this observation, it is then natural to ask what sort...

Oct 2, 2019

Rethinking Control
Elad Hazan
12:00pm|Dilworth Room

Linear dynamical systems are a continuous subclass of reinforcement learning models that are widely used in robotics, finance, engineering, and meteorology. Classical control, since the works of Kalman, has focused on dynamics with Gaussian i.i.d...
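
For reference, the classical setting is the linear dynamical system $x_{t+1} = A x_t + B u_t + w_t$, where $u_t$ is the control input and the perturbations $w_t$ are i.i.d. Gaussian (a standard formulation; the talk's notation may differ).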

Oct 8, 2019

Unsupervised Ensemble Learning
12:00pm|White-Levy

In various applications, one is given the advice or predictions of several classifiers of unknown reliability, over multiple questions or queries. This scenario is different from standard supervised learning where classifier accuracy can be assessed...

Oct 9, 2019

Designing Fast and Robust Learning Algorithms
12:00pm|Dilworth Room

Most people interact with machine learning systems on a daily basis. Such interactions often happen in strategic environments where people have incentives to manipulate the learning algorithms. As machine learning plays a more prominent role in our...

Oct 23, 2019

Optimization Landscape and Two-Layer Neural Networks
12:00pm|Dilworth Room

Modern machine learning often optimizes a nonconvex objective using simple algorithms such as gradient descent. One way of explaining the success of such simple algorithms is to analyze the optimization landscape and show that all local minima are...

Nov 12, 2019

Fast IRLS Algorithms for p-norm regression
12:00pm|White-Levy

Linear regression in L_p-norm is a canonical optimization problem that arises in several applications, including sparse recovery, semi-supervised learning, and signal processing. Standard linear regression corresponds to p=2, and p=1 or infinity is...
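
A minimal sketch of the classical IRLS template that this line of work accelerates, assuming $p \ge 2$ (the undamped textbook iteration, not the fast algorithm of the talk; plain IRLS may need damping or line search to converge):

```python
import numpy as np

def irls_pnorm(A, b, p, iters=50, eps=1e-8):
    """Iteratively reweighted least squares for min_x ||Ax - b||_p, p >= 2:
    each step solves a weighted least-squares problem whose weights
    |r_i|^(p-2) come from the current residuals r = Ax - b."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # p = 2 warm start
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)     # per-row weights
        Aw = A * w[:, None]                           # diag(w) @ A
        x = np.linalg.solve(A.T @ Aw, A.T @ (w * b))  # weighted normal equations
    return x
```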

Nov 13, 2019

Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity
12:00pm|Dilworth Room

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality and sparsity. The first one attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the...

Nov 20, 2019

Nonconvex Minimax Optimization
12:00pm|Dilworth Room

Minimax optimization, especially in its general nonconvex formulation, has found extensive applications in modern machine learning, in settings such as generative adversarial networks (GANs) and adversarial training. It brings a series of unique...

Nov 26, 2019

A Fourier Analysis Perspective of Training Dynamics of Deep Neural Networks
11:30am|White-Levy

This talk focuses on the general phenomenon of the "Frequency Principle": DNNs often fit target functions from low to high frequencies during training. I will present empirical evidence on real datasets and deep networks in different settings as...

Dec 4, 2019

Uncoupled isotonic regression
12:00pm|Dilworth Room

The classical regression problem seeks to estimate a function f on the basis of independent pairs $(x_i,y_i)$ where $\mathbb E[y_i]=f(x_i)$, $i=1,\dotsc,n$. In this talk, we consider statistical and computational aspects of the "uncoupled" version...
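
In the uncoupled variant, one observes the sets $\{x_i\}_{i=1}^n$ and $\{y_j\}_{j=1}^n$ separately, without knowing which $y$ goes with which $x$, and must still estimate the monotone regression function $f$ (a standard formulation of this setting; details may differ from the talk).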

Dec 17, 2019

How will we do mathematics in 2030?
Michael R. Douglas
12:00pm|White-Levy

We make the case that over the coming decade, computer assisted reasoning will become far more widely used in the mathematical sciences. This includes interactive and automatic theorem verification, symbolic algebra, and emerging technologies such...

Dec 18, 2019

Online Learning in Reactive Environments
12:00pm|Dilworth Room

Online learning is a popular framework for sequential prediction problems. The standard approach to analyzing an algorithm's (learner's) performance in online learning is in terms of its empirical regret, defined as the excess loss suffered by the...
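
Concretely, the empirical regret after $T$ rounds is $R_T = \sum_{t=1}^{T} \ell_t(a_t) - \min_{a} \sum_{t=1}^{T} \ell_t(a)$: the learner's cumulative loss minus that of the best fixed action in hindsight (the standard definition this talk takes as its starting point).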

Jan 16, 2020

Foundations of Intelligent Systems with (Deep) Function Approximators
Simon Du
12:00pm|Dilworth Room

Function approximators, like deep neural networks, play a crucial role in building machine-learning-based intelligent systems. This talk covers three core problems of function approximators: understanding function approximators, designing new...

Jan 21, 2020

The Blessings of Multiple Causes
David M. Blei
12:00pm|Dilworth Room

Causal inference from observational data is a vital problem, but it comes with strong assumptions. Most methods require that we observe all confounders, variables that affect both the causal variables and the outcome variables. But whether we have...

Jan 28, 2020

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|Dilworth Room

I’ll discuss the Noisy Quadratic Model, the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...
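
A minimal closed-form sketch of the NQM dynamics under SGD, assuming a diagonal quadratic $f(x) = \tfrac{1}{2}\sum_i h_i x_i^2$ and additive gradient noise with per-coordinate variance $\sigma_i^2$ (the exact noise model in the talk may differ):

```python
import numpy as np

def nqm_expected_loss(h, sigma2, lr, steps, x0):
    """Expected loss of SGD on the noisy quadratic model: each step,
    E[x_i^2] contracts by (1 - lr*h_i)^2 and gains lr^2 * sigma2_i of
    injected noise, so the trajectory has a closed form (no sampling)."""
    ex2 = x0 ** 2
    for _ in range(steps):
        ex2 = (1 - lr * h) ** 2 * ex2 + lr ** 2 * sigma2
    return 0.5 * np.sum(h * ex2)  # E[f(x_t)]
```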

Feb 4, 2020

Algorithm and Hardness for Kernel Matrices in Numerical Linear Algebra and Machine Learning
12:00pm|Dilworth Room

For a function K : R^d x R^d -> R and a set P = {x_1, ..., x_n} of points in d dimensions, the K graph G_P of P is the complete graph on n nodes where the weight between nodes i and j is given by K(x_i, x_j). In this paper, we initiate the study of when...
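
A direct transcription of this definition, with a Gaussian kernel as an example (a sketch; the interesting regimes in the talk involve far larger $n$ than one would materialize explicitly):

```python
import numpy as np

def kernel_graph_weights(P, K):
    """Weight matrix of the K graph G_P: entry (i, j) is K(x_i, x_j)
    for the points x_1, ..., x_n in P."""
    n = len(P)
    G = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            G[i, j] = K(P[i], P[j])
    return G

# Example kernel: K(x, y) = exp(-||x - y||^2)
gaussian = lambda x, y: np.exp(-np.sum((x - y) ** 2))
```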

Feb 11, 2020

Geometric Insights into the convergence of Non-linear TD Learning
12:00pm|Dilworth Room

While there are convergence guarantees for temporal difference (TD) learning when using linear function approximators, the situation for nonlinear models is far less understood, and divergent examples are known. We take a first step towards...
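
For reference, the update in question is nonlinear TD(0): $\theta \leftarrow \theta + \alpha\,\big(r + \gamma V_\theta(s') - V_\theta(s)\big)\,\nabla_\theta V_\theta(s)$, a semi-gradient step rather than the gradient of any fixed objective, hence the possibility of divergence.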

Feb 13, 2020

The Lottery Ticket Hypothesis: On Sparse, Trainable Neural Networks
Jonathan Frankle
12:00pm|Dilworth Room

We recently proposed the "Lottery Ticket Hypothesis," which conjectures that the dense neural networks we typically train have much smaller subnetworks capable of training in isolation to the same accuracy starting from the original initialization...

Feb 20, 2020

Geometric deep learning for functional protein design
Michael Bronstein
12:00pm|Dilworth Room

Protein-based drugs are becoming some of the most important drugs of the 21st century. The typical mechanism of action of these drugs is a strong protein-protein interaction (PPI) between surfaces with complementary geometry and chemistry. Over the...

Feb 25, 2020

Learning from Multiple Biased Sources
Clayton Scott
12:00pm|Dilworth Room

When high-quality labeled training data are unavailable, an alternative is to learn from training sources that are biased in some way. This talk will cover my group’s recent work on three problems where a learner has access to multiple biased...

Feb 27, 2020

Preference Modeling with Context-Dependent Salient Features
12:00pm|Dilworth Room

This talk considers the preference modeling problem and addresses the fact that pairwise comparison data often reflects irrational choice, e.g. intransitivity. Our key observation is that two items compared in isolation from other items may be...

Mar 3, 2020

What Noisy Convex Quadratics Tell Us about Neural Net Training
12:00pm|White-Levy

I’ll discuss the Noisy Quadratic Model, the toy problem of minimizing a convex quadratic function with noisy gradient observations. While the NQM is simple enough to have closed-form dynamics for a variety of optimizers, it gives a surprising amount...

Mar 5, 2020

Understanding Deep Neural Networks: From Generalization to Interpretability
Gitta Kutyniok
12:00pm|Dilworth Room

Deep neural networks have recently seen an impressive comeback with applications both in the public sector and the sciences. However, despite their outstanding success, a comprehensive theoretical foundation of deep neural networks is still missing...

Mar 10, 2020

Your Brain on Energy-Based Models: Applying and Scaling EBMs to Problems of Interest to the Machine Learning Community Today
Will Grathwohl
12:00pm|Dilworth Room

In this talk, I will discuss my two recent works on Energy-Based Models. In the first work, I discuss how we can reinterpret standard classification architectures as class conditional energy-based models and train them using recently proposed...
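
The reinterpretation in the first work, in standard form: a classifier with logits $f_\theta(x)[y]$ implicitly defines a joint energy-based model $p_\theta(x, y) \propto \exp(f_\theta(x)[y])$, so that $p_\theta(y \mid x)$ recovers the usual softmax classifier while $p_\theta(x) \propto \sum_y \exp(f_\theta(x)[y])$ gives an unnormalized density over inputs (details may differ from the talk).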

Mar 11, 2020

Improved Bounds on Minimax Regret under Logarithmic Loss via Self-Concordance
Blair Bilodeau
4:00pm|Simonyi 101

We study sequential probabilistic prediction on data sequences which are not i.i.d., and even potentially generated by an adversary. At each round, the player assigns a probability distribution to possible outcomes and incurs the log-likelihood of...
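
Here the loss at round $t$ is the logarithmic loss $\ell(p_t, y_t) = -\log p_t(y_t)$, and the regret is the player's cumulative log loss minus $\inf_{f \in \mathcal{F}} \sum_{t=1}^{T} -\log f_t(y_t)$, the best cumulative log loss within a reference class $\mathcal{F}$ of prediction strategies (standard definitions; the talk's precise comparator class may differ).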

Mar 26, 2020

Margins, perceptrons, and deep networks
Matus Telgarsky
12:00pm|https://illinois.zoom.us/j/741628827

This talk surveys the role of margins in the analysis of deep networks. As a concrete highlight, it sketches a perceptron-based analysis establishing that shallow ReLU networks can achieve small test error even when they are quite narrow, sometimes...

Mar 31, 2020

Some Recent Insights on Transfer Learning
12:00pm|https://theias.zoom.us/j/384099138

A common situation in Machine Learning is one where training data is not fully representative of a target population due to bias in the sampling mechanism or high costs in sampling the target population; in such situations, we aim to 'transfer'...

Apr 2, 2020

Learning Controllable Representations
12:00pm|https://theias.zoom.us/j/384099138

As deep learning systems become more prevalent in real-world applications, it is essential to allow users to exert more control over the system. Imposing some structure on the learned representations enables users to manipulate, interpret, and even...

Apr 7, 2020

Interpolation in learning: steps towards understanding when overparameterization is harmless, when it helps, and when it causes harm
Anant Sahai
12:00pm|https://theias.zoom.us/j/384099138

A continuing mystery in understanding the empirical success of deep neural networks has been in their ability to achieve zero training error and yet generalize well, even when the training data is noisy and there are many more parameters than data...

Apr 9, 2020

Meta-Learning: Why It’s Hard and What We Can Do
3:00pm|https://theias.zoom.us/j/384099138

Meta-learning (or learning to learn) studies how to use machine learning to design machine learning methods themselves. We consider an optimization-based formulation of meta-learning that learns to design an optimization algorithm automatically...

Apr 21, 2020

Assumption-free prediction intervals for black-box regression algorithms
Aaditya Ramdas
12:00pm|https://theias.zoom.us/j/384099138

There has been tremendous progress in designing accurate black-box prediction methods (boosting, random forests, bagging, neural nets, etc.), but for deployment in the real world it is useful to quantify uncertainty beyond making point predictions...
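
A minimal sketch of split conformal prediction, one standard way to get such assumption-free intervals (assuming exchangeable data; `model` is any fitted black-box regressor with a scikit-learn-style `predict`, used here as a placeholder API):

```python
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, x_new, alpha=0.1):
    """Distribution-free prediction interval around a black-box point
    prediction: calibrate on held-out residuals, then widen the new
    prediction by their conformal quantile. Coverage is >= 1 - alpha
    under exchangeability, with no assumptions on the model."""
    scores = np.abs(y_cal - model.predict(X_cal))  # calibration residuals
    n = len(scores)
    level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q = np.quantile(scores, level)
    pred = model.predict(np.atleast_2d(x_new))[0]
    return pred - q, pred + q
```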

Apr 23, 2020

Deep Generative models and Inverse Problems
Alexandros Dimakis
3:00pm|https://theias.zoom.us/j/384099138

Modern deep generative models like GANs, VAEs and invertible flows are showing amazing results on modeling high-dimensional distributions, especially for images. We will show how they can be used to solve inverse problems by generalizing compressed...
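
The basic recipe for inverse problems with a generative prior, in one line: given measurements $y \approx A x$ and a trained generator $G$, recover $\hat{x} = G(\hat{z})$ where $\hat{z} \in \arg\min_z \| A\,G(z) - y \|_2^2$, i.e., search over the latent space instead of imposing a sparsity prior (a standard formulation of this approach; the talk's exact objective may differ).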

Apr 30, 2020

Latent Stochastic Differential Equations for Irregularly-Sampled Time Series
David Duvenaud
3:00pm|Remote Access Only - see link below

Much real-world data is sampled at irregular intervals, but most time series models require regularly-sampled data. Continuous-time models address this problem, but until now only deterministic (ODE) models or linear-Gaussian models were efficiently...
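
The model class in question replaces a deterministic ODE for the latent state with an Itô SDE, $\mathrm{d}z_t = f(z_t, t)\,\mathrm{d}t + g(z_t, t)\,\mathrm{d}W_t$, so the latent trajectory can be evaluated at arbitrary (irregular) observation times (generic form; the talk's parameterization may differ).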

May 5, 2020

Boosting Simple Learners
12:00pm|Remote Access Only - see link below

We study boosting algorithms under the assumption that the given weak learner outputs hypotheses from a class of bounded capacity. This assumption is inspired by the common convention that weak hypotheses are “rules-of-thumb” from an “easy-to-learn...
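
For orientation, the classical boosting template this work refines is AdaBoost; a minimal sketch (assuming labels in {-1, +1}; `weak_learner(X, y, w)` is a placeholder that fits a weighted weak hypothesis):

```python
import numpy as np

def adaboost(X, y, weak_learner, rounds):
    """Classical AdaBoost: maintain example weights, fit a weak
    hypothesis to the weighted data, and upweight its mistakes."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    hs, alphas = [], []
    for _ in range(rounds):
        h = weak_learner(X, y, w)                  # returns a +/-1 classifier
        pred = h(X)
        err = max(np.sum(w * (pred != y)), 1e-12)  # weighted error
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)             # upweight mistakes
        w /= w.sum()
        hs.append(h)
        alphas.append(alpha)
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in zip(alphas, hs)))
```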

May 7, 2020

Learning probability distributions; What can, What can't be done
Shai Ben-David
3:00pm|Remote Access Only - see link below

A possible high-level description of statistical learning is that it aims to learn about some unknown probability distribution (“environment”) from samples it generates (“training data”). In its most general form, assuming no prior knowledge and...

May 12, 2020

Generative Modeling by Estimating Gradients of the Data Distribution
Stefano Ermon
12:00pm|Remote Access Only - see link below

Existing generative models are typically based on explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the...
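
Once a score network $s_\theta(x) \approx \nabla_x \log p(x)$ is trained, samples come from Langevin dynamics; a minimal un-annealed sketch (this line of work anneals across noise levels, which the sketch omits):

```python
import numpy as np

def langevin_sample(score, x0, step=1e-3, n_steps=1000, seed=0):
    """Unadjusted Langevin dynamics: noisy ascent on the log-density
    using only the score function grad_x log p(x); `score` would be
    a trained network in practice."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        x += step * score(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x
```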

May 14, 2020

MathZero, The Classification Problem, and Set-Theoretic Type Theory
David McAllester
3:00pm|Remote Access Only - see link below

AlphaZero learns to play Go, chess, and shogi at a superhuman level through self-play, given only the rules of the game. This raises the question of whether a similar thing could be done for mathematics: a MathZero. MathZero would require a formal...