Theoretical Machine Learning Seminar
Fast minimization of structured convex quartics
Recent progress in optimization theory has shown how we may harness second-order (Hessian) information to achieve faster rates for both convex and non-convex optimization problems. Given this observation, it is natural to ask what sort of advantage, if any, can be obtained by moving to even higher-order derivative information. In this talk, I will present the algorithm FastQuartic, which combines higher-order derivative information, as part of an efficient tensor method, with “highly smooth acceleration” to guarantee even faster convergence for l_4-regression, and more generally for a large class of convex quartic optimization problems.
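As a small illustration of the problem class mentioned above (not of the FastQuartic algorithm itself), the l_4-regression objective f(x) = (1/4)||Ax - b||_4^4 is a structured convex quartic with gradient A^T (Ax - b)^3, where the cube is taken elementwise. The sketch below, with hypothetical helper names, computes the loss and gradient and checks the gradient against a finite-difference estimate:

```python
import numpy as np

def l4_loss(A, b, x):
    # f(x) = (1/4) * sum_i (A x - b)_i^4, a convex quartic in x
    r = A @ x - b
    return 0.25 * np.sum(r ** 4)

def l4_grad(A, b, x):
    # grad f(x) = A^T (A x - b)^3, cube applied elementwise
    r = A @ x - b
    return A.T @ (r ** 3)

# Finite-difference sanity check of the gradient on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)
x = rng.standard_normal(3)

eps = 1e-6
g = l4_grad(A, b, x)
g_fd = np.array([
    (l4_loss(A, b, x + eps * e) - l4_loss(A, b, x - eps * e)) / (2 * eps)
    for e in np.eye(3)
])
```

The analytic gradient and the central-difference estimate should agree closely, confirming the quartic structure of the objective.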
Date & Time
April 08, 2019 | 12:15pm – 1:45pm
Location
White Levy Room
Speakers
Brian Bullins
Affiliation
Princeton University