Published: Feb. 26, 2016
Friday, February 26, 2016 3:00 PM - 4:00 PM
Main Campus - Engineering Classroom Wing - 265
University of Colorado Boulder

Minimizing the Risk of Risk: Computing Risk Measures via Empirical Risk Minimization


An important open question in statistics and machine learning is the characterization of which statistics can be computed by minimizing a loss function over a data set, a procedure known as Empirical Risk Minimization (ERM). We propose a more practical question: since any statistic can be computed if one includes enough optimization parameters, what is the minimum number of parameters needed to compute it? Building on previous work, we formalize this in a new notion of complexity, and establish several general results and techniques for proving upper and lower complexity bounds. These results provide tight bounds for a large class of statistics known as Bayes risks, which includes spectral risk measures, a popular class of risk measures in the finance literature. We conclude with applications to the superquantile regression of Rockafellar and Royset, and to the risk quadrangles of Rockafellar and Uryasev.
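
To make the ERM idea concrete (this sketch is an illustration added here, not material from the talk), the superquantile (CVaR) at level alpha is a standard example of a risk measure that becomes ERM-computable once a single auxiliary optimization parameter is introduced, via the Rockafellar-Uryasev formula CVaR_alpha(X) = min over c of c + E[max(X - c, 0)] / (1 - alpha). A minimal Python version, assuming NumPy and SciPy, might look like:

    # Sketch: empirical superquantile (CVaR) via its one-parameter ERM formulation.
    import numpy as np
    from scipy.optimize import minimize_scalar

    def cvar_erm(samples, alpha):
        """Minimize the Rockafellar-Uryasev objective over the scalar parameter c."""
        x = np.asarray(samples, dtype=float)
        objective = lambda c: c + np.maximum(x - c, 0.0).mean() / (1.0 - alpha)
        res = minimize_scalar(objective, bounds=(x.min(), x.max()), method="bounded")
        return res.fun

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        losses = rng.standard_normal(100_000)
        alpha = 0.95
        # Cross-check: average of the worst (1 - alpha) fraction of the losses.
        tail = np.sort(losses)[int(np.ceil(alpha * losses.size)):]
        print(cvar_erm(losses, alpha), tail.mean())  # the two values should agree closely

Here a single extra parameter c suffices; the talk's complexity notion asks, in general, how many such auxiliary parameters a given statistic requires.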