


Empirical entropy, minimax regret and minimax risk

A. Tsybakov

CREST-ENSAE


https://www.youtube.com/watch?v=pTsxFVx2oaY

Abstract: The study of the minimax risk over a class of functions and the study of the minimax regret based on the excess risk are two parallel developments: the former has mostly been analyzed within Nonparametric Statistics, the latter within Statistical Learning Theory. This talk aims to bring out a connection between these two objects. Considering random design regression with squared loss, we propose a method that aggregates empirical minimizers over appropriately chosen random subsets, and we establish sharp oracle inequalities for its risk. We show that, under the $\epsilon^{-p}$ growth of the empirical $\epsilon$-entropy, the excess risk of the proposed method attains the rate $n^{-\frac{2}{2+p}}$ for $p\in(0,2]$ and $n^{-\frac{1}{p}}$ for $p > 2$. This leads to the conclusion that the rates of estimation in well-specified models (minimax risk) and in misspecified models (minimax regret) are equivalent in the regime $p \in (0,2]$. In other words, for $p \in (0,2]$ the problem of statistical learning enjoys the same minimax rate as the problem of statistical estimation. Our oracle inequalities also imply the $\log(n)/n$ rates for Vapnik-Chervonenkis type classes without the usual convexity assumption on the class; we show that these rates are optimal. As another corollary, we obtain optimal rates of $s$-sparse convex aggregation. Finally, we introduce a more general risk measure that realizes a smooth transition between the minimax risk and the minimax regret depending on the magnitude of the approximation error; the minimax risk and the minimax regret appear as the two extremes of this scale. We provide sharp oracle inequalities for this new risk measure. [Joint work with Alexander Rakhlin and Karthik Sridharan]
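For concreteness, the two objects compared in the abstract and the rate claim can be written in display form. The notation below (the covering number $N$, the risk $R_P$ under a distribution $P$, the estimator $\hat f_n$) is standard but assumed rather than taken from the abstract, so this is a sketch of the statement, not a quotation from the paper:

% Minimax risk: the model is well-specified, i.e. the regression
% function f belongs to the class \mathcal{F}.
\[
  \mathcal{R}_n(\mathcal{F}) \;=\; \inf_{\hat f}\,\sup_{f \in \mathcal{F}} \mathbb{E}\,\|\hat f - f\|^2 .
\]
% Minimax regret: the model may be misspecified, and performance is
% measured by the excess risk over the best element of \mathcal{F}.
\[
  \mathcal{V}_n(\mathcal{F}) \;=\; \inf_{\hat f}\,\sup_{P}\, \mathbb{E}\Bigl[\, R_P(\hat f) \;-\; \inf_{g \in \mathcal{F}} R_P(g) \Bigr].
\]
% Rate claim of the talk: if the empirical \epsilon-entropy grows as
% \epsilon^{-p}, i.e. \log N(\epsilon, \mathcal{F}) \le C\,\epsilon^{-p},
% then the excess risk of the aggregated estimator \hat f_n satisfies
\[
  \mathbb{E}\Bigl[\, R_P(\hat f_n) \;-\; \inf_{g \in \mathcal{F}} R_P(g) \Bigr]
  \;\lesssim\;
  \begin{cases}
    n^{-\frac{2}{2+p}}, & 0 < p \le 2,\\[4pt]
    n^{-\frac{1}{p}},   & p > 2,
  \end{cases}
\]
% and for p in (0,2] this matches the well-specified minimax rate,
% which is the equivalence asserted in the talk.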

Language: English


© Steklov Math. Inst. of RAS, 2024