Seminar on Probability Theory and Mathematical Statistics
Nonasymptotic Analysis of Stochastic Gradient Descent with the Richardson–Romberg Extrapolation

Speaker: S. V. Samsonov
Abstract: We address the problem of solving strongly convex and smooth minimization problems using the stochastic gradient descent (SGD) algorithm with a constant step size. Previous works suggested combining the Polyak–Ruppert averaging procedure with the Richardson–Romberg extrapolation technique to reduce the asymptotic bias of SGD at the expense of a mild increase in the variance. We significantly extend previous results by providing an expansion of the mean-squared error of the resulting estimator with respect to the number of iterations.

Based on the joint work https://arxiv.org/abs/2410.05106
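The combination described in the abstract can be illustrated with a minimal sketch (not the paper's actual setup): run constant-step SGD twice with step sizes γ and 2γ, take the Polyak–Ruppert (running) average of each trajectory, and form the Richardson–Romberg extrapolation 2·x̄(γ) − x̄(2γ) so that the leading O(γ) bias terms cancel. The toy objective f(x) = exp(x) − x (with minimizer x* = 0 and additive gradient noise) and all parameter values below are illustrative assumptions.

```python
import numpy as np

def sgd_pr_average(step, n_iters, sigma=1.0, seed=0):
    """Constant-step SGD on the toy objective f(x) = exp(x) - x,
    with additive Gaussian gradient noise of scale `sigma`.
    Returns the Polyak-Ruppert (running) average of the iterates."""
    rng = np.random.default_rng(seed)
    x, avg = 0.0, 0.0
    for t in range(n_iters):
        grad = np.exp(x) - 1.0 + sigma * rng.standard_normal()  # noisy f'(x)
        x -= step * grad
        avg += (x - avg) / (t + 1)  # incremental running mean
    return avg

gamma, n = 0.05, 400_000
x_gamma = sgd_pr_average(gamma, n, seed=1)        # averaged iterate, step gamma
x_2gamma = sgd_pr_average(2 * gamma, n, seed=2)   # independent run, step 2*gamma

# Richardson-Romberg extrapolation: the O(gamma) bias cancels,
# at the cost of a modest increase in variance.
x_rr = 2.0 * x_gamma - x_2gamma
```

Because f has a nonzero third derivative at the minimizer, each averaged iterate carries a bias proportional to the step size; the extrapolated estimator removes the first-order term, which is the effect the abstract's mean-squared-error expansion quantifies.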