VIDEO LIBRARY
Stochastic online gradient-free method with inexact oracle and huge-scale optimization
A. V. Gasnikov, Moscow Institute of Physics and Technology (State University), Dolgoprudny, Moscow Region
Abstract: In the talk we generalize the results of Yu. Nesterov, "Random gradient-free minimization of convex functions", CORE Discussion Paper 2011/1, 2011. First of all, we consider the online (two-point) case. Moreover, we consider the general prox case, that is, we do not restrict ourselves to the Euclidean structure. The main ingredient of our generalization, however, is the assumption of an inexact zero-order oracle (an oracle that returns the value of the function). The inexactness has not only a stochastic component but also a deterministic one, which leads to a bias in the stochastic gradient approximation obtained via finite differences. The rather unexpected result is that a properly modified mirror descent algorithm admits the same complexity estimate with an inexact oracle, provided the noise level is no more than the desired accuracy (in function value). All the results are obtained in terms of probabilities of large deviations. Moreover, the proposed algorithms attain unimprovable complexity estimates up to a multiplicative constant. These algorithms can be used for a special class of huge-scale optimization problems. We plan to briefly describe one such problem (coming from Yandex) if time permits. Joint work with Anastasia Lagunovskaia.
Language of the talk: English
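A note on the construction (a minimal sketch following the cited Nesterov paper and standard two-point schemes, not necessarily the talk's exact notation): with an inexact zero-order oracle $\tilde{f}(x) = f(x) + \delta(x)$, $|\delta(x)| \le \delta$, and a random direction $e$ drawn uniformly from the unit sphere $S^{n-1} \subset \mathbb{R}^n$, one standard two-point finite-difference gradient estimator is

    $g_\tau(x) = \frac{n}{2\tau} \left( \tilde{f}(x + \tau e) - \tilde{f}(x - \tau e) \right) e.$

For $\delta \equiv 0$ its expectation is the gradient of a smoothed version of $f$ (the average of $f$ over the ball of radius $\tau$); the deterministic part of the noise adds a bias of norm at most $n\delta/\tau$, which is the finite-difference bias mentioned in the abstract.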