Abstract:
Non-asymptotic estimates are given for the mean rate of convergence of the objective functional for the stochastic Robbins-Monro and Kiefer-Wolfowitz algorithms, and for random search with a statistical gradient and with paired trials. Optimal algorithm parameters are established which ensure the fastest decrease of the estimates as $n\to\infty$. The functions to be minimized include convex functions and functions with exponential degeneracy.
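The Kiefer-Wolfowitz procedure mentioned above minimizes a function using only noisy evaluations, replacing the gradient by a finite-difference estimate with shrinking step and perturbation sizes. A minimal one-dimensional sketch follows; the specific gains $a_n = a/n$ and $c_n = c/n^{1/4}$, the test function, and all parameter values are illustrative assumptions, not the paper's setting.

```python
import random

def kiefer_wolfowitz(f_noisy, x0, n_iter=5000, a=1.0, c=1.0):
    """Kiefer-Wolfowitz stochastic approximation (illustrative sketch):
    x_{n+1} = x_n - a_n * (f(x_n + c_n) - f(x_n - c_n)) / (2 * c_n),
    with assumed gains a_n = a/n and c_n = c/n**0.25."""
    x = x0
    for n in range(1, n_iter + 1):
        a_n = a / n            # step size, decreasing in n
        c_n = c / n ** 0.25    # finite-difference perturbation
        grad_est = (f_noisy(x + c_n) - f_noisy(x - c_n)) / (2 * c_n)
        x -= a_n * grad_est
    return x

random.seed(0)
# Noisy evaluations of f(x) = (x - 2)^2; minimizer is x = 2
noisy_f = lambda x: (x - 2) ** 2 + random.gauss(0, 0.1)
x_star = kiefer_wolfowitz(noisy_f, x0=0.0)
```

The iterate should settle near the minimizer $x=2$; the actual non-asymptotic rates studied in the paper depend on the chosen gain sequences.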