Abstract:
The paper is concerned with the asymptotic properties of a scalar stochastic approximation procedure in which dependent noise contaminates observations of the gradient of the function to be optimized. Sufficient conditions are established for almost sure convergence and for mean square convergence, and conditions ensuring asymptotic normality are provided. The rate of convergence is computed. Examples with different forms of noise dependence are studied.
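For orientation, a minimal sketch of the kind of recursion in question, assuming a standard Robbins–Monro-type form; the symbols $x_n$, $a_n$, and $\xi_n$ below are illustrative and not taken from the paper:
\[
x_{n+1} = x_n - a_n\bigl(f'(x_n) + \xi_n\bigr), \qquad a_n > 0,\quad \sum_n a_n = \infty,\quad \sum_n a_n^2 < \infty,
\]
where $f'$ is the derivative (scalar gradient) of the function $f$ to be optimized and $(\xi_n)$ is a dependent noise sequence corrupting the gradient observations.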