Abstract:
In this paper, the iterations $J_{m+1}=J_m-\varepsilon J_mL_{S_m}J_m$, $m=0,1,2,\ldots$, $\varepsilon>0$, are considered. Here $J_m$ and $L_{S_m}$ are self-adjoint operators on $\mathbb R^N$, $L_{S_m}=(\cdot,S_m)S_m$, where the $S_m$ are independent identically distributed random vectors satisfying some additional conditions. The initial operator $J_0$ is nonrandom. The asymptotic behavior of the rescaled operator $\tilde{J}_m=\|J_m\|^{-1}J_m$ is examined. Problems of this type arise in neural network theory in the study of the REM sleep phenomenon. It is proven that one of the following three relations holds almost surely: I. $\lim_{m\to\infty}\tilde{J}_m=P_{\mathcal L}$; II. $\lim_{m\to\infty}\tilde{J}_m=-P_{\xi}$; III. $J_m=0$ for all $m$ starting from some $m_0$. Here $P_{\mathcal L}$ and $P_{\xi}$ are the orthogonal projectors onto a random subspace $\mathcal L\subset\mathbb R^N$ and onto the one-dimensional subspace spanned by a random nonzero vector $\xi$, respectively. Denote by $P_+(\varepsilon)$ and $P_-(\varepsilon)$ the probabilities of the asymptotic behaviors I and II, respectively. For nonzero positive semidefinite $J_0$ it is shown that $\lim_{\varepsilon\to+0}P_+(\varepsilon)=1$ and $\lim_{\varepsilon\to+\infty}P_-(\varepsilon)=1$; if, however, $J_0$ has at least one negative eigenvalue, then $P_-(\varepsilon)\equiv1$.
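For intuition, the following is a minimal numerical sketch of the iteration, not taken from the paper: it uses the identity $J_m L_{S_m} J_m = (\cdot, J_m S_m) J_m S_m$, valid since $J_m$ is self-adjoint, and draws the $S_m$ as i.i.d. standard Gaussian vectors. The Gaussian law, the dimension, the step size $\varepsilon$, and the iteration count are all illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def iterate(J0, eps, num_steps=20000, rng=None):
    """Simulate J_{m+1} = J_m - eps * J_m L_{S_m} J_m with L_S = (., S) S.

    Since J_m is self-adjoint, J_m L_{S_m} J_m is the rank-one operator
    x -> (x, J_m S_m) J_m S_m, i.e. the outer product of J_m S_m with itself.
    """
    rng = np.random.default_rng() if rng is None else rng
    J = J0.astype(float).copy()
    N = J.shape[0]
    for _ in range(num_steps):
        S = rng.standard_normal(N)        # illustrative choice of i.i.d. S_m
        JS = J @ S                        # the vector J_m S_m
        J = J - eps * np.outer(JS, JS)    # subtract eps * J_m L_{S_m} J_m
        if np.linalg.norm(J, 2) == 0:
            break                         # case III: J_m vanished
    return J

N = 5
rng = np.random.default_rng(0)
A = rng.standard_normal((N, N))
J0 = A @ A.T                              # nonzero positive semidefinite J_0
J = iterate(J0, eps=0.01, rng=rng)
print(J / np.linalg.norm(J, 2))           # rescaled iterate \tilde{J}_m
```

With a small $\varepsilon$ and positive semidefinite $J_0$ as above, the printed rescaled iterate should typically be close to an orthogonal projector, matching behavior I; a large $\varepsilon$ tends instead toward $-P_{\xi}$, a negated rank-one projector.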