Abstract:
Let $\mathbf x[\cdot]$ be a stationary Gaussian process with zero mean and spectral density $f$, let $\mathscr F$ be the $\sigma$-algebra generated by the random variables $\mathbf x[\varphi]$, $\varphi\in D(R^1)$, and let $\mathscr F_t$, $t>0$, be the $\sigma$-algebra generated by the random variables $\mathbf x[\varphi]$ with $\operatorname{supp}\varphi\subset[-t,t]$. We denote by $\mathscr P(f)$ the Gaussian measure on $\mathscr F$ generated by $\mathbf x$, and by $\mathscr P_t(f)$ the restriction of $\mathscr P(f)$ to $\mathscr F_t$. Suppose the nonnegative functions $f$ and $g$ are chosen in such a way that the measures $\mathscr P_t(f)$ and $\mathscr P_t(g)$ are mutually absolutely continuous, and put
$$
\mathscr D_t(f,g)=\ln\frac{d\mathscr P_t(f)}{d\mathscr P_t(g)}\,.
$$
For a fixed $g(u)$ and $f(u)=f_t(u)$ close in a suitable sense to $g(u)$, the asymptotic normality of $\mathscr D_t(f,g)$ is proved under certain regularity conditions.
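
As a purely illustrative companion to the abstract, the following sketch simulates a discrete-time analogue of the statistic $\mathscr D_t(f,g)$ by means of the Whittle approximation to the Gaussian log-likelihood ratio. Everything in it is an assumption made for the sake of the example: the AR(1) spectral densities, the local perturbation $\rho_f=\rho_g+3/\sqrt n$, and the helper names \texttt{spectral\_density\_ar1}, \texttt{periodogram}, \texttt{whittle\_log\_lr} are not taken from the paper, and the Whittle sum only approximates the exact likelihood ratio studied there.
\begin{verbatim}
import numpy as np

def spectral_density_ar1(lams, rho, sigma2=1.0):
    # Spectral density of the AR(1) process x_k = rho*x_{k-1} + e_k.
    return sigma2 / (2 * np.pi * (1 - 2 * rho * np.cos(lams) + rho ** 2))

def periodogram(x):
    # Periodogram at the Fourier frequencies 0 < lam_j < pi.
    n = len(x)
    lams = 2 * np.pi * np.arange(1, n // 2) / n
    I = np.abs(np.fft.fft(x)[1 : n // 2]) ** 2 / (2 * np.pi * n)
    return lams, I

def whittle_log_lr(x, f, g):
    # Whittle approximation of ln dP(f)/dP(g) evaluated on the sample x.
    lams, I = periodogram(x)
    fv, gv = f(lams), g(lams)
    return -0.5 * np.sum(np.log(fv / gv) + I * (1.0 / fv - 1.0 / gv))

rng = np.random.default_rng(0)
n, n_rep = 4096, 300
rho_g = 0.5
rho_f = rho_g + 3.0 / np.sqrt(n)   # f plays the role of f_t: a local perturbation of g
g = lambda lams: spectral_density_ar1(lams, rho_g)
f = lambda lams: spectral_density_ar1(lams, rho_f)

stats = []
for _ in range(n_rep):
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0] / np.sqrt(1.0 - rho_f ** 2)   # stationary initial value
    for k in range(1, n):
        x[k] = rho_f * x[k - 1] + e[k]        # sample path generated under f
    stats.append(whittle_log_lr(x, f, g))

print("sample mean:", np.mean(stats), "sample std:", np.std(stats))
\end{verbatim}
A histogram of \texttt{stats} should look approximately Gaussian, which is the qualitative content of the asymptotic normality statement above (established in the paper under its own regularity conditions, not those of this sketch).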