Abstract:
Let $X(k)=X(u,k)$, $k=0,1,\dots$, be a time-homogeneous real-valued ergodic Markov chain with initial value $u\equiv X(u,0)=X(0)$. We study the asymptotic behavior of the probability that the trajectory $X(k)$ crosses a given boundary $g(k)$, $k=0,1,\dots,n$, that is, of the probability
$$
\mathbf{P}\Bigl\{\max_{k\le n}(X(k)-g(k))>0\Bigr\},
$$
where the boundary $g$ depends, in general, on $n$ and on a growing parameter $x$ in such a way that $\min_{k\le n}g(k)\to\infty$ as $x\to\infty$. It is assumed that the distributions of the increments $\xi(u)=X(u,1)-u$ of the chain either have regularly varying tails or are majorized by such tails.
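As a rough numerical illustration only (not part of the paper), the following Python sketch estimates the crossing probability above by Monte Carlo for a toy ergodic chain whose increments have regularly varying tails; the AR(1)-type transition $X(u,1)=au+Z$ with centered Pareto innovations $Z$, the flat boundary $g(k)\equiv x$, and all parameter values are illustrative assumptions, not the setting of the paper.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def step(u, a=0.8, alpha=2.5):
    # One transition of a toy ergodic chain X(u,1) = a*u + Z, where Z is a
    # centered Pareto(alpha) innovation; the increment xi(u) = X(u,1) - u
    # then has a regularly varying right tail of index alpha.
    z = rng.pareto(alpha) + 1.0               # classical Pareto on [1, inf)
    return a * u + z - alpha / (alpha - 1.0)  # subtract E[Z] to center

def crossing_probability(u, n, x, trials=20_000):
    # Monte Carlo estimate of P{max_{k<=n} (X(k) - g(k)) > 0} for the flat
    # boundary g(k) = x, starting from X(0) = u.
    crossings = 0
    for _ in range(trials):
        X = u
        for _k in range(n + 1):               # inspect X(0), X(1), ..., X(n)
            if X - x > 0:
                crossings += 1
                break
            X = step(X)
    return crossings / trials

# The estimate decays roughly polynomially in x, as a regularly varying
# increment tail suggests.
for x in (10.0, 20.0, 40.0):
    print(x, crossing_probability(u=0.0, n=50, x=x))
\end{verbatim}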
Limit theorems describing the asymptotic behavior of these probabilities are obtained under broad conditions in the domains of both large and normal deviations, including theorems that hold “uniformly on the real line” and give explicit forms for the right-hand sides. Asymptotic properties of regeneration cycles to a positive atom are investigated for Harris recurrent Markov chains, and an analogue of the law of the iterated logarithm is established.
Keywords: Markov chains, large deviations, boundary crossing, heavy tails, law of the iterated logarithm, uniform limit theorems.