
Theory Stoch. Process., 2020, Volume 25(41), Issue 2, Pages 25–36 (Mi thsp316)

Asymptotics of error probabilities of optimal tests

V. Kanišauskas, K. Kanišauskienė

Institute of Regional Development, Vilnius University Šiauliai Academy, P. Višinskio st. 25, LT-76351 Šiauliai, Lithuania

Abstract: We consider the type I and type II error probabilities of asymptotically optimal tests (Neyman-Pearson, minimax, Bayesian) when two simple hypotheses $H^{t}_{1}$ and $H^{t}_{2}$, parametrized by time $t \ge 0$, are tested on the basis of an observation $X^t$ of arbitrary nature. The paper details the conditions for the asymptotic decrease of the error probabilities of the optimal tests, which are governed by the Hellinger integral of order $\alpha$ between the measures $P^{t}_{1}$ and $P^{t}_{2}$: for the minimax and Bayesian tests it suffices to study the Hellinger integral for $\alpha \in \left( 0, 1 \right)$, whereas for the Neyman-Pearson test only its behaviour in a neighbourhood of the point $\alpha=1$ matters. Since the Kullback-Leibler information distance is always larger than the Chernoff distance, we find that, for the Neyman-Pearson test, the probability of a type II error decreases faster than for the minimax or Bayesian tests. This is illustrated by the examples of marked point processes in the i.i.d. case, a non-homogeneous Poisson process, and a geometric renewal process presented at the end of the paper.
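As a rough illustration of this comparison (not taken from the paper itself; a standard computation under the additional assumption of two homogeneous Poisson processes with intensities $\lambda_1 \neq \lambda_2$ observed on $[0,t]$): the Hellinger integral of order $\alpha$ is
$$ H_\alpha\left(P^{t}_{1}, P^{t}_{2}\right) = \exp\{-t\, g(\alpha)\}, \qquad g(\alpha) = \alpha\lambda_1 + (1-\alpha)\lambda_2 - \lambda_1^{\alpha}\lambda_2^{1-\alpha}, $$
so the Chernoff distance per unit time is $\rho = \max_{\alpha \in (0,1)} g(\alpha)$, while the Kullback-Leibler distances per unit time are $K(P_1\|P_2) = \lambda_1\log(\lambda_1/\lambda_2) - \lambda_1 + \lambda_2 = -g'(1)$ and $K(P_2\|P_1) = g'(0)$. Concavity of $g$ together with $g(0)=g(1)=0$ gives $\rho \le \min\{K(P_1\|P_2),\, K(P_2\|P_1)\}$, which is the sense in which the Neyman-Pearson type II error (decaying at the Kullback-Leibler rate) vanishes faster than the minimax and Bayesian errors (decaying at the Chernoff rate).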

Keywords: Hellinger integral, Neyman-Pearson test, minimax test, Bayesian test, Kullback-Leibler information, Chernoff information.

MSC: 62F05, 60G55

Publication language: English


