Abstract:
We consider the problem of a posteriori change-point detection
for a sequence of independent,
identically distributed random variables. We propose to use $d$-risks in place
of the Type I and Type II error probabilities.
We construct an asymptotically optimal test that minimizes one $d$-risk while
guaranteeing a prescribed bound on the other.
Keywords: change-point detection, hypothesis discrimination, close hypotheses, $d$-a posteriori approach, $d$-optimality, weak convergence, Wiener process functional.