Abstract:
We consider the problem of estimating an unknown vector $\beta\in\mathbb R^p$ in the linear model $Y=X\beta+\sigma\xi$, where $\xi\in\mathbb R^n$ is a standard white Gaussian noise and $X$ is a known $n\times p$ matrix with $n\ge p$. It is assumed that $p$ is large and $X$ is ill-conditioned. To estimate $\beta$ in this situation, we use a family of spectral regularizations of the maximum likelihood method, $\widetilde\beta^\alpha(Y)= H^\alpha(X^\top X)\widehat\beta^\circ(Y)$, $\alpha\in\mathbb R^+$, where $\widehat\beta^\circ(Y)$ is the maximum likelihood estimate of $\beta$ and $\{H^\alpha(\cdot)\colon\mathbb R^+\to[0,1],\ \alpha\in\mathbb R^+\}$ is a given ordered family of functions indexed by a regularization parameter $\alpha$. The final estimate of $\beta$ is constructed as a convex combination (in $\alpha$) of the estimates $\widetilde\beta^\alpha(Y)$, with weights chosen on the basis of the observations $Y$. We derive inequalities for the large deviations of the norm of the prediction error of this method.
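The construction above can be sketched numerically. The sketch below uses the Tikhonov (ridge) family $H^\alpha(t)=t/(t+\alpha)$ as one concrete member of an ordered family mapping $\mathbb R^+\to[0,1]$; the uniform weights over the grid of $\alpha$ values are a placeholder, since the paper's method chooses data-driven weights from $Y$. All names and parameter values are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 10

# Ill-conditioned design: orthogonal factors with decaying singular values.
U, _ = np.linalg.qr(rng.standard_normal((n, p)))
V, _ = np.linalg.qr(rng.standard_normal((p, p)))
s = 10.0 ** (-np.arange(p) / 3.0)      # condition number ~ 1e3 (illustrative)
X = (U * s) @ V.T

beta = rng.standard_normal(p)
sigma = 1e-3
Y = X @ beta + sigma * rng.standard_normal(n)

# Maximum likelihood estimate (least squares solution).
beta_ml, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Spectral regularization: apply H^alpha to the spectrum of X^T X.
evals, evecs = np.linalg.eigh(X.T @ X)

def beta_alpha(alpha):
    """Tikhonov member H^alpha(t) = t / (t + alpha), values in [0, 1]."""
    H = evals / (evals + alpha)
    return evecs @ (H * (evecs.T @ beta_ml))

# Convex combination over a grid of alphas; uniform weights are a
# placeholder for the paper's data-driven weights.
alphas = np.logspace(-8, 0, 9)
weights = np.full(len(alphas), 1.0 / len(alphas))
beta_agg = sum(w * beta_alpha(a) for w, a in zip(weights, alphas))

# Prediction error norm ||X(beta_agg - beta)||, the quantity the
# large-deviation inequalities control.
pred_err = np.linalg.norm(X @ (beta_agg - beta))
```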