On the consistency of orthogonal series estimators with respect to the Jacobi polynomial system
V. V. Novikov,
A. Hudoshina
Saratov State University
Abstract:
Consider a nonparametric regression model
$
Y_{i} =m(X_{i} )+\varepsilon _{i} \, ,\, i=1,\dots,n,
$
where
$m(x)$ is the unknown regression function to be estimated, $\left\{\left(X_{i} ,Y_{i} \right)\right\}_{i=1}^{n} $ is a dataset and
$\{ \varepsilon _{i} \} _{i=1}^{n} $ are observation errors.
Suppose that the regression function can be represented as a Fourier series
$
m\left(x\right)=\sum _{j=0}^{\infty }\beta _{j} \varphi _{j} \left(x\right),
$
where the system of functions $\left\{\, \varphi _{j} \left(x\right)\right\}_{j=0}^{\, \infty } $ constitutes an orthonormal basis on
$[-1,1]$ with respect to the inner product
$
\left(f ,g \right)=\int _{-1}^{1}f\left(x\right)g\left(x\right)\rho(x)\, dx,
$
and
$\{\beta _{j}\}$ are the Fourier coefficients. Next, assume that the observations
$\{ Y _{i} \} _{i=1}^{n} $ have been taken at equidistant points
$\{ X _{i} \} _{i=1}^{n} $ over the interval
$[-1, 1]$ and let
$\left\{A_{i} \right\}_{i=1}^{n} $ be a set of disjoint intervals such that
$\cup_{i=1}^n A_{i}=[-1,1]$ and
$X_{i} \in A_{i} $,
$i=1,\dots,n$. Put
$
\hat{m}_{N(n)} \left(x\right)=\sum _{j=0}^{N\left(n\right)}\hat{\beta }_{j} \varphi _{j} \left(x\right), \;\hat{\beta }_{j} =\sum _{i=1}^{n}Y_{i} \int _{A_{i} }\varphi _{j} \left(x\right)\rho(x)\,dx,
$
where
$N(n)$ is a suitably chosen truncation level. This estimator is called an orthogonal series estimator of
$m(x)$.
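To make the construction concrete, the following is a minimal numerical sketch of this estimator (not code from the paper), assuming the Legendre case $\alpha=\beta=0$; the regression function $m(x)=|x|$, the sample size, the noise level, and the truncation rule $N(n)\approx n^{2/5}$ (anticipating condition iv) of Theorem 1 below) are all illustrative choices, and the integrals are computed by numerical quadrature.

import numpy as np
from scipy.integrate import quad
from scipy.special import eval_jacobi

# Illustrative choices (not from the paper): Legendre case alpha = beta = 0,
# a Lipschitz test function m(x) = |x|, and noise level 0.1.
alpha, beta = 0.0, 0.0
rho = lambda x: (1.0 - x)**alpha * (1.0 + x)**beta   # Jacobi weight

n = 400                     # sample size
N = int(n**0.4)             # truncation level: (N(n))^2 ~ n^{4/5} = o(n)

# Squared norms h_j of P_j^{(alpha,beta)} under rho, obtained by quadrature,
# so that phi_j = P_j / sqrt(h_j) is orthonormal on [-1, 1].
h = [quad(lambda t, j=j: eval_jacobi(j, alpha, beta, t)**2 * rho(t),
          -1.0, 1.0)[0] for j in range(N + 1)]

def phi(j, x):
    return eval_jacobi(j, alpha, beta, x) / np.sqrt(h[j])

# Equidistant design: X_i is the midpoint of the cell A_i = [edges[i], edges[i+1]].
edges = np.linspace(-1.0, 1.0, n + 1)
X = (edges[:-1] + edges[1:]) / 2.0

rng = np.random.default_rng(0)
m = lambda x: np.abs(x)                        # hypothetical regression function
Y = m(X) + 0.1 * rng.standard_normal(n)

# Estimated coefficients: beta_hat_j = sum_i Y_i * int_{A_i} phi_j(x) rho(x) dx.
beta_hat = [sum(Y[i] * quad(lambda t, j=j: phi(j, t) * rho(t),
                            edges[i], edges[i + 1])[0]
                for i in range(n))
            for j in range(N + 1)]

def m_hat(x):                                  # truncated series estimator
    return sum(beta_hat[j] * phi(j, x) for j in range(N + 1))

print(m_hat(0.3), m(0.3))                      # estimate vs. truth at x = 0.3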
In the present paper, we give consistency conditions for
$\hat{m}_{N(n)}(x)$,
provided that the regression function
$m(x)$ is Lipschitz continuous and
$
\varphi _{j} \left(x\right)= P_{j}^{\, (\alpha ,\beta )} (x),\,j=0,1,\dots,
$
is the system of Jacobi polynomials, orthonormal with respect to the weight $\rho(x)=(1-x)^{\alpha}(1+x)^{\beta}$, under certain restrictions on the exponents
$\alpha,\, \beta $. The main result is as follows.
Theorem 1. Suppose that the following conditions are satisfied:
i) $\mathsf{E}\varepsilon _{i} =0$,
$\mathsf{E}(\varepsilon _{i}\varepsilon _{j}) =0$,
$i\neq j$,
and $\mathsf{E}\varepsilon _{i}^2 <C$ for some constant $C>0$,
$i=1,\dots,n$;
ii)
$
m(\cdot) \in \mathrm{Lip}_M 1,
$
i.e., $|m(x)-m(y)|\le M\left|x-y\right|$ for all $x,y\in[-1,1]$;
iii)
$
p:=\min\{\alpha,\beta\}\ge-1/2;
$
iv)
$
(N(n))^2=o\left\{A_n(\alpha,\beta)\right\},\;n\to\infty,
$
where
$A_n(\alpha,\beta)=n$ if
$p>-1/2$, and
$A_n(\alpha,\beta)=n/\log n$ if
$p=-1/2$.
Then $\hat{m}_{N(n)} \left(x\right)\stackrel{p}{\longrightarrow} m\left(x\right)$ as $N\left(n\right)\to \infty$, for every
$x\in(-1,1)$.
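For example, if $p>-1/2$, then $A_n(\alpha,\beta)=n$ and condition iv) is satisfied by
$
N(n)=\lfloor n^{2/5}\rfloor,
$
since $(N(n))^{2}\le n^{4/5}=o(n)$; the same choice also works for $p=-1/2$, because $n^{4/5}=o(n/\log n)$.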
Theorem 2.
Suppose that conditions {i)–iii)} of the previous theorem are satisfied,
$q=\max\{\alpha,\beta\}<1/2$, and
$
\left(N(n)\right)^{2q+3}=o\left\{A_n(\alpha,\beta)\right\},\;n\to\infty.
$
Then $\hat{m}_{N(n)} \left(x\right)\stackrel{p}{\longrightarrow} m\left(x\right)$ as $N\left(n\right)\to \infty$, for every
$x\in[-1,1]$.
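For example, in the Legendre case $\alpha=\beta=0$ we have $q=0<1/2$ and $A_n(\alpha,\beta)=n$, so the condition above reduces to $(N(n))^{3}=o(n)$ and is satisfied by
$
N(n)=\lfloor n^{1/4}\rfloor,
$
giving consistency up to the endpoints $x=\pm 1$.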
Keywords:
nonparametric regression, consistency, estimator, orthogonal series, Jacobi polynomials.
UDC:
519.23
MSC: 62G08