Abstract:
New inequalities are proved for the variance of the Pitman estimators (minimum variance equivariant estimators) of $\theta$ constructed from samples of fixed size from populations $F(x-\theta)$. The inequalities are closely related to the classical Stam inequality for the Fisher information, its analog in small samples, and a powerful variance drop inequality. The only condition required is finite variance of $F$; even the absolute continuity of $F$ is not assumed. As corollaries of the main inequalities for small samples, one obtains alternative proofs of known properties of the Fisher information, as well as new observations such as the fact that the variance of the Pitman estimator based on a sample of size $n$, scaled by $n$, monotonically decreases in $n$. Extensions of the results to polynomial versions of the Pitman estimators and to a multivariate location parameter are given. Moreover, the search for a characterization of the equality conditions for one of the inequalities leads to a Cauchy-type functional equation for independent random variables, and an interesting new behavior of its solutions is described.
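For the reader's orientation, the Pitman estimator and the monotonicity property highlighted above can be sketched as follows; the notation is a standard one and is an editorial assumption, not quoted from the paper:

```latex
% Pitman estimator of the location parameter \theta from a sample
% X_1,\dots,X_n drawn from F(x-\theta) with finite variance
% (a standard representation; the paper's notation may differ):
\hat\theta_n \;=\; X_1
  - \mathbb{E}_{\theta=0}\!\left[\, X_1 \,\middle|\, X_2 - X_1,\ \dots,\ X_n - X_1 \,\right].

% The monotonicity statement from the abstract: the scaled variance
% n \, \mathrm{Var}(\hat\theta_n) is nonincreasing in the sample size n,
(n+1)\,\mathrm{Var}\!\left(\hat\theta_{n+1}\right)
  \;\le\; n\,\mathrm{Var}\!\left(\hat\theta_n\right), \qquad n \ge 1.
```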
Key words and phrases: Fisher information, location parameter, monotonicity of the variance, Stam inequality.