Abstract:
The paper proposes a generalization of the entropy estimator constructed in [1]. To construct the estimator, we first select a metric on the space of sequences. This metric is based on a matrix that can be interpreted as an edge coloring of a complete graph with loops. The generalization consists in replacing the logarithm in the entropy estimate with a similar function that may be arbitrary on the given range. Since the proposed function is not monotone, the problem of optimizing the average deviation, which is a quadratic optimization problem, is solved over the whole space rather than on the simplex. The main properties of the estimator, such as asymptotic unbiasedness and the power-law decrease of the variance, are proved in a similar way.
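As a purely schematic illustration of the generalization described above (the symbols p_i and f below are illustrative and not taken from the paper): the classical plug-in form of the entropy uses the logarithm,

    H(p) = -\sum_{i} p_i \log p_i,

while the generalized functional replaces the logarithm with a function f that may be chosen arbitrarily on the relevant range,

    H_f(p) = -\sum_{i} p_i \, f(p_i).

Because f need not be monotone, the associated quadratic problem of optimizing the average deviation is posed over the whole space rather than restricted to the probability simplex.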