
Vestn. Astrakhan State Technical Univ. Ser. Management, Computer Sciences and Informatics, 2014, Number 4, Pages 67–72 (Mi vagtu345)

COMPUTER SOFTWARE AND COMPUTING EQUIPMENT

Relations for an alternative determination of the amount of information under various distribution laws

A. L. Kozlov, E. P. Selivanov


Abstract: The article discusses the shortcomings of differential Shannon entropy and proposes an alternative measure of information, the amount of information Q, which is free of these drawbacks. It is expressed quantitatively through the Lebesgue–Stieltjes integral and exists both for mathematical models given as continuous random variables (CRV) and for discrete random variables (DRV). A mathematical description of Q is given, and its potential advantages over Shannon entropy are substantiated. The problem of identifying the distribution law of a random variable is usually understood as the problem of choosing the parametric probability model that best fits the experimental results. Measurement errors, regarded as random variables, are influenced by many factors of random and non-random origin acting continuously or episodically. Nevertheless, the true distribution law describing the uncertainty of a particular measurement system remains unknown, despite all attempts to identify it. On the basis of the measured data and theoretical considerations one can only select a probabilistic model that, in some sense, best approximates this law. If the constructed model is adequate, that is, the criteria used give no grounds for rejecting it, then all probabilistic characteristics of the random component of the measuring device's error can be calculated from this model; they will differ from the true values only because of the non-excluded systematic (unobserved or unrecorded) component of the measurement error. The amount of information Q minimizes the error in solving the problem of identifying the experimental distribution laws of DRV or CRV. The proposed measure does not depend on the numerical characteristics of a DRV or CRV, such as the mathematical expectation, variance, correlation moments, etc. The results of calculating the amount of information for different distribution laws of DRV and CRV are given. As an example, the calculation of the amount of information for the discrete Poisson law is considered.
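
For context, the classical quantities against which the proposed measure Q is contrasted are the Shannon entropy of a DRV and the differential entropy of a CRV; the precise Lebesgue–Stieltjes construction of Q is given in the body of the paper. A minimal sketch of the standard definitions, in the usual notation:

$$ H(X) = -\sum_{i} p_i \log p_i \quad \text{(DRV)}, \qquad h(X) = -\int_{-\infty}^{+\infty} f(x)\,\log f(x)\,dx \quad \text{(CRV)}. $$

Unlike H(X), the differential entropy h(X) can take negative values and is not invariant under a change of scale, since h(aX) = h(X) + \log|a|; these are the kinds of shortcomings that motivate an alternative measure. For the discrete Poisson law with parameter \lambda, p_k = e^{-\lambda}\lambda^k/k!, the classical Shannon entropy (stated here only as a point of comparison, not the quantity Q computed in the paper) is

$$ H = \lambda(1 - \log\lambda) + e^{-\lambda}\sum_{k=0}^{\infty}\frac{\lambda^k \log k!}{k!}. $$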

Keywords: amount of information, discrete random variables, continuous random variables, entropy, identification of the distribution law.

UDC: 519.722/.723:519.224

Received: 21.08.2014
Revised: 10.10.2014


