Abstract:
We discuss the meanings of the term “entropy” as used in different fields (statistical physics, information theory, the theory of dynamical systems) and the application of this notion as a characteristic of the independence and uniformity of distributions of finite-alphabet symbols. The main purpose is to use this example to show that a correct interpretation of the results of statistical tests must take into account the a priori hypotheses actually made about the probability models of the analyzed sequences. An example is given of sequences that have almost the maximal possible entropy yet are unsatisfactory from the cryptographic viewpoint.
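The gap between high entropy and cryptographic quality can be illustrated with a minimal sketch (this is an illustrative example, not the specific construction from the paper): a perfectly periodic binary sequence attains the maximal single-symbol empirical entropy of 1 bit per symbol, the same value as a truly random sequence, while being completely predictable.

```python
from collections import Counter
from math import log2

def empirical_entropy(seq):
    """Shannon entropy (bits/symbol) of the single-symbol frequency distribution."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

# A periodic sequence over the alphabet {0, 1}: each symbol occurs with
# frequency 1/2, so the first-order empirical entropy is exactly 1 bit,
# yet every symbol is determined by its predecessor.
periodic = "01" * 500
print(empirical_entropy(periodic))
```

A test based only on first-order symbol frequencies would accept this sequence as "random"; detecting its defect requires a probability model that accounts for dependencies between symbols, which is the point the abstract makes about stating one's a priori hypotheses.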
Key words: entropy, uniformity of distributions, independence, interpretation of the results of statistical tests.