Abstract:
Based on the Jaynes principle of maximum informational entropy, we find a generalized probability distribution and construct a generalized equilibrium statistical mechanics (ESM) for a wide class of objects to which the usual (canonical) ESM is inapplicable. Throughout, we consider the case of a continuous, rather than discrete, random variable characterizing the state of the object. For large values of the argument, the resulting distribution exhibits power-law, rather than exponential, asymptotic behavior, and the corresponding power-law asymptotic form agrees with the empirical laws established for such objects. As the original entropy functional, we use the $\varepsilon$-deformed Boltzmann–Gibbs–Shannon functional, which satisfies the requirements of the entropy axiomatics and leads to the canonical ESM for $\varepsilon=0$. We also consider nonlinear transformations of this functional. We show that, depending on how the averages of the dynamical characteristics of the object are defined, different versions of the generalized ESM (Tsallis, Rényi, and Hardy–Littlewood–Pólya) can be used, and we give a comparative analysis of them. We find conditions under which the Gibbs–Helmholtz thermodynamic relations hold and the Legendre transformation can be applied to the generalized entropy and the Massieu–Planck function. We consider the Tsallis and Rényi versions of the ESM in detail for a one-dimensional probabilistic object with a single dynamical characteristic whose role is played by a generalized positive “energy” with monotonic power-law growth. We obtain constraints on the Rényi index under which the equilibrium distribution belongs to a definite class of stable Gaussian or Lévy–Khinchin distributions.
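As a minimal sketch of the maximum-entropy step behind the power-law tail, assuming the standard Tsallis functional with index $q$ and an illustrative generalized “energy” $u(x)\sim x^{\gamma}$ (these symbols are assumptions and need not coincide with the paper's own notation): maximizing
\[
S_q[p] \;=\; \frac{1}{q-1}\Bigl(1 - \int p(x)^{q}\,dx\Bigr)
\]
subject to the normalization $\int p(x)\,dx = 1$ and a fixed mean “energy” yields the $q$-exponential distribution
\[
p(x) \;\propto\; \bigl[\,1 - (1-q)\,\beta\,u(x)\,\bigr]^{1/(1-q)}
\;\sim\; x^{-\gamma/(q-1)}, \qquad x \to \infty,\ q > 1,
\]
which recovers the exponential (canonical) distribution $p(x)\propto e^{-\beta u(x)}$ in the limit $q \to 1$.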