Computation of Generalized (h, Φ)-Entropies for Denumerable Markov Chains
Abstract
Shannon [1948] adapted to information theory the concept of entropy introduced by Boltzmann and Gibbs in the nineteenth century. Since then, many generalized entropies have been defined to suit different fields. With this in mind, this paper proposes some new (h, Φ)-entropies of random sequences, especially Markov chains, taking values in countable state spaces, either finite or denumerable. The entropy of the stationary distribution of a Markov chain is the (asymptotic) entropy of the chain at equilibrium; if this distribution is taken as the initial distribution of the chain, its entropy is also the marginal entropy of the chain. In both cases, the entropy rate is more representative of the whole trajectory of the sequence. Having the marginal entropy and entropy rate of Markov chains in explicit form allows one to use them efficiently in all applications involving Markov modeling. When only observations of the chain are available, the need for estimation naturally arises. The case of countable parametric chains, for which the transition probabilities are functions of a finite set of parameters, is considered. Since the entropy is then an explicit function of the transition probabilities, and hence of the parameters, plug-in estimators of the marginal entropy and entropy rate can be obtained by replacing the parameters with their maximum likelihood estimators (MLEs). This opens scope for further research: computing and estimating the generalized entropy rates associated with these (h, Φ)-entropies. Finally, in the last part of the paper, ten new functions are computed from classical entropies, which can be further adapted to entropies of denumerable Markov chains.
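To make these notions concrete, the following is a minimal numerical sketch, not taken from the paper, of how an (h, Φ)-entropy in the sense of Salicrú et al. [1993], i.e. h(Σ_x Φ(p(x))), can be evaluated for the stationary distribution of a finite Markov chain, together with the classical Shannon entropy rate. Python with NumPy is assumed; the function names and the two-state transition matrix are illustrative choices, not part of the paper.

```python
import numpy as np

def stationary_distribution(P):
    """Stationary distribution pi of a stochastic matrix P, i.e. pi P = pi."""
    vals, vecs = np.linalg.eig(P.T)
    idx = np.argmin(np.abs(vals - 1.0))   # left eigenvector for eigenvalue 1
    pi = np.real(vecs[:, idx])
    return pi / pi.sum()                  # normalize (also fixes the sign)

def h_phi_entropy(p, h, phi):
    """(h, Phi)-entropy of a probability vector p: h(sum_x Phi(p(x)))."""
    return h(np.sum(phi(p)))

def shannon_entropy_rate(P):
    """Shannon entropy rate -sum_i pi_i sum_j P_ij log P_ij, with 0 log 0 := 0."""
    pi = stationary_distribution(P)
    with np.errstate(divide="ignore"):
        logP = np.where(P > 0.0, np.log(P), 0.0)
    return float(-np.sum(pi[:, None] * P * logP))

# Shannon entropy as an (h, Phi)-entropy: h(x) = x, Phi(t) = -t log t.
h_id = lambda x: x
phi_shannon = lambda t: np.where(t > 0.0, -t * np.log(t), 0.0)

# Rényi entropy of order q != 1: h(x) = log(x) / (1 - q), Phi(t) = t**q.
def renyi_pair(q):
    return (lambda x: np.log(x) / (1.0 - q)), (lambda t: t ** q)

# Illustrative two-state chain; the numbers are arbitrary.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = stationary_distribution(P)

print("marginal Shannon entropy:", h_phi_entropy(pi, h_id, phi_shannon))
h_q, phi_q = renyi_pair(2.0)
print("marginal Renyi-2 entropy:", h_phi_entropy(pi, h_q, phi_q))
print("Shannon entropy rate:    ", shannon_entropy_rate(P))
```

For Rényi entropies of Markov sources, Rached, Alajaji and Campbell [1999] express the entropy rate through the largest eigenvalue of the matrix with entries P(i, j)^q; plugging maximum likelihood estimates of the transition probabilities into such closed forms yields the plug-in estimators mentioned above.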
Keywords
Markov Chains, Entropy Rate, Rényi's Entropy, Guiasu's Entropy, Utility Function.
References
- Ciuperca, G., Girardin, V. and Lhote, L. [2011]: “Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains.” IEEE Trans. Inf. Theory, vol. 57, no. 7, pp. 4026–4034.
- Cover, T. M. and Thomas, J. A. [1991]: “Elements of Information Theory.” New York: Wiley, Wiley Series in Telecommunications.
- Furuichi, S. [2006]: “Information theoretical properties of Tsallis entropies.” J. Math. Phys., vol. 47.
- Girardin, V. [2005]: “On the different extensions of the ergodic theorem of information theory.” In Recent Adv. Appl. Probab., R. Baeza-Yates, J. Glaz, H. Gzyl, J. Hüsler and J. L. Palacios, Eds. San Francisco: Springer-Verlag, pp. 163–179.
- Rached, Z., Alajaji, F. and Campbell, L. L. [1999]: “Rényi’s entropy rate for discrete Markov sources.” In Proc. CISS, pp. 613–618.
- Rényi, A. [1960]: “On measures of information and entropy.” In Proc. 4th Berkeley Symp. on Mathematical Statistics and Probability, pp. 547–561.
- Salicrú, M., Menéndez, M. L., Morales, D. and Pardo, L. [1993]: “Asymptotic distribution of (h,φ)-entropies.” Comm. Statist. (Theory and Methods).
- Shannon, C. E. [1948]: “A mathematical theory of communication.” Bell Syst. Tech. J., vol. 27, pp. 379–423.
- Sharma, B. D. and Mittal, P. [1975]: “New non-additive measures of relative information.” J. Comb. Inf. Syst. Sci., vol. 2, pp. 122–133.
- Tsallis, C. [1988]: “Possible generalization of Boltzmann-Gibbs statistics.” J. Stat. Phys., vol. 52, pp. 479–487.