Research Article
Open access
Published on 27 August 2024

An overview of the development and applications of information entropy

Yiqi Huang 1,*
  • 1 WLSA

* Author to whom correspondence should be addressed.

https://doi.org/10.54254/2753-8818/50/20240663

Abstract

As information entropy has become central to the development of modern information theory, it exerts a growing influence over many research areas and over technological innovation. This paper aims to clear up common confusion about the development of entropy theory and to give a brief overview of its origins, covering Shannon's original proposal, variants such as relative entropy and conditional entropy, and entropy measures proposed by other scientists, such as Rényi entropy and Tsallis entropy. The paper also surveys current applications of entropy, identifies research hotspots, and predicts future trends in the development of entropy. By presenting the development of information entropy coherently and consistently, this review helps readers better understand the concept of entropy and its derivation. At the same time, by highlighting hotspots in entropy-related fields of study, it hopes to attract more researchers to these fields and thereby boost technological development.
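
For orientation, a minimal sketch of the standard textbook definitions of the quantities named above, stated for a discrete random variable X with probability mass function p(x) (the logarithm base is a matter of convention; base 2 gives bits):

Shannon entropy [1]:
H(X) = -\sum_{x} p(x) \log p(x)

Relative entropy (Kullback–Leibler divergence) between distributions P and Q [3]:
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}

Conditional entropy of Y given X [4]:
H(Y \mid X) = -\sum_{x,y} p(x,y) \log p(y \mid x)

Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1) [5]:
H_{\alpha}(X) = \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha}

Tsallis entropy of index q (q \neq 1, with the constant k set to 1) [6]:
S_{q}(X) = \frac{1}{q-1} \left( 1 - \sum_{x} p(x)^{q} \right)

In the limits \alpha \to 1 and q \to 1, the Rényi and Tsallis entropies both reduce to the Shannon entropy.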

Keywords

Information entropy, Shannon entropy, Information theory

References

[1]. Shannon, C. E. (1948) A Mathematical Theory of Communication. Bell System Technical Journal, 27(3): 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x.

[2]. Pathria, R. K.; Beale, P. (2011) Statistical Mechanics (Third ed.). Academic Press, p. 51.

[3]. Kullback, S.; Leibler, R. A. (1951) On information and sufficiency. Annals of Mathematical Statistics, 22(1): 79–86. doi:10.1214/aoms/1177729694.

[4]. Cover, T. M.; Thomas, J. A. (2006) Elements of Information Theory. Hoboken, New Jersey: Wiley.

[5]. Rényi, A. (1961) On measures of information and entropy. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability 1960, pp. 547–561.

[6]. Tsallis, C. (1988) Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics, 52(1–2): 479–487. doi:10.1007/BF01016429.

[7]. Vazirani, U.; Vidick, T. (2014) Fully Device-Independent Quantum Key Distribution. Physical Review Letters, 113(14): 140501. doi:10.1103/physrevlett.113.140501.

[8]. Aczél, J.; Forte, B.; Ng, C. T. (1974) Why the Shannon and Hartley entropies are 'natural'. Advances in Applied Probability, 6(1): 131–146. doi:10.2307/1426210.

[9]. Pincus, S. M.; Gladstone, I. M.; Ehrenkranz, R. A. (1991) A regularity statistic for medical data analysis. Journal of Clinical Monitoring and Computing, 7(4): 335–345. doi:10.1007/BF01619355.

[10]. Huffman, D. (1952) A Method for the Construction of Minimum-Redundancy Codes. Proceedings of the IRE, 40(9): 1098–1101. doi:10.1109/JRPROC.1952.273898.

[11]. MacKay, D. J. C. (2003) Chapter 6: Stream Codes. In: Information Theory, Inference, and Learning Algorithms. Cambridge University Press.

[12]. Gorenstein, D.; Zierler, N. (1961) A class of cyclic linear error-correcting codes in p^m symbols. Journal of the Society for Industrial and Applied Mathematics, 9(2): 207–214. doi:10.1137/0109020.

[13]. Benedetto, S.; Montorsi, G. (1995) Role of recursive convolutional codes in turbo codes. Electronics Letters, 31(11): 858–859.

[14]. Sarangi, S.; Sahidullah, M.; Saha, G. (2020) Optimization of data-driven filterbank for automatic speaker verification. Digital Signal Processing, 104: 102795. arXiv:2007.10729.

[15]. Alberts, B.; Johnson, A.; Lewis, J.; Raff, M.; Roberts, K.; Walter, P. (2002) The Shape and Structure of Proteins. In: Molecular Biology of the Cell, Fourth Edition. New York and London: Garland Science.

[16]. Spellerberg, I. F.; Fedor, P. J. (2003) A tribute to Claude Shannon (1916–2001) and a plea for more rigorous use of species richness, species diversity and the "Shannon–Wiener" Index. Global Ecology and Biogeography, 12(3): 177–179.

[17]. Helstrom, C. W. (1976) Quantum Detection and Estimation Theory. New York: Academic Press. ISBN 978-0-12-340050-5.

[18]. Rubinstein, R. Y.; Kroese, D. P. (2013) The Cross-Entropy Method: A Unified Approach to Combinatorial Optimization, Monte-Carlo Simulation and Machine Learning. Springer Science & Business Media.

Cite this article

Huang, Y. (2024). An overview of the development and applications of information entropy. Theoretical and Natural Science, 50, 52-57.

Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

Disclaimer/Publisher's Note

The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of EWA Publishing and/or the editor(s). EWA Publishing and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

About volume

Volume title: Proceedings of the 2nd International Conference on Mathematical Physics and Computational Simulation

Conference website: https://www.confmpcs.org/
ISBN: 978-1-83558-613-6 (Print) / 978-1-83558-614-3 (Online)
Conference date: 9 August 2024
Editor: Anil Fernando
Series: Theoretical and Natural Science
Volume number: Vol. 50
ISSN: 2753-8818 (Print) / 2753-8826 (Online)

© 2024 by the author(s). Licensee EWA Publishing, Oxford, UK. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license. Authors who publish in this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open access policy for details).