Shannon entropy


[After Claude Elwood Shannon.]
References in periodicals archive
Although some non-linear measures used in autonomic modulation (AM) analysis are suitable for short-term series, they usually depend on long-term data series, as is the case for Shannon entropy (5) and correlation dimension (6).
[60, 61, 67] proposed that Rényi entropies may also be used as a tool for studying dynamical systems and are closely related to the thermodynamic entropy of the system, the Shannon entropy.
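As context for that relationship, the standard definition of the Rényi entropy and its Shannon limit (a textbook identity, not drawn from [60, 61, 67]) is:

```latex
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1) of a
% discrete distribution p = (p_1, \dots, p_n):
H_\alpha(p) = \frac{1}{1 - \alpha} \log \sum_{i=1}^{n} p_i^{\alpha}
% The limit \alpha \to 1 recovers the Shannon entropy:
\lim_{\alpha \to 1} H_\alpha(p) = -\sum_{i=1}^{n} p_i \log p_i
```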
The Shannon entropy [21] of a random variable X is denoted $\eta_X = E(-\log f(X))$, where f is the density of X.
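A minimal computational sketch of this definition, assuming a discrete distribution and natural logarithms (the function name shannon_entropy is ours, not from [21]):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy E(-log f(X)) of a discrete distribution, in nats.

    `probs` is a sequence of probabilities summing to 1; zero-probability
    outcomes contribute nothing (the limit of -p log p as p -> 0 is 0).
    """
    return -sum(p * math.log(p) for p in probs if p > 0)

# A fair coin has entropy log 2 ≈ 0.693 nats (i.e. 1 bit).
print(shannon_entropy([0.5, 0.5]))
```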
For instance, Shannon entropy (SE) is commonly used for biomedical complexity analysis of EEG and ECG recordings [3-5].
The decrease in Shannon entropy is attributed to an increase in the information content and order and, equivalently, to a decrease in complexity.
[11] used a theoretical method based on the maximum Shannon entropy framework to study the finite-buffer system; the advantage of the method is that it yields an analytical, closed-form, generalized expression for the probability distribution of queue size in a finite-buffer system.
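As an illustrative sketch only (not the closed-form method of [11]): maximizing Shannon entropy over queue-size distributions on {0, ..., N} subject to a prescribed mean yields the truncated geometric form p_n ∝ x^n, with x fixed by the mean constraint. The helper below, including its name maxent_queue_dist, is hypothetical.

```python
def maxent_queue_dist(buffer_size, mean_queue, tol=1e-12):
    """Maximum-entropy queue-size distribution p_n ∝ x**n on
    n = 0..buffer_size with a prescribed mean (0 < mean < buffer_size),
    found by bisection on x. Illustrative sketch only."""
    N = buffer_size

    def mean_for(x):
        weights = [x**n for n in range(N + 1)]
        z = sum(weights)
        return sum(n * w for n, w in enumerate(weights)) / z

    # mean_for(x) increases monotonically from 0 (x -> 0) to N (x -> inf),
    # so bisection on x pins down the distribution.
    lo, hi = 1e-9, 1e9
    while hi - lo > tol * max(1.0, hi):
        mid = (lo + hi) / 2
        if mean_for(mid) < mean_queue:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    weights = [x**n for n in range(N + 1)]
    z = sum(weights)
    return [w / z for w in weights]

# Example: buffer of size 10 with mean queue length 3.
print(maxent_queue_dist(10, 3.0))
```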
Information entropy, also known as Shannon entropy, was proposed by Shannon to solve the problem of quantitatively measuring information.
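The standard discrete form of that measure, in bits, is:

```latex
% Shannon entropy of a discrete random variable X with
% probability mass function p(x), measured in bits:
H(X) = -\sum_{x} p(x) \log_2 p(x)
```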
Shannon entropy has many applications in the theory of communications.
The suggestion that entropy gradients, in the sense of "Shannon entropy" (Shannon, 1948), could be correlated with the strength of psi is a fascinating, testable hypothesis.
Urban expansion assessment by using remotely sensed data and the relative Shannon entropy model in GIS: a case study of Tripoli, Libya, Theoretical and Empirical Researches in Urban Management 10(1): 55-71.
The first attempt to quantify fuzziness was made in 1968 by Zadeh [22], who introduced a probabilistic framework and defined the entropy of a fuzzy event as a weighted Shannon entropy. The maximum-entropy principle initiated by Jaynes [9] is a powerful optimization technique for determining the distribution of a random system when only partial or incomplete information or data about the system is available.
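Zadeh's definition is usually rendered as a membership-weighted Shannon sum; the following is our sketch of that standard form, not a quotation from [22]:

```latex
% Entropy of a fuzzy event A with membership function \mu_A, over
% outcomes x_i with probabilities p_i: each Shannon term is
% weighted by the degree of membership of x_i in A.
H(A) = -\sum_{i=1}^{n} \mu_A(x_i)\, p_i \log p_i
```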
Wolf (2005) noted that Shannon entropy is a valuable and important measure in data processing, for example in data compression or randomness extraction, under the assumption, which can ordinarily safely be made in communication theory, that a particular random experiment is independently repeated many times.
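The role of that independence assumption is captured by the asymptotic equipartition property, a standard result stated here for context:

```latex
% Asymptotic equipartition property: for X_1, X_2, \ldots i.i.d.
% with distribution p and entropy H(X),
-\frac{1}{n} \log p(X_1, \ldots, X_n) \;\to\; H(X)
% in probability as n \to \infty, which is why H(X) gives the
% optimal per-symbol rate for compressing long i.i.d. sources.
```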