information entropy

(redirected from Shannon information)

information entropy

n.
In information theory, a mathematical measure of the degree of randomness in a set of data, with greater randomness implying higher entropy and greater predictability implying lower entropy. Also called Shannon entropy.
American Heritage® Dictionary of the English Language, Fifth Edition. Copyright © 2016 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
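As a quick numerical illustration of the definition (a minimal sketch, not part of the dictionary entry; the function name is ours), the standard formula H = -Σ p log2 p assigns higher entropy to more random data and lower entropy to more predictable data:

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin, maximal randomness: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))  # biased, more predictable coin: ~0.47 bits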
References in periodicals archive
The observed number of alleles ranged from 2 to 9, the expected heterozygosity varied from 0.24 to 0.86, and the Shannon information index ranged from 0.41 to 2.07, the extremes corresponding respectively to marker Borb04 in Multiplex 2 and marker Borb15 in Multiplex 4.
The results were used to estimate the level of heterozygosity and allelic diversity, and to calculate, for all microsatellite markers, the allelic frequencies, the mean observed and expected heterozygosity, the fixation index, gene flow, and the Shannon information index, using the software GenAlEx.
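For context on the statistic these two excerpts report: the Shannon information index produced by population-genetics software such as GenAlEx is conventionally computed from allele frequencies as I = -Σ p_i ln p_i. A minimal sketch under that assumption (the function name and example frequencies are ours):

    import math

    def shannon_index(allele_freqs):
        # Shannon information index: I = -sum(p * ln(p)) over allele frequencies.
        return -sum(p * math.log(p) for p in allele_freqs if p > 0)

    print(round(shannon_index([0.8, 0.2]), 2))  # two uneven alleles: 0.5
    print(round(shannon_index([0.2] * 5), 2))   # five equally common alleles: 1.61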
Despite Dennett's mostly straightforward writing style (and repeated folksy use of "sorta" instead of "sort of"), there are numerous pages such as those about "Shannon information vs.
The authors make it clear that they are not limiting themselves to Shannon information, which Claude Shannon developed with a focus on communication.
In this paper, based on Shannon information theory [6], we study several well-known games of nonblind confrontation: "rock-paper-scissors" [7], "coin tossing" [8], "palm or back," "draw boxing," and "finger guessing" [9], from a novel point of view.
The argument, recently expanded by Haken and Portugali [24], shows that Shannon information participates in semantic information and vice versa: as the mind relates to the environment by deflating or inflating information, variations in Shannon information entail different meanings.
Our quantitative interpretation of information removes ambiguities from the relation between Shannon information and semantic information, as different orientations, actions, and types of activity places are represented by numerical values.
Shannon information entropy H of node i can be obtained by
Then the Shannon information entropy H of set K can be written as
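Both excerpts break off where a displayed equation stood in the source papers. The discrete Shannon entropy they instantiate has the standard form (the node- and set-specific probabilities p_i are defined in the cited papers and are not recoverable here):

    H = -\sum_{i=1}^{n} p_i \log_2 p_i

where the base-2 logarithm gives H in bits and the natural logarithm gives nats.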
This can be illustrated by embellishing an example used by Wicken (1987) to show how Shannon information depends upon the existence of alternatives.
Hence, by choosing an equiprobable (or uniform) distribution for normalization, we are counting the true diversity, that is, the number of equiprobable types that are required to match the same amount of Shannon information H as the given distribution.
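The quantity described here is the familiar "true diversity" (the exponential of the entropy): a distribution with entropy H in nats carries the same Shannon information as exp(H) equiprobable types. A minimal sketch (the function name is ours):

    import math

    def true_diversity(probs):
        # Effective number of equiprobable types: exp(H), with H in nats.
        h = -sum(p * math.log(p) for p in probs if p > 0)
        return math.exp(h)

    print(true_diversity([0.25] * 4))            # uniform over 4 types: exactly 4.0
    print(true_diversity([0.7, 0.1, 0.1, 0.1]))  # skewed over 4 types: ~2.56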
Given a probability density p(x), C_c measures the diversity of the distribution up to a cumulative probability c by computing the length, or support, of an equivalent uniform distribution that has the same Shannon information as the conditional distribution [??]_c(x) up to cumulative probability c.
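Reading this excerpt as described, C_c appears to be the support length of a uniform distribution whose differential entropy matches that of the density truncated at cumulative probability c (a uniform density on an interval of length L has differential entropy ln L). A speculative sketch under that reading, on a discretized density (all names and the example are ours, not the cited paper's method):

    import numpy as np

    def c_diversity(x, p, c):
        # Truncate density p(x) at cumulative probability c, renormalize,
        # and return exp(h): the length of the uniform distribution with
        # the same differential entropy h (in nats).
        dx = x[1] - x[0]                    # uniform grid spacing assumed
        mask = np.cumsum(p) * dx <= c       # keep support up to cumulative c
        q = p[mask] / (p[mask].sum() * dx)  # renormalized conditional density
        h = -np.sum(q * np.log(q) * dx)     # differential entropy in nats
        return np.exp(h)

    x = np.linspace(-6, 6, 4001)
    p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal density
    print(c_diversity(x, p, 0.5))                # lower half of the normal: ~2.07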