information entropy


n.
In information theory, a mathematical measure of the degree of randomness in a set of data, with greater randomness implying higher entropy and greater predictability implying lower entropy. Also called Shannon entropy.
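
The definition lends itself to a short illustration. Below is a minimal sketch in Python, not part of the dictionary entry, that computes the entropy (in bits) of a symbol sequence from observed frequencies; the function name shannon_entropy and the example strings are assumptions for illustration.

    # Minimal sketch: Shannon entropy H = -sum(p * log2(p)) of a discrete
    # distribution estimated from symbol frequencies.
    import math
    from collections import Counter

    def shannon_entropy(data):
        """Return the Shannon entropy (in bits) of a sequence of symbols."""
        counts = Counter(data)
        total = len(data)
        return -sum((n / total) * math.log2(n / total) for n in counts.values())

    # Random-looking data scores higher than predictable data.
    print(shannon_entropy("abcdabcdabcdabcd"))  # 2.0 bits per symbol
    print(shannon_entropy("aaaaaaaaaaaaaaab"))  # ~0.34 bits per symbol
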
References in periodicals archive
Our study used information entropy to measure the uncertainty of RFID data streams and analyzed the features that could influence the cleaning effect.
According to Shannon's information entropy theory, the information entropy is defined as $S = -k \sum_{i=1}^{N} p_i \ln p_i$.
Many scholars have tried to tackle this problem with various techniques, such as the Information Entropy Weight method (El-Santawy, 2012), the ordered weighted averaging (OWA) operator, and several other methods (El-Santawy and Ahmed, 2013).
Using this mathematical connection between thermodynamic entropy and information entropy, Vieland and colleagues show how to treat statistical data as a gas governed by the laws of thermodynamics.
This parameter can be assessed by applying information entropy to evaluate the accuracy of the raster scale.
This general formulation of the concept is known as information entropy.
The unrestricted use of the Internet leads to what Floridi calls information entropy (Floridi, 2002).
The paper demonstrates that information entropy provides an objective measure of project uncertainty.
The information entropy for a particular experimental condition with a set of M possible outcomes is $H = -\sum_{i=1}^{M} p_i \log p_i$, where $p_i$ is the probability of outcome $i$.
The experiments show that the resulting image quality is comparable with that of popular algorithms, and the fused image can even achieve higher information entropy and contrast.
An Information Entropy Weighting (IEW) method is introduced for weighting the selection criteria.
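
Several of the snippets above mention entropy-based weighting of criteria. The following is a minimal sketch of the generic entropy weight method used in multi-criteria decision making, not the specific IEW procedure of the cited papers; the function name entropy_weights and the sample matrix are assumptions.

    # Minimal sketch of the entropy weight method (MCDM): criteria whose
    # values vary more across alternatives carry more information and
    # therefore receive larger weights.
    import math

    def entropy_weights(X):
        """Derive objective criterion weights from information entropy.

        X is a decision matrix with m alternatives (rows) and n criteria
        (columns); entries are assumed non-negative, with each column
        containing at least one positive value.
        """
        m, n = len(X), len(X[0])
        diversification = []
        for j in range(n):
            col = [X[i][j] for i in range(m)]
            total = sum(col)
            p = [x / total for x in col]  # share of each alternative
            # Entropy of criterion j, normalized to [0, 1] by 1 / ln(m)
            e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
            diversification.append(1 - e)
        s = sum(diversification)
        return [d / s for d in diversification]

    # Hypothetical matrix: 3 alternatives scored on 2 criteria.
    X = [[7.0, 0.2], [5.0, 0.9], [6.0, 0.4]]
    print(entropy_weights(X))  # the more dispersed criterion gets more weight
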
