information entropy

n.
In information theory, a mathematical measure of the degree of randomness in a set of data, with greater randomness implying higher entropy and greater predictability implying lower entropy. Also called Shannon entropy.
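The measure is given by Shannon's formula H = -Σ p(x) log2 p(x), summed over the possible values x, where p(x) is the probability of each value. As a minimal sketch of the idea in Python (the function name and sample strings here are illustrative assumptions, not part of the entry):

    from collections import Counter
    import math

    def shannon_entropy(data):
        """Shannon entropy, in bits, of the empirical distribution of `data`."""
        counts = Counter(data)
        total = len(data)
        # H = -sum(p * log2(p)) over each observed symbol's relative frequency p
        return sum(-(n / total) * math.log2(n / total) for n in counts.values())

    # Greater randomness implies higher entropy; predictability implies lower entropy.
    print(shannon_entropy("abcdefgh"))  # 3.0 bits: eight equally likely symbols
    print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: one symbol, fully predictable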