entropy



en·tro·py

 (ĕn′trə-pē)
n. pl. en·tro·pies
1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.

[German Entropie : Greek en-, in; see en-2 + Greek tropē, transformation; see trep- in Indo-European roots.]

en·tro′pic (ĕn-trō′pĭk, -trŏp′ĭk) adj.
en·tro′pi·cal·ly adv.

entropy

(ˈɛntrəpɪ)
n, pl -pies
1. (General Physics) a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin. Symbol: S. See also law of thermodynamics
2. (General Physics) a statistical measure of the disorder of a closed system, expressed by S = k log P + c, where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant (a worked form of both senses follows this entry)
3. lack of pattern or organization; disorder
4. (Communications & Information) a measure of the efficiency of a system, such as a code or language, in transmitting information
[C19: from en-2 + -trope]
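
For concreteness, the two senses above can be written out explicitly. This is a minimal sketch in standard textbook notation rather than the dictionary's own wording; W here denotes the number of accessible microstates, an assumption of the common statistical formulation:

\Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T} \qquad \text{(sense 1, measured in } \mathrm{J\,K^{-1}}\text{)}

S = k \ln W \qquad \text{(sense 2 in its usual statistical form; the entry above writes it as } S = k \log P + c\text{)}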

en•tro•py

(ˈɛn trə pi)

n.
1. a function of thermodynamic variables, as temperature or pressure, that is a measure of the energy that is not available for work in a thermodynamic process. Symbol: S
2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal.
3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature.
4. a state of disorder, as in a social system, or a hypothetical tendency toward such a state.
[< German Entropie (1865); see en-2, -tropy]
en•tro•pic (ɛnˈtroʊ pɪk, -ˈtrɒp ɪk) adj.
en•tro′pi•cal•ly, adv.

en·tro·py

(ĕn′trə-pē)
A measure of the amount of disorder in a system. Entropy increases as the system's temperature increases. For example, when an ice cube melts into liquid water, the rigid network of bonds that held the molecules in the ice crystal is broken, and the arrangement of the water molecules becomes more random, or disordered, than it was in the ice cube.
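
To make the melting-ice example quantitative, here is a minimal sketch of the calculation (the 10 g mass is made up for illustration; the latent heat of fusion of water, about 334 J/g, and the melting point, 273.15 K, are standard textbook values). Because melting happens at constant temperature, the entropy gained is the heat absorbed divided by that temperature.

# Hedged sketch: entropy change of melting ice, delta_S = Q / T at constant T.
LATENT_HEAT_FUSION_J_PER_G = 334.0   # heat absorbed per gram of ice melted (textbook value)
MELTING_POINT_K = 273.15             # ice melts at a constant 273.15 K

def entropy_of_melting(mass_g: float) -> float:
    """Return the entropy increase, in J/K, when mass_g of ice melts reversibly."""
    heat_absorbed_j = mass_g * LATENT_HEAT_FUSION_J_PER_G
    return heat_absorbed_j / MELTING_POINT_K

print(f"10 g ice cube: dS = {entropy_of_melting(10.0):.2f} J/K")  # about 12.23 J/K

The positive result reflects the definition above: the liquid's molecular arrangement is more disordered than the crystal's, so melting raises the entropy of the water.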
Thesaurus
Noun 1. entropy - (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information" (a short numerical sketch follows this list)
communication theory, communications - the discipline that studies the principles of transmitting information and the methods by which it is delivered (as print or radio or television etc.); "communications is his major field of study"
information measure - a system of measurement of information based on the probabilities of the events that convey information
2. entropy - (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
physical property - any property used to characterize matter and energy and their interactions
conformational entropy - entropy calculated from the probability that a state could be reached by chance alone
thermodynamics - the branch of physics concerned with the conversion of different forms of energy
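
Sense 1 above, the communication-theory measure, is the Shannon entropy H = -Σ p·log2 p, reported in bits; the short sketch below (with a made-up example distribution) shows the standard calculation.

import math

def shannon_entropy_bits(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin is maximally uncertain (1 bit per toss); a biased coin carries less.
print(shannon_entropy_bits([0.5, 0.5]))  # 1.0
print(shannon_entropy_bits([0.9, 0.1]))  # about 0.469

The fewer bits of entropy a source has, the more predictable, and the more compressible, its messages are, which is what the information-measure sense of entropy captures.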
Translations
entropie
entropia

entropy

[ˈentrəpɪ] N entropía f

entropy

[ˈɛntrəpi] n entropie f

entropy

n Entropie f

entropy

[ˈɛntrəpɪ] n entropia

en·tro·py

n. entropía, a decrease in the capacity to convert energy into work.
References in periodicals archive
Finthammer develops, implements, evaluates, and improves the very first algorithms tailor-made for solving the maximum entropy optimization problem under aggregating semantics.
Keywords: entropy measure, distance, similarity measure, interval valued intuitionistic fuzzy sets
This study aimed to determine hysteresis, enthalpy, entropy, enthalpy-entropy compensation theory and Gibbs free energy related to water adsorption and desorption in 'Malagueta' pepper seeds.
Even though the method of [5] performs well, the use of finite entropy values for fault identification narrows the scope of the approach.
Strange, surreal, and symbolic, Aaron Costain's graphic novel, Entropy, probes some of the deepest questions about creation with a funny, absurd sensibility that makes it all go down smoothly.
Clausius coined the term entropy in 1865 and explained his preference for the word by its etymology.
A part of theories of human consciousness, the concept of entropy has become a greater research focus with recent improvements in the ability of functional magnetic resonance imaging (fMRI) to track chemical activity patterns in the brain.
Different graph invariances have been used to develop image entropy measures such as eigenvalue and connectivity information [20], distance-based graph entropy [21], and degree-based graph entropy [22].
In the past few years, the distance, similarity, inclusion, and information entropy measures for IVIFSs were very important topics.
Fu, "Microstructure and compressive properties of AlCrFeCoNi high entropy alloy," Materials Science and Engineering: A, vol.
In the present work, we examine the validity of GSLT by assuming various forms of entropy on apparent and event horizons.
Entropy (3) was first characterized by Havrda and Charvat [1].