A principle in quantum mechanics holding that greater accuracy of measurement for one observable quantity entails less accuracy of measurement for another conjugate quantity.
American Heritage® Dictionary of the English Language, Fifth Edition. Copyright © 2016 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
(General Physics) the principle that certain pairs of observables of a quantum-mechanical system, such as energy and time or position and momentum, cannot both be measured simultaneously to arbitrary accuracy: the product of their uncertainties is always greater than, or of the order of, h, where h is the Planck constant. Also known as: Heisenberg uncertainty principle or indeterminacy principle
Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014
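The "product of their uncertainties" statement above is usually written today in the sharper Kennard form, with ħ (the reduced Planck constant) in place of the dictionary's order-of-h bound:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2},
\qquad
\hbar = \frac{h}{2\pi}
```

The position–momentum inequality is a rigorous theorem of quantum mechanics; the energy–time relation is heuristic, since time is a parameter rather than an observable, which is why it is given here with ≳ rather than ≥.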
the quantum-mechanical principle, formulated by Heisenberg, that measuring either of two related quantities, as position and momentum or energy and time, produces uncertainty in measurement of the other.
Random House Kernerman Webster's College Dictionary, © 2010 K Dictionaries Ltd. Copyright 2005, 1997, 1991 by Random House, Inc. All rights reserved.
A principle in quantum mechanics stating that it is impossible to measure both the position and the momentum of very small particles (such as electrons) at the same time with arbitrary accuracy. According to this principle, the more accurately the position of a small particle is known, the less accurately its momentum (and hence its velocity) can be known, and the more accurately its momentum is known, the less accurately its position can be known. The uncertainty principle and the theory of relativity form the basis of modern physics.
The American Heritage® Student Science Dictionary, Second Edition. Copyright © 2014 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
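The trade-off described above can be made concrete with a small numerical sketch. Assuming the modern Kennard form Δx·Δp ≥ ħ/2 (the dictionary entries state only an order-of-h bound), confining an electron to roughly one atomic diameter already forces a velocity uncertainty of hundreds of kilometres per second:

```python
# Numerical illustration of the position-momentum uncertainty bound.
# Assumed form (Kennard inequality): delta_x * delta_p >= hbar / 2.

HBAR = 1.054571817e-34        # reduced Planck constant, J*s (CODATA)
ELECTRON_MASS = 9.1093837015e-31  # electron rest mass, kg (CODATA)

def min_momentum_uncertainty(delta_x):
    """Smallest momentum uncertainty (kg*m/s) compatible with a
    position uncertainty delta_x (m): delta_p >= hbar / (2 * delta_x)."""
    return HBAR / (2.0 * delta_x)

# Confine an electron to about one atomic diameter (~1e-10 m):
dx = 1e-10
dp = min_momentum_uncertainty(dx)
dv = dp / ELECTRON_MASS  # corresponding minimum velocity uncertainty

print(f"delta_p >= {dp:.3e} kg*m/s")
print(f"delta_v >= {dv:.3e} m/s")
```

The resulting Δv is on the order of 10^6 m/s, which is why electrons in atoms cannot be pictured as particles with sharp, simultaneous positions and velocities.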
Noun 1. uncertainty principle - (quantum theory) the theory that it is impossible to measure both energy and time (or position and momentum) completely accurately at the same time
scientific theory - a theory that explains scientific observations; "scientific theories must be falsifiable"
quantum theory - (physics) a physical theory that certain properties occur only in discrete amounts (quanta)
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
n (Phys) → Unbestimmtheits- or Ungenauigkeits- or Unschärferelation f
Collins German Dictionary – Complete and Unabridged 7th Edition 2005. © William Collins Sons & Co. Ltd. 1980 © HarperCollins Publishers 1991, 1997, 1999, 2004, 2005, 2007