Revision as of 03:54, 5 October 2006
In quantum physics, the Heisenberg uncertainty principle or the Heisenberg indeterminacy principle — the latter name given to it by Niels Bohr — states that one cannot simultaneously measure, with arbitrary precision, the values of certain conjugate quantities, which are pairs of observables of a single elementary particle. The most familiar of these pairs is position and momentum.
Mathematics provides a positive lower bound for the product of the uncertainties of measurements of the conjugate quantities. The uncertainty principle is one of the cornerstones of quantum mechanics and was discovered by Werner Heisenberg in 1927. The uncertainty principle follows from the mathematical definition of operators in quantum mechanics; it is represented by a set of theorems of functional analysis. It is often confused with the observer effect.
Basic explanation
To measure the frequency of a wave, one must compare the wave with a reference signal of known frequency, such as the beats of a standard clock. This amounts to allowing the two signals to interfere with each other. Without an infinite amount of time in which to measure both signals, one cannot know with certainty whether the two frequencies are exactly the same.
If one attempts to measure the difference in frequencies over a finite period of time, however, then to be relatively certain of one's comparison, one must
- allow at least one beat of the clock, and
- require that the frequency of the measured wave be greater than or equal to the frequency of the clock — that is, that it have an equal or smaller period — so that it can be observed within the given time interval.
In other words, this corresponds to one or more beats of the wave per unit time Δt of the standard clock, so 1/Δt will be less than or equal to the minimum observable frequency ν_min, or:

    ν_min ≥ 1/Δt

It follows that if Δt is close to zero, ν_min must be nearly infinite, and the uncertainty in frequency is large if measured over a short time interval.
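The relation above can be checked numerically: a signal observed for a finite duration T has a discrete Fourier spectrum whose bins are spaced 1/T apart, so two frequencies closer together than about 1/T cannot be distinguished. The sketch below (the sampling rate, duration, and test frequency are arbitrary illustrative choices, not from the original text) verifies that the frequency resolution equals 1/T.

```python
import numpy as np

# Frequency resolution of a finite-time measurement: a signal observed
# for a duration T yields FFT bins spaced 1/T apart, so frequencies
# closer than ~1/T cannot be told apart.
fs = 1000.0          # sampling rate in Hz (assumed)
T = 0.5              # observation time in seconds (assumed)
t = np.arange(0, T, 1.0 / fs)

signal = np.sin(2 * np.pi * 100.0 * t)   # a 100 Hz test tone
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)

resolution = freqs[1] - freqs[0]   # FFT bin spacing
print(resolution)                  # equals 1/T = 2.0 Hz
```

Shortening T coarsens the bin spacing, which is exactly the statement that a brief measurement leaves the frequency highly uncertain.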
The corresponding uncertainty in wavelength is easily deduced given the speed of the wave. The uncertainty principle, as it pertains to the momentum of a material particle, is inferred from experiments confirming that the wavelength of a material particle is equal to λ = h/p, where h is Planck's constant and p its momentum. See Planck's law of black body radiation.
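As a worked example of the relation λ = h/p, the sketch below computes the de Broglie wavelength of an electron; the electron speed chosen here is an arbitrary illustrative assumption, while h and the electron mass are standard constants.

```python
# Numerical check of the de Broglie relation: wavelength = h / p.
h = 6.626e-34        # Planck's constant, J*s
m_e = 9.109e-31      # electron mass, kg
v = 1.0e6            # assumed electron speed, m/s (non-relativistic)

p = m_e * v                # momentum of the electron
wavelength = h / p         # de Broglie wavelength
print(wavelength)          # ~7.27e-10 m, i.e. a few angstroms
```

A wavelength of a few angstroms is comparable to atomic spacings in crystals, which is why electron diffraction experiments could confirm the relation.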
Overview
Before the discovery of quantum physics, it was thought that the only uncertainty in measurement was caused by the limitations of a measuring tool's precision. But it is now understood that no treatment of any scientific subject, experiment, or measurement can be considered complete without disclosing the nature of the probability distribution (sometimes called the error) of the measurement. Uncertainty is the characterization of the relative narrowness or broadness of the distribution function applied to a physical observation.
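The notion of uncertainty as the width of a distribution can be made concrete: the standard deviation of repeated measurements characterizes how narrow or broad their distribution is. The sketch below simulates repeated measurements of one quantity (the true value and spread are invented for illustration).

```python
import numpy as np

# Uncertainty as the spread (standard deviation) of a measurement's
# probability distribution, estimated from repeated simulated trials.
rng = np.random.default_rng(0)
true_value = 5.0                 # assumed true value of the quantity
measurements = true_value + rng.normal(0.0, 0.1, size=10000)

mean = measurements.mean()
uncertainty = measurements.std()  # narrow distribution -> small uncertainty
print(mean, uncertainty)
```

Here the estimated mean recovers the true value, and the standard deviation recovers the spread of the simulated measuring process.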
Illustrative of this is an experiment in which a particle is prepared i