
"What would Heisenberg’s position be?"
If I had to guess it would probably be reverse cowgirl but it's hard to be certain.
A group of Canadian PhD researchers claim to have obtained information beyond the “Heisenberg limit” using a technique called “weak measurement”. Heisenberg’s Uncertainty Principle limits the amount of information that can be known at the quantum level: the more you know about the position of an object, the less you can know …
We already know that our theories of quantum mechanics and of Einstein's relativity are incomplete. Both match experimental data pretty well, but they disagree with each other. That's a big warning sign!
However, that didn't stop people designing quantum cryptography systems, which I suspect is rather foolish. Basing the security of a system on laws of physics we know we don't fully understand leaves plenty of scope for techniques such as this one to be developed.
In contrast, crypto systems like AES, DES, etc. are good in that they rely on our understanding of maths and logic. That's good because we make up the rules, not Mother Nature, which at least gives us a better chance of understanding them. In the case of DES, for example, we know that it's not very good, and furthermore we know exactly why: its 56-bit key is simply too short.
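For a sense of scale, a back-of-envelope Python sketch (the keys-per-second rate is an assumed figure, purely for illustration):

    # DES's key is 56 bits, so the keyspace holds 2**56 keys.
    keyspace = 2 ** 56
    rate = 10 ** 9                         # assumed: 1e9 key trials per second
    years = keyspace / rate / (3600 * 24 * 365)
    print(f"{years:.1f} years for one search unit")   # ~2.3 years

Run enough such units in parallel (as dedicated DES-cracking hardware did back in the late 90s) and the search drops to days, which is exactly why DES is considered broken.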
Crypto systems have a critical weakness: private keys have to be kept secret. If they fall into the wrong hands, the cryptography is broken.
There are also mathematical weaknesses. It is now known that not all keys are equal: statistical techniques have been developed that make a subset of keys very much more crackable than others of the same length. Of course, once such an attack is known, vulnerable keys can be rejected, but suppose there are other mathematical weaknesses that have not been made public?
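To make the "reject vulnerable keys" step concrete, a minimal Python sketch (the four keys listed are DES's well-known weak keys, for which encrypting twice gets you back the plaintext; the screen_key helper is hypothetical):

    # Screen out DES's four known weak keys before use.
    DES_WEAK_KEYS = {
        bytes.fromhex("0101010101010101"),
        bytes.fromhex("FEFEFEFEFEFEFEFE"),
        bytes.fromhex("E0E0E0E0F1F1F1F1"),
        bytes.fromhex("1F1F1F1F0E0E0E0E"),
    }

    def screen_key(key: bytes) -> bytes:
        if key in DES_WEAK_KEYS:
            raise ValueError("weak DES key - reject and regenerate")
        return key

Which, of course, only defends against the weaknesses we already know about.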
Also, some of the number theory underpinning modern public-key crypto assumes the (generalised) Riemann hypothesis is true. Few mathematicians think otherwise, but it has yet to be actually proved. By the way, if you ever discover a disproof, spam it far and wide and then go into hiding for a few months. It's the only way you'll remain alive and at liberty!
"Crypto systems have a critical weakness: private keys have to be kept secret. If they fall into the wrong hands, the cryptography is broken."
Well, yes, that's kind of the entire point with crypto.
The best technique is still to keep messages as short as possible, in order to give the enemy as little leverage as possible. If the key contains at least as many truly random bits as the message and is used only once (a one-time pad), then any plaintext is equally plausible.
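A quick sketch of why any plaintext is equally plausible (plain Python; the message strings are, of course, invented):

    import os

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    plaintext = b"ATTACK AT DAWN"
    key = os.urandom(len(plaintext))       # truly random, used once
    ciphertext = xor(plaintext, key)

    # For ANY other same-length plaintext, a key exists that "decrypts"
    # the ciphertext to it - so the ciphertext alone reveals nothing.
    decoy = b"RETREAT AT TEN"
    decoy_key = xor(ciphertext, decoy)
    assert xor(ciphertext, decoy_key) == decoy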
@ Loyal Commander
Indeed, but they are devices built around our current understanding of how these things work. Whatever else there is to say about exactly how and why a transistor works is irrelevant: we can make them, measure their performance and exploit that in a CPU design without giving a fig for an exact and all-encompassing theoretical explanation of why their performance is as measured.
But with quantum cryptography it's the other way round. You have to be completely confident that the explanation of why you can't interfere with entangled particles is correct. Sure, you can make devices, take measurements and observe the effects that the current theory predicts, but that's not really a cast-iron guarantee that whoever dreamt up the explanation got it completely right. And we *know for sure* that we are not fully aware of all of Mother Nature's rules. Almost by definition, there is likely to be a hole, a mistake, in our understanding.
At least with mathematical cypher systems we know what the rules are: we invented them (algebra, logic, etc.). That doesn't mean we understand their full impact, but knowing the whole rule set is a much better place to be when assessing how strong a cypher system is.
According to our good friend Wikipedia (and, I seem to remember, a certain Professor Brian Cox), the observer effect and the Heisenberg Uncertainty Principle are different.
The observer effect says that measuring an object alters it. The smaller the object you are measuring, the more important this is.
The Heisenberg Uncertainty Principle says that it is impossible to know exactly, and simultaneously, certain pairs of properties (such as position and momentum) of a wave-like system.
PS When can we have an "Edit post" option?
Yes, I expect that the vast majority of commentators on this will make the mistake of claiming that the uncertainty principle is in doubt, and totally miss what has actually been claimed.
From the first linked article:
"It is often assumed that Heisenberg's uncertainty principle applies to both the intrinsic uncertainty that a quantum system must possess, as well as to measurements. These results show that this is not the case and demonstrate the degree of precision that can be achieved with weak-measurement techniques."
The experiment addresses the phrase "as well as to measurements". The intrinsic uncertainty remains.
Physics simplified for us non-physicists often gets a bit confused in the process, and that's what seems to have happened here. Apparently, there are *two* distinct Heisenbergian relations that have been conflated (starting in paragraph two of the article):
#1 is the famous Uncertainty Principle, which states that there is a limit to the precision with which certain pairs of physical properties of a particle can be simultaneously known, and specifies that limit. It has been rigorously derived and experimentally confirmed. This relation is not based on measurements disturbing a particle, and has nothing to do with the Rozema paper.
#2 is called the "observer effect" and involves a lower limit to the degree of disturbance of a particle by a "measurement" of that particle. It is often confused with #1, especially by non-physicists.
The confusion is not surprising, as Heisenberg himself originally approached relation #1 by thinking about measurements affecting particles' properties. In fact, for a very specialized case, he found that (measured position precision) × (momentum disturbance) ≥ ħ/2, half the reduced Planck constant (that is, he found that relation #2 took the same form as relation #1 for one special case).
Many assume that Heisenberg's formula for relation #1 is also generally correct for relation #2. Not only has this not been proven, it has been shown that relation #2 usually does NOT equal relation #1. In particular, the Ozawa (2003) paper proposed a more involved formula as the correct limit for relation #2 (both formulas are written out just after the list below). Drawing on the ideas of the Lund (2010) paper as to how to test Ozawa's formula, the current Rozema paper reports experimental results verifying that:
a) Heisenberg's formula for the #2 limit is wrong and
b) Ozawa's formula for the #2 limit seems to be correct and
c) Ozawa's relation #2 limit is (at least often) less than that from Heisenberg's formula (couldn't access article behind its paywall, so I'm inferring this from secondary sources)
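For reference, the two formulas as usually stated, with ε(x) the position measurement error, η(p) the momentum disturbance, and Δx, Δp the intrinsic spreads of the state (this is my summary from the secondary literature, not a quote from the paywalled paper):

    Heisenberg's (naive) measurement-disturbance formula:  ε(x)·η(p) ≥ ħ/2
    Ozawa's (2003) formula:  ε(x)·η(p) + ε(x)·Δp + Δx·η(p) ≥ ħ/2

Because Ozawa's left-hand side carries two extra non-negative terms, the product ε(x)·η(p) alone can dip below ħ/2 without violating anything.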
Why we care: quantum cryptosystems that assume higher natural uncertainty (from Heisenberg's incorrect formula) than actually exists (Ozawa's correct limit for relation #2) may miss evidence of tampering that increases uncertainty above Ozawa's limit but keeps it below the level set by Heisenberg's formula.
Useful sources used for the above:
Wikipedia's entry on "Uncertainty principle" (surprisingly unbad)
The Lund (2010) article which inspired the recent work (http://iopscience.iop.org/1367-2630/12/9/093011/)
From the first linked article: ...etc
So, you read the original? Presumably it doesn't claim that the uncertainty principle is "wrong"? I can't be arsed to read it myself.
The Reg take is certainly very confusing. I'm not aware that polarisation has a canonically conjugate partner variable, so "uncertainty" isn't relevant anyway.
Question now: isn't "polarization" actually the spin (or rather, its direction)? And is that even a variable subject to an "uncertainty principle"? Doesn't the spin operator commute with the position and momentum operators (I think it does)? So aren't you quite free to measure the heck out of spin without changing or affecting position or momentum at all?
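That commutator claim is easy to check numerically with Pauli matrices (a small numpy sketch; the 3-point "position" grid is just a toy stand-in for the real, continuous operator):

    import numpy as np

    # Pauli matrices (spin-1/2 operators, up to a factor of h_bar/2)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
    sz = np.array([[1, 0], [0, -1]], dtype=complex)

    # Spin components do NOT commute with each other: [sx, sy] = 2i*sz
    assert np.allclose(sx @ sy - sy @ sx, 2j * sz)

    # But spin acts on a different factor of the Hilbert space than
    # position, so spin and position operators DO commute.
    X = np.diag([0.0, 1.0, 2.0])            # toy position operator
    S_full = np.kron(sz, np.eye(3))         # spin, extended to full space
    X_full = np.kron(np.eye(2), X)          # position, extended likewise
    assert np.allclose(S_full @ X_full - X_full @ S_full, 0)

The spin components famously fail to commute with each other, but spin against position gives a zero commutator, so measuring spin needn't disturb position or momentum at all.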
Note that the "Heisenberg Uncertainty Relation" is not magic. In the Schrödinger wave equation it appears quite naturally as the tradeoff between the localization of a function and that of its Fourier transform (a sharp peak in the time domain means a wide bump in the frequency domain, and vice versa); it is thus a mathematical consequence. Richard Feynman can do without it entirely in his explanation of QED, as it is a natural consequence of the sum over all trajectories, and I cite:
"This is an example of the 'uncertainty principle': there is a kind of 'complementarity' between knowledge of where the light goes between the blocks and where it goes afterwards - precise knowledge of both is impossible. I would like to put the uncertainty principle in its historicla place: When the revolutionary ideas of quantum physics were first coming out, people still tried to understand them in terms of old-fashioned ideas (such as, light goes in straight lines). But at a certain point the old-fashioned ideas would begin to fail, so a warning was developed that said, in effect, "Your old-fashioned ideas are no damn good when..." If you get rid of all the old-fashioned ideas and instead use the ideas that I'm explaining in these lectures - adding arrows for all the ways an event can happen - there is no need for an uncertainty principle!"
This article (sorry, Richard Chirgwin), just like the original one in PRL, fudges the physics and doesn't teach anybody anything. So let's clear up some things:
Some single-particle properties can be determined in a single measurement (e.g., measuring the position of a photon with a photographic plate). This is called a 'strong measurement', because a single measurement offers meaningful information. Heisenberg's uncertainty limit tells us that the outcome of such an experiment cannot be predicted to better than ΔxΔp ≥ ħ/2, or ΔEΔt ≥ ħ/2. A famous example is the detection of a photon behind two narrow slits: I cannot tell through which slit the photon came, because the narrow slits (information about position x) induce uncertainty about the momentum (information about the direction of flight).
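To put illustrative numbers on that (a minimal Python sketch; the wavelength and slit width are made-up values):

    import math

    h_bar = 1.054571817e-34      # J*s
    wavelength = 633e-9          # m, e.g. a HeNe-like red photon (assumed)
    slit_width = 1e-6            # m, assumed slit width

    p = 2 * math.pi * h_bar / wavelength   # photon momentum, p = h/lambda
    dp_min = h_bar / (2 * slit_width)      # floor from dx*dp >= h_bar/2
    print(f"minimum angular spread ~ {dp_min / p:.4f} rad")   # ~0.05 rad

Squeeze the slit (better position information) and the angular spread grows: that's the lost direction-of-flight information described above.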
A 'weak measurement', on the other hand, just modifies the particles a little, to affect the outcome of a strong measurement. In the double-slit example, we could add a slightly tilted polarizer in front of one slit. When I analyze the polarization of photons coming through the slits, I can now distinguish right-slit from left-slit photons by polarization. To do so, I need to observe multiple photons, because the polarizer only modifies the detection probability behind another polarizer plate by a little bit. Let's say I collect 1 million photons to statistically distinguish the polarized from the unpolarized photons. Now I have gained information beyond Heisenberg's uncertainty limit: I determined both the slit position and the flight direction of the photons!
But it's just a cheap trick: I gained the extra information by statistics. In general, the statistical uncertainty of a measurement is reduced by a factor of sqrt(n) for n measurements (sqrt(1 million) = 1000 in the example above). Heisenberg only told us the uncertainty for the measurement of one quantum particle; he didn't forbid us from doing multiple measurements to reduce statistical uncertainties. (The outcome of any single measurement still fulfills Heisenberg's uncertainty limit.)
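A toy simulation of that bookkeeping (the probability shift of 0.001 is an invented figure, purely for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    p_marked, p_plain = 0.501, 0.500   # assumed: tilted polarizer shifts a
                                       # detection probability by just 0.001

    for n in (10**2, 10**4, 10**6):
        clicks = rng.random(n) < p_marked
        estimate = clicks.mean()
        std_err = np.sqrt(estimate * (1 - estimate) / n)   # shrinks as 1/sqrt(n)
        sep = "clearly" if std_err < (p_marked - p_plain) else "not yet"
        print(f"n={n:>7}: estimate={estimate:.4f} +/- {std_err:.4f} ({sep} resolved)")

Only around n = 1,000,000 does the statistical error (~0.0005) drop below the 0.001 shift, so the two photon populations become distinguishable: exactly the sqrt(n) arithmetic described above.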
If you look into the original article, you'll note that the authors only talk about violating ‘‘Heisenberg’s uncertainty principle,’’ as Heisenberg originally formulated it. Note the quotation marks: this is not the same uncertainty principle you will find on Wikipedia.
Something similar is done in the field of 'super-resolution microscopy', where a particle position is determined with an uncertainty far below the Abbe limit (essentially the Heisenberg uncertainty principle applied to microscopy) by performing multiple measurements, or one highly nonlinear measurement (which amounts to the same thing).
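The same sqrt(n) trick in microscopy form (a toy sketch; the spot size and emitter position are made-up numbers):

    import numpy as np

    rng = np.random.default_rng(1)
    true_pos = 42.0       # nm, assumed emitter position
    spot_sigma = 200.0    # nm, a diffraction-limited spot width (illustrative)

    for n in (10, 1000, 100000):
        photon_hits = rng.normal(true_pos, spot_sigma, size=n)
        estimate = photon_hits.mean()            # centroid of the blur spot
        precision = spot_sigma / np.sqrt(n)      # localization ~ sigma/sqrt(n)
        print(f"n={n:>6}: position = {estimate:7.2f} nm (+/- {precision:.2f} nm)")

With 100,000 photons the centroid pins the emitter down to well under a nanometre, even though each individual photon lands somewhere in a ~200 nm blur.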
Learned something? Then feel free to google our real stuff under "correlated rotational alignment spectroscopy (CRASY)".