quantum vacuum is a superfluid,
we need a name ...
I am a heretic. There, I've said it. My heresy? I don't believe that quantum computers can ever work. I've been a cryptographer for over 20 years and for all that time we've been told that sooner or later someone would build a quantum computer that would factor large numbers easily, making our current systems useless. However …
I will read it again when I have had coffee, but one thing springs to mind.
Does this mean that quantum computing is provably impossible? The implication is that an underlying analogue system might limit the ability to build a machine showing quantum characteristics.
Then again, I get the feeling that somewhere in Turing and Gödel there are some deeper truths about the nature of this universe...
There is a strong hint that RA is wrong: in Maxwell's equations and classical mechanics, information propagation is limited by the speed of light, while quantum-mechanical wave function collapse is not ('spooky action at a distance').
Given that "spooky action at a distance" has been observed experimentally, and there is no way that the experiment in this paper could replicate that, I think that it must be wrong.
The mathematics referenced precedes Einstein, so I think it was obviously incorrect as of 1905.
Another reason why Computer Science belongs alongside other sciences like SPS, Sports and Food science. You really can make up crap on the fly and be right most of the time when working with a computer. Mathematics and Physics are a lot harder.
all this "information traveling faster than light", is easily explained if there are 11 dimensions as hinted at by other research.
It just means that the particles are connected in another dimension at a shorter distance than the distances we are observing in our classical models.
It may be that the particles appear to be significant distances apart in 'our' dimensions, but in one of the other 11 they are actually next to each other, so overall nothing is breaking the 'golden rule' of faster-than-light travel.
Even visiting the opposite side of the Earth can be done via two paths: over the surface of the Earth and through the core. Passing network cables through the core would appear to break the golden rule of travelling faster than light, when viewed from the perspective of cables travelling over the surface.
Entangled particles have just found a short cut for 'communication...' that is all.
If ever actually shown experimentally, the additional dimensions of bosonic string theory would be in addition to the standard 3 we are used to, so the distance metric usually won't get affected like that (certainly not at low mass/energy density).
In any case, that has nothing to do with wavefunction collapse. There cannot be a hidden variable in QM without contradicting the basic axioms of QM (the Bell inequality of the EPR experiment). Note that the paper doesn't include action at a distance, does introduce a hidden variable, and doesn't attempt to go near 11 dimensions at any point as a justification.
I am a bit annoyed that an academic with Ross Anderson's credentials has strayed into this kind of territory. It is unprofessional to write articles that are this controversial in a field that you are not an expert in. It's no better than all the "experts" on climate change. I would prefer this to come from an expert from the Physics/Mathematics department (even in collaboration with RA).
Spooky action is what you have to assume if Bell's inequality is a correct description of reality and there's no hidden variable carrying the entangled state. But if this theory can supply that hidden variable then superluminal information transport is not needed, the state was always there waiting to be measured.
Got to say it's a hard sell, but not necessarily harder to believe than other interpretations of quantum behaviour.
Can I say 'pilot wave theory' and suggest this is not exactly new?
"The implication is that an underlaying analogue system might limit the ability to build a machine showing quantum characteristics."
Maybe if they approach quantum computing from the POV of analogue computers, which can be very fast in their own right for their own specialised applications. Are there still experts in analogue computing? Is it even a field of study these days?
Not really.. The article is an overview for I'm-but-an-egg's like me.. The Threat of Math is in every single line...
Compare it with the demonstration of a standing wave using a piece of rope. Yes, it can be demonstrated and even proven by Math, but it's a hell of a lot easier on the layman to use the piece of rope, and still be right.
Physics is mathematics applied to describing the world. Leaving the math out simply means that the article has been dumbed down sufficiently to contain no hard physics.
Now what was the point again?
Physics is mathematics applied to describing the world
No, physics is the science of finding the laws and entities that govern the fundamental interactions of the world. Mathematics is part of this, but physics is more than applied maths.
Newton's laws and the laws of thermodynamics do not involve mathematics (if you are going to claim that "equal and opposite" is mathematics then I would argue that you are carrying out a mathematical landgrab on philosophy and semantics.) The mathematics arises from the attempt to give quantitative description to those laws, but it is not the laws.
Re quantum vacuum as a fluid. I can never quite get my head around the practicality of quantum physics as a watertight explanation for the subatomic, but in my mind the 'way things might work' is that all energy is a kind of eddy current in a quantum fluid, and that mass/matter is a stable form of this eddy current. Some of the quanta in this fluid act rather like stringy knots in a current that are easily transformed from one shape to another, whilst others form long-lasting stable configurations that are much harder to break down. For example, a photon of light is reasonably 'transformable' into another energy state when interacting with something else, whereas a Higgs boson has a stable shape that is so small and untenable that it doesn't break down any further except to energy.
On a subatomic particle level a stable eddy current is perhaps analogous to a perfect whirlpool, which explains the spin state of quarks. Likewise the harmonics of stable and unstable atomic configurations shape how atomic particles fit together; electrons could be orbiting whirlpools around the central atomic whirlpool (Jupiter's red spot).
Mass is effectively not really a measure of the actual matter within an object but more the displacement of the object (these eddy currents) against the quantum fluid. This helps explain gravity, as matter will seek the most stable configuration and 'packing' and so over time will 'gravitate' towards each other. (Ships in a closed dock will tend to move together as they bob up and down, slowly forcing any separating water between them to be reduced.) Likewise the displacement of light in gravitational lensing etc.
I could probably waffle on for hours over a pint or two of IPA about my understanding of the world, universe & everything it is only my theory ;)
knot quite ;) An eddy current could be seen as a string as it has a linear progression, but it isn't a string in the sense of having a finite start and end point. An eddy of this form might be like an ever-rising spiral of smoke from a fire; perhaps light photons work like this: seen from above/head on, the spiral just shows as a circular form, but from side on, in 2D, as just a sinusoidal wave structure. Because the spiral structure is never closed it never displaces any space and therefore has no mass.
I'll be convinced when the maths (and ultimately experiments) are in for this new understanding of quantum mechanics. If this is ultimately correct then, ipso facto, we've also a solution to the TOE—we've a solution which connects the quantum world and relativity.
Now that would be very nice.
An aside: this article keeps referring to 'fields'. Now, I want to grab some 'field' and store it in a bottle to look at and analyze. So would someone please tell me precisely what a field is. What exactly is a field?
A Nobel Prize for a correct answer, perhaps?
Look up "scalar field" and "vector field" in Wikipedia.
I did the mathematics for that in algebra once, that's the mathematical analogy that they drum into us in uni physics—sure that works in the practical world, it lets us calculate all sorts of things but it STILL doesn't tell me exactly what a field is.
You tell me, what's inside that 'stuff' that say 'connects'—'glues' my fridge magnet to my fridge? Take the case of a permanent magnet, it needs no power yet it's not perpetual motion, nevertheless the force the field exhibits/impresses on susceptible surroundings (iron) holds the magnet there indefinitely.
OK, not happy with this perspective, then invert the question. How does a force work; what's inside it that makes it manipulate things at a distance? Right, we can use mathematics, classical physics/Maxwell etc., quantum electrodynamics or whatever you want to provide various high-level explanations of 'action' that manifests in what we physically perceive as a force, but it still doesn't tell us what's actually inside a force or what it consists of that makes it act the way it actually does.
The same argument applies to 'power', it can only be described mathematically, and even then only in terms of something else, i.e.: P=VI. OK, so I now know that Power equals [be described in terms of] Volts x Amps, but I'm still none the wiser!
Words like force, field and power are essentially the physical equivalent of the philosophical concept of the 'simple notion' which states a notion cannot be reduced further, for example the concept of 'yellow' cannot be reduced. You can describe it as a certain wavelength of electromagnetic radiation and also do that mathematically if you like, but it has absolutely no meaning for a blind person who has never seen before and who wants to experience what others perceive as 'yellow'. Unfortunately, this is all we do when we describe a field.
Fact is, with our present knowledge of physics, no one, it seems, really has a clue. It's why I asked the facetious question in the first place.
I hate to be a party pooper because you've made a bit of an effort here but what you've described is the distinction in philosophy between "things in themselves" and "things as they appear". Science concerns itself with the latter. The former is unknowable and likely transcends the limits of Human understanding. That's why physicists don't like to discuss it. It's kind-of pointless.
Mathematics is a language used to condense/compress descriptions of the regularities of our experience. I don't believe there's any sense that it "is" what it models. It's just a kind-of shorthand for what we perceive to be the case.
As you can tell I'm not a Logical Positivist.
Ah, is that tree in the quad really there? Well, it depends....
Huh, Logical Positivism indeed! OK, OK, be nice or I'll throw my long-owned, well-dogeared copies of Ayer's Language, Truth, and Logic and The Problem of Knowledge at you!
I've always wished for Shakespeare's pithiness and succinctness but alas it's eluded me. In my long-winded way the point I was making, or more correctly, trying to make, is that these complex problems are profoundly disturbing to some people, as they possess a worldview which instinctively gravitates to examining them from a "things in themselves" perspective.
Reckon this way of thinking has little to do with intelligence; rather it's a mode of thought, and whether it's appropriate here is perhaps open to debate (subject to which side of the philosophical divide one sits, that is). I may be wrong but I can only ever envisage Dirac thinking about physical "things as they appear". On the other hand, judging from his memoir Surely You're Joking, Mr. Feynman!, Feynman occasionally seems to embrace the former, even if it does 'transcend the limits of Human understanding'—let's just say it: things that are intrinsically metaphysical by nature. Mind you, there's no doubt that Feynman was more than sufficiently adroit to instantly flip into whatever mode of thinking was appropriate for the occasion.
If you think physicists avoid discussions on matters concerning "things in themselves", then they pale in comparison to the attitudes of engineers when similarly confronted; for them, it's strictly application of quantum rules, any doubts about quantum weirdness or metaphysical issues are simply bedamned notions.
It would be wrong to assume that my worldview defaults to the former view, it doesn't, not usually anyway (as I've been in and out of engineering for far too long). The fact is this glaring example of fields and forces is just too striking to ignore, it perplexes many through its apparent striking anomalies and the seeming lack of a coherent classical understanding. (In perceived importance, perhaps it could be considered the 'double-slit experiment' of electronics and electrical engineering.)
As I see it, whilst mathematics is the default lingua franca of physicists, it's not necessarily so for mere mortals. Thus, any explanation of the seeming paradoxes that the problem unleashes ought to deserve considerably more attention. If that involves physicists having to provide a more painstaking explanation of the mathematics involved, then so be it.
That 'it works in the practical world' is the point: that's what physics is about. We try to invent a coherent and elegant mathematical model which explains what we observe in experiments. Once we've done that we stop (or, in fact, we carry on doing more experiments to try and find cases where our pretty model doesn't work). Philosophers can worry about what the model 'means' in some sense: we just worry about the elegance and the right answer bits.
And the notion of a field – a rule which assigns a quantity to every point of spacetime – is such a useful model. That's all.
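That "rule assigning a quantity to every point" definition is easy to make concrete in code. A minimal sketch, assuming nothing beyond the definition itself (the function names and the 1/r example are purely illustrative, not drawn from any physics library):

```python
import math

# A "field" in this sense is just a rule mapping each point of space
# to a value. A scalar field assigns a number; a vector field assigns
# a vector. The 1/r form below is only a familiar-looking example.

def scalar_field(x, y, z):
    """Assigns a number to every point, e.g. a 1/r-style potential."""
    r = math.sqrt(x * x + y * y + z * z)
    return 1.0 / r if r > 0 else float("inf")

def vector_field(x, y, z):
    """Assigns a vector to every point, e.g. a radial 1/r^2 field."""
    r = math.sqrt(x * x + y * y + z * z)
    if r == 0:
        return (0.0, 0.0, 0.0)
    return (x / r**3, y / r**3, z / r**3)

print(scalar_field(3.0, 4.0, 0.0))   # 0.2  (r = 5)
print(vector_field(1.0, 0.0, 0.0))   # (1.0, 0.0, 0.0)
```

Nothing here says what the field "is" made of, of course; it only encodes the rule, which is rather the point of the comment above.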
If someone is using the term "field" in the context of quantum mechanics, it can be useful to think of an analogy, such as: if you see one corn stalk growing on a patch of land, you would probably say that there was only the one stalk. If the same patch of land is covered by corn stalks, you might say that you'd seen a "field" of corn. In a super-symmetric Universe, all forces, such as the electromagnetic, strong nuclear, and Higgs, have particles associated with them: photons, gluons, the Higgs particle. The "field" is the sum of all the relevant particles acting in a particular (no pun intended) way. This is why one of the goals of the LHC is to discover the graviton, the force carrier for the gravitational field, if possible.
A very simplified description, I know, but I hope it helps those interested to clarify the concept of a field.
Right. As a kid I remember asking my father how a magnet worked, even then it was obvious to me that his description of 'lines of force' was inadequate as it was only a representation. Later, this basic concept gave way to more detailed terminology, equipotential surfaces, transformations, eigenvalues/vectors etc., which provides a much more precise mathematical interpretation.
Your "field" of corn analogy makes sense, and if the stalks were waving in the breeze we'd have a representation of its dynamics. In other words, a 'field', as we describe it mathematically, is a representation of the 'state' of underlying forces. Nevertheless, even in quantum field theory, we still have to resort to aids such as Feynman diagrams to represent a 'system', explaining the exact underlying reality still eludes us.
As you say the "field" is the sum of all the relevant particles acting in a particular way', but even if the LHC does discover the graviton (or anything else) and we end up with an even more precise understanding of nature, does that mean we will be able to truly describe what a force is, when we already use the term 'force' to describe how this underlying system works (describing something in terms of itself is, logically, nonsense)? Seems to me the only solution is to do what we've always done which is to again resort to mathematics and let the equations provide the description.
Let me give you an example of a common but difficult conceptual problem that pops up regularly and which is never satisfactorily explained by physicists except in terms of mathematics. The common description of light is photons in the form of electromagnetic radiation [a complex issue in itself] radiating at speed c from some generating source such as a candle, the process being initially driven from an energy source such as paraffin etc.
Leaving aside how the stored energy is converted into light for a moment, the description of light becomes more problematic when lambda becomes infinite (at zero frequency—the 'DC' condition). In the 'DC' state, a DC electric current is the source of energy for, say, an electromagnet which generates a magnetic field which fills surrounding space at speed c. Same with a permanent magnet, which once stationary requires no input energy—here, for want of a better description, we have a system in stasis (for example, my fridge magnets just stay put). The question everyone asks is where do the photons hang out in a static system; how, so to speak, can they seemingly hang around in 'mid air' at speed zero when they're supposed to travel at speed c? Same goes for charges on pith balls, and in capacitors. If the energy stored in a capacitor's dielectric isn't in the form of photons then what is it?
Ahh, so now we're back to the 'field' problem again, exactly how are fields involved in the conversion of energy into photons?
Again, mathematics comes to the rescue, but conceptually how a photon is generated is a complex and difficult problem to get one's head around—one that's never properly explained except in complex texts at the higher echelons of physics. Even then, we're back to the fundamentals of having to describe the process in terms of 'fields', 'forces' etc. It's mathematics all over again.
Physicists have to do more to make this stuff much more comprehensible.
Your comments and questions are very cogent; however, I must point out that I was trying to explain what a 'field' is, and not really trying to describe how such fields function in the real (macro) world. Part of the trouble with understanding the concepts of fields and forces and suchlike is that nobody yet fully understands how they function, so explaining it to someone is, by necessity, an exercise in theories (as in, which is your favorite flavor of theory?)
To speak directly to your question "how are fields involved in the conversion of energy into photons?", the partial answer is that we look at the basic production of a single photon, often caused when an electron is excited to a higher energy level, then emits a photon as it falls back to its ground state. When a bazillion electrons (an electric field) do the dance, we get a light source of some kind. I say "often caused" since there are other ways that photons can be created.
Explaining the existence of a static magnetic field is very hard because there aren't any words in our language to describe it. The nearest analogy I can think of is a spark-gap in an electrical circuit. Put enough potential across the gap and the circuit will close, creating an electrical arc and completing the circuit even though there is no wire in the gap. Line up electrons in a magnetic material like iron and the "circuit" closes through the magnetic poles. I know this is not a satisfactory explanation but after all, it's still an open question in quantum mechanics!
Your insightful last post in effect wraps up the discussion I reckon. Your comment succinctly summarises the original point I was trying to make about aspects of fields being incomprehensible except through mathematics. Anyway, it seems what started with my casual and facetious remark about a 'Nobel for 'fields' has generated more words than I expected.
As mentioned, the notion of fields, through various levels of understanding, has troubled me for decades since as a kid I asked what magnetism was, and in ways it still does. Whilst the double-slit experiment rightfully captures popular imagination as a means of demonstrating quantum weirdness, it seems to me a similar case can be made with respect to force fields. Especially so since it's a ubiquitous aspect of physics in the sense that most physics students, electrical engineers, etc., not only can't avoid it, but as with thermodynamics, it's a central pillar of their work/profession.
Moreover, I cannot ever recall any intermediate-level text making a strong point about the striking dichotomy between static, non-excited force fields (magnetic, electric etc.) and those of far-field EMR. Of course, by that I'm referring to both our perception of and our seemingly quite different explanations of these physical phenomena—yet clearly for nature, they're just manifestations of the same thing; she has no concerns that we've difficulty in explaining them, and/or that we often do so using inconsistent or very different approaches for each phenomenon.
Ages ago, I recall searching* various classical physics and QM texts for various explanations of what happens at the instant a static magnetic field begins to move (is accelerated) with respect to an electron or current-carrying wire (specifically the classical case—where EMR/photons begin to be emitted and the instant when λ ceases to be meaningless/'infinite'); and it was exceedingly difficult to find any direct references to the matter. One is left to figure it out like an exam question from the more general explanations which can be very difficult. (I ended up in an entangled (no pun intended) mathematical mess; Lorentz and many other such matters to consider, even the relativistic case was in the mix—right, my attempt failed.)
I failed again tackling the problem from the QED end; one easily gets bogged down (well, certainly I do) in complicated conceptual matters, and the maths is persistently mind-boggling (heaven knows why I bothered). The basics of QM photon generation/emission are documented, but to fully understand the intricacies of the coupling/interaction of e⁻ to the static magnetic force field at the moment of initial acceleration is, pretty much, beyond this mere mortal's capability. We've to consider such notions as virtual photon exchange, perturbation theory, QM's non-relativistic and relativistic Hamiltonians for both free and bound electrons in a static force field etc., etc. If it's not one's primary bread and butter, then it's damn heavy going. (Oh, to have Dirac/Feynman's mind!)
I note also the popular press has recently dipped its toes into the question of what exactly is a field. In a short, somewhat uninformative article on fields New Scientist (No 2999, 13 December 2014, p39) says of a field that "On the one level, it is just a map"; and even MIT's Frank Wilczek gets into the act who's quoted as saying "Ultimately, a field is something that depends on position".
This leaves me little the wiser. I'd have thought that with Wilczek's renowned stature in such matters, he could have put up a marginally better performance. It's hardly very informative given that the article's opening paragraph quotes Newton's dismissive position on the matter of fields. But then, Wilczek, like Newton, is also dismissive.
As Robinson ruefully points out in his post, speculating over the philosophy of QM phenomena is a waste of time: "[it] is unknowable and likely transcends the limits of Human understanding. That's why physicists don't like to discuss it. It's kind-of pointless." Perhaps so (and I essentially agree with that position, especially from the standpoint of getting things done); but I doubt seriously if many (even some scientists) will ever refrain from doing so on the grounds that it's pointless. It's just too alluring for many (and sometimes it actually delivers results).
Of course, the other aspect of this argument is that physicists aren't lily-white either. As a matter of course, they do things that would give philosophers apoplexy. They've little qualms about 'creating' virtual particles that run backwards in time, exceed c and perform other strange 'illegal' non-real-world scenarios, and then there's also perturbation—say no more! [OK, OK, I know!] Moreover, they've been performing these 'kluges' for a very long time, Planck ultimately did it back ca 1900 as a last resort and out of sheer desperation to solve the Ultraviolet Catastrophe and, voila, now very thankfully we've h. So I'm not against such approaches by any means, as it usually works somehow (a bit like those parametric equations one learnt about in one's early schooling). If I'd been Planck, I'd have done exactly the same (but that's nonsense, I'd have never dreamt up such an ingenious solution anyway). What it certainly does show is how truly brilliant and innovative Planck really was.
Thus, it doesn't really matter if people speculate or theorize wildly; scientists will carry on doing their day-to-day work in their usual, procedural, by-the-book way. But very occasionally philosophising will yield results and someone will arrive at a brilliant idea (after all, Einstein famously speculated in a tram if I recall and now look at the results).
Anyway, seems we've come full circle. It's unlikely we've gained a better understanding of these closely-related electromagnetic field phenomena than we had before we started the discussion, but it seems to me what's key and really significant is how vastly different our approach needs to be towards each for our ultimate understanding of the physics that's involved. If nothing else, these two phenomena are an excellent illustration of how seemingly idiosyncratic the quantum world is. Whether or not we want to, they force us to think about the problem.
A great story and great posts.
…Now I can sit back and wait to see whether QM is analogue or not! ;-)
* BTW, during that earlier search, I ended up ferreting out old original copies of that eccentric genius Oliver Heaviside's expansion and reformulation of Maxwell's 1873 masterpiece, 'A treatise on electricity and magnetism'. Heaviside's own three-volume set on the subject titled 'Electromagnetic Theory' (1893 if I recall), is also a masterpiece, albeit dense and heavy going. Those with an understanding of Maxwell's equations and who've an interest in the history of the subject ought to at least peruse them. In the grand schema, this is an extremely important mainstream document, as in it Heaviside essentially modernizes Maxwell into the form that we use today. Also, it introduces much of the terminology (absent in Maxwell) that practical scientists and electrical engineers use today (they'd be forever thankful to Heaviside if they knew these were his ideas, but most don't). There's also Heaviside's earlier work 'Electromagnetic Waves' (1889) which may also be of interest.
Seems these days Heaviside's a little lost to history which is a shame. It's often what happens when you fight the scientific establishment/established orthodoxy, which he did (remember it took Galileo quite some centuries to be 'redeemed' for similar reasons). Nevertheless, he features heavily in my old books on radio theory; had the Kennelly–Heaviside ionospheric E layer named after him; and even the 'Heaviside layer' features as the subject of a song in Lloyd Webber's musical Cats.
For example, the Nobel prizewinning physicist Gerard 't Hooft argues that we may be in a virtual universe,
I'll go for that! If you were to write a program for a simulated universe you'd first create some arrays. An "intelligent" number in one position might be able to detect that he had a neighbour but would never be able to work out why. The array itself would exist in another plane of existence or dimension if you like.
Maybe this explains Dark Matter. I'm off to the pub, Monday is old farts night.
Send my Nobel Prize quickly, I need the cash :)
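For anyone who wants to play with that "intelligent number in an array" idea before the pub, it fits in a few lines. A toy sketch, with an entirely made-up array and observation interface (purely illustrative, no claim to physics):

```python
# The "universe" is an array the inhabitants cannot see directly.
# A cell can only query its neighbours through a fixed interface;
# nothing inside the rules reveals the array itself, which lives
# in the "other plane of existence" of the simulator.

universe = [0, 0, 1, 1, 0, 0]  # 1 = occupied cell, 0 = empty

def has_neighbour(i):
    """All a cell at index i could 'observe': is an adjacent cell occupied?"""
    left = universe[i - 1] if i > 0 else 0
    right = universe[i + 1] if i < len(universe) - 1 else 0
    return bool(left or right)

print(has_neighbour(2))  # True: the cell at index 3 is occupied
print(has_neighbour(0))  # False: the cell at index 1 is empty
```

The point of the sketch is only that the interface exposes *that* a neighbour exists, never *why*, which is the comment's thought experiment in miniature.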
So the oil drop system he describes behaves like a quantum system. This means that somewhere, quantum indeterminacy, aka the uncertainty principle, has been introduced. The article says this is the result of the wide area forcing field.
Presumably, then, this can be modelled on classical computers.
Surely what he calls the "failure of quantum computing" is actually the technical difficulty of mustering enough qubits with current technology. This does not sound like a failure in principle.
My takeaway from the article:
Point 1. A classical system can behave just like a quantum system. What we think of as "quantum" systems may actually have classical underpinnings.
Point 2. The failure of "quantum computing" may therefore be that the underpinning is in fact a classical system and not "quantum" at all. In this case, adding more qubits does not help any more than adding conventional computing units.
"Surely what he calls the "failure of quantum computing" is actually the technical difficulty of mustering enough qubits with current technology. This does not sound like a failure in principle."
I'm inclined to agree - not that counts for anything, of course.
A linear array of nine qubits is a step forward from the three mentioned in the article. But it's still a long way short of what is required.
I have a little heresy for the author, as he's a crypto-dude: something not unlike Fermat's Little Theorem, coupled with a little mathematical jiggery-pokery, may just reveal something interesting about primes and pseudoprimes and provide a clue about how to reduce the complexity of factoring some mahoosive numbers (but I would not dare repeat it out loud, as I get laughed at enough elsewhere as it is - although it was inspired by another dude (now deceased) who, as it happens, was rather intelligent).
Anyway, I'll get my coat and head back into my padded cell. My meds are calling.
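For what it's worth, the Fermat test hinted at above is nearly a one-liner, and pseudoprimes show exactly where the naive version leaks. A toy sketch (illustrative only, and emphatically not a factoring method):

```python
# Fermat's Little Theorem: if p is prime and gcd(a, p) = 1, then
# a^(p-1) ≡ 1 (mod p). The converse fails: composite n that pass
# the test for a given base a are called Fermat pseudoprimes.

def fermat_passes(n, a=2):
    """True if n passes the base-a Fermat test.

    Passing is necessary but not sufficient for primality.
    """
    return n > 1 and pow(a, n - 1, n) == 1  # fast modular exponentiation

print(fermat_passes(97))    # True:  97 is prime
print(fermat_passes(91))    # False: 91 = 7 * 13 fails base 2
print(fermat_passes(341))   # True:  yet 341 = 11 * 31 -- the smallest
                            # base-2 Fermat pseudoprime
```

The gap between "passes the test" and "is prime" is precisely where the interesting structure of pseudoprimes lives.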
While the topic itself is both fascinating and way above my knowledge, I find it a bit surprising that the author of a paper is writing an article describing his discovery and defending his previous paper from critics...isn't that what peer review is about? I thought The Register was about "Independent news, views, opinions and reviews"
"We're not passing ourselves off as a peer-reviewed scientific journal here."
Perhaps more telling, however, is that there does not appear to be any mention of a peer reviewed journal at any point. You can post pretty much anything you like on arXiv. When a person who has previously published research properly in journals starts posting things outside their field only on arXiv, it certainly raises an eyebrow or two. Also of note is that Robert Brady has a total of 4 articles listed on arXiv, all addressing this topic, all co-authored with only Ross Anderson, and none actually published in any journal.
As for the actual subject of the paper, I don't see any support for the claims made. It's essentially an argument by analogy - we can make equations that look similar to those governing quantum mechanics and observe some effects that look a bit similar to quantum mechanics, therefore it must be exactly the same as quantum mechanics. This would be like taking the equations governing relativity, rearranging them to have a similar form to those of Newtonian mechanics, and then saying that because they look similar relativity can actually be derived directly from Newtonian mechanics.
In fact, that sort of thing is behind a lot of confusion in laypeople about relativity. Concepts like relativistic mass are almost entirely useless in actual physics but they became popular because they made for a nice analogy with Newtonian mechanics, for example allowing you to write p=mv in both. The problem is that because in relativity that m is a function of v, the equation actually ends up meaning something completely different and behaving in a different way. The analogy makes it look nice and familiar, but obscures efforts at actually understanding. As far as I can tell this is exactly what Anderson is doing here. He can make equations that look similar and have similar results in some conditions, but that does not mean they are actually the same as quantum mechanics.
By all means suggest a suitable peer-reviewed journal that would take such a paper! The difficulties of getting anything that challenges the established view through peer review are very well known, especially in an area like quantum theory, where the only people who are going to be asked to review are those who have made a name by adding to the established view.
The Bogdanovs didn't seem to have much trouble. Ditto any of the many similar incidents involving other ostensibly blind-reviewed hard-science journals.
I know - not really fair. The reality is that some unorthodox work that is likely worthy of further consideration by the field meets with resistance, and some unorthodox work that is bogus (often patently so) slips right on through. I don't think anyone's determined what the various ratios are. Peer review and other acquisition-related editorial functions are important gatekeepers, but they are not particularly reliable ones.
Unfortunately, I've yet to see anything better.
And then on top of that we have publication-fee journals, journals with subscription fees so high that they have very poor circulation, open-access journals operating with insufficient resources...
We're not passing ourselves off as a peer-reviewed scientific journal here.
So what, look at all the verbiage I've written as a consequence! As I've endeavoured to point out, the interpretation of QM is extremely esoteric and still open to interpretation. Any vigorous debate is only to be welcomed.
There's a plentiful supply of test numbers. Write up a qualification test program and 'Yes/No' demo the damn things already.
Or is the whole industry like Schrödinger's feline friend™?
..."We could tell you if they work or not, but then we'd have to kill you."
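That 'Yes/No' demo really would be cheap to run, because checking a claimed factorisation is trivial even when finding one is hard. A minimal sketch in Python (the semiprime below is just an illustrative toy value, not an actual RSA challenge number):

```python
# Verifying a claimed quantum factorisation is trivial classically:
# multiply the claimed factors back together and compare.

def verify_factoring(n: int, factors: list[int]) -> bool:
    """Accept a claimed factorisation of n into non-trivial factors."""
    if any(f <= 1 or f >= n for f in factors):
        return False
    product = 1
    for f in factors:
        product *= f
    return product == n

# Illustrative toy semiprime: 10403 = 101 * 103.
assert verify_factoring(10403, [101, 103])
assert not verify_factoring(10403, [101, 104])
```

Hand the machine a list of published test semiprimes, run its outputs through a checker like this, and the vaporware question answers itself.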
Implied by, but not explicitly stated in, the article is the caveat that Bell's theorem actually assumes the usual rules of special relativity, so only local actions are allowed. I'm therefore not at all surprised to hear the claim that you can reproduce quantum weirdness if you allow non-local interactions.
The curious thing, for the social scientists to look at perhaps, is that when physicists are presented with the experimental facts, they are more willing to accept the fundamental unknowableness of mainstream QM rather than the idea that action might happen at a distance. They are happier to postulate a universe that has no underlying reality rather than one that is merely spooky.
Agreed. Every time I try to understand the reasoning behind the Bell theorems and inequalities, I can't help but think "yes, but surely" and then struggle. I realise that's the default reaction and that much better brains than mine have been trying to find the detail of the "yes but" since they were first proposed. To me there's a big non-sequitur in the logic.
Really interesting article, more like this please. And my vote is that we are in a virtual universe... but so what.
And my vote is that we are in a virtual universe... but so what.
Now don't be silly, that would have HUGE implications. If true, there might exist cheat codes that once found, would let you instantiate a fresh pint as easily as monsters can be spawned in DOOM...!
Midnight at the Well of Souls
Or for a more-recent, and more explicitly "we live in a virtual world", treatment of the same topic, there's Scott Meyer's Off to be the Wizard (which now has a sequel I haven't gotten to).
It's a first novel, and at times it shows (primarily in a tendency to try to lay too much groundwork and explain too many details), and Meyer often doesn't relax enough to let his normally deft hand with dialogue show. But it's pretty clever and readable, if you can get into the basic nerd-power-fantasy plot line, and has some genuinely fun moments, particularly once the plot ramps up in the second half.
the idea that action might happen at a distance
Because "admitting" that will lead you to the interesting idea that action can happen backwards in time and fuck you up six ways to Sunday (just change the reference frame), so you better let it drop. And then you do your experiments and find that your system under observation clearly cannot have any classical state before you squirt the classical bits out of it (see this) and you have to move the paper-writing hidden variabilists, oil droplets and all, to the "fun but not really relevant" category. Like with the winners of the special olympics you politely applaud but you don't tell them they did something amazing or shed new light on stuff. Bohm was there, he tried and came up with a Rube Goldberg device which is just a conceptual clusterfuck, only to be in accord with experimental results. The Occam Hair Transplantation procedure, as it were.
Because "admitting" that will lead you to the interesting idea that action can happen backwards in time and fuck you up six ways to sunday (just change the reference frame), so you better let it drop.
Yes. It's a choice between "yes, that's very weird", and "sure, OK, screw causality". Once you throw out causality as an axiom, all your physics - and everything else - is rather suspect. You're now in the realm of the supernatural.
Just to clarify another comment. If you allow action at a distance then it is essentially easy to parlay this into action backwards in time. 'Essentially easy' means 'it would be reasonably straightforward to build a machine that did this, using nothing more than extremely well-tested aspects of special relativity'.
That's a time machine: a machine which can send information into its own past. Mainstream physicists tend not to like time machines for all sorts of reasons. But perhaps this is all just the usual mainstream science conspiracy?
Well, if you can build a time machine – even one which can send information only a very tiny distance into the past – you can do something interesting: you can bet on the movements of prices in the market, and win, every time. If you can build a time machine *you can win the stock market*: there is no real upper bound to the value of such a device, nor of how much people who make their money betting on financial markets would be willing to pay.
So if you think that your theory with 'long-range order' – aka action at a distance – is actually right, you should be selling it to the investment banks, not writing articles in The Register. I can only assume that the reason the people who support these theories don't do that is because they don't understand enough about physics to realise what the implications of their ideas are.
Needless to say, am I a big fan and supporter of the grand concept, and of the exemplar, Nobel prizewinning physicist Gerard 't Hooft who argues that we may be in a virtual universe, in which all the fundamental particles are virtual particles, like Conway's gliders, running on an underlying mesh of automata …… for whenever the truth of matters is spun differently to create a fiction to conceal the facts of honest and crooked realities, is life as hosted and conveyed by media and computer communications, nothing greater than just a virtual exercise in which intelligences entangle for the power and energy that has belief giving real strength to ab fab fabless imaginanation …….. What's it to be? Western Confection or Eastern Delight
And what is it to be? A star jewel which delivers light from the East or clouds bursting with enlightenment and phormed in the West?
Who let you out again you fucking loon! ..... Anonymous Coward
Hmmmm? Well? What can one truthfully say about that as useful as a chocolate teapot type contribution, other than, ..... "Spoken like a true Anonymous Coward, AC, and thanks for absolutely nothing of value, either real or virtual. You'll go far on the road to nowhere meaningful and worthwhile"
I have not ignored the fact and possibility though, that one may be ill and in need of specialised attention. It be cold comfort to know though, that one be not alone in those crowded fields of sad and/or bad and/or mad and rad existence.
And here is why it is not good to be joined at the hip in a special relationship as a parroting puppet supporter of Uncle Sam and FTSE spinning City economic miracles, for they are all basically worthless virtual reality constructs easily pricked and burst nowadays ...... http://www.zerohedge.com/news/2015-03-09/us-lurches-towards-default…-again
Have a nice day, y'all.
And, is that you, George[Osborne]/David[Cameron], lurking here as a Cowardly Anon? :-)
Stranger things than that can happen today, let me assure you.
I'd just like to be on the record as saying, that I thoroughly enjoy reading your contributions to whatever topic you pronounce upon. I try to find pattern in the verbiage and do find occasional lucidity. Please ignore the vulgar cretin ACs and don't stop posting.
I've written it before and will probably do so again till I get downvoted to the point of being banned, but there are parts of theoretical physics that look increasingly like a religion, only without the social work.
Up till now all the ideas that have expanded the universe have been the results of experiment or observation that discredited the previous ideas. Copernicus argued mathematically for heliocentricity, then Newton's gravitational theory showed the fundamental flaw in a geocentric universe. Better and better telescopes expanded the universe and gave it an age.
And then suddenly, because of problems in a class of theories which were not experimentally testable, we had the multiverse. String theory and the multiverse go against Bill Ockham because there is no a priori reason to believe that there is anything necessary about a theory that can't be experimentally tested. So it can be massaged to align to observation. That's exactly how Ptolemy's epicycles worked, and they were a dead end. The geocentric theory was essentially religious - man as the centre of everything. String Theory is getting rather like Hinduism or Jainism, with their infinite numbers of cycles of universe creation and destruction, and the world itself being an illusion, a virtual world in the mind of Brahma.
So when someone comes along with quantum loop gravity, or this author's work, my immediate reaction is that it needs serious attention because it wants fewer entities and is not religious.
The British government proposing to spend $200 million on quantum computing, presumably at the behest of GCHQ, is just the final nail in the coffin.
Metaphors, while being a useful way to help people conceptualise new ideas and mechanisms, have limitations.
For instance the whole "is light a wave or a particle" debate was only ever useful up to a point, because light is neither particle, nor wave, nor both. It is its own thing which may or may not have parallels in other physical phenomena.
So, while light may act similarly to waves on the surface of a liquid, there is no direct correlation.
Similarly, if someone finds a physical model which mimics many of the facets of quantum mechanics, that doesn't mean that it can be used to predict or prove how things work in the quantum world.
What it does do, however, is give people new avenues to explore in their heads, helping them dream up new ways of joining together old ideas.
(enjoyed the article, by the way. It was dumbed down enough for me to understand what was going on, while allowing for further reading. Maybe needed more pictures?)
QM theory was developed to describe phenomena that could not be explained by classical means, so the thing I take from this is that having a physical model displaying features previously only visible in the QM world removes (some of) the need for quantum-specific theories and moves us closer to a theory that joins both worlds. If, in the process, we happen to get rid of some of the 'dafter' aspects of modern physics (cough, strings, cough), then all the better.
I'm too lazy to pursue the classical handwaving into adequate equations, but I notice the FAIL at the last sentence:
And if reality is analogue all the way down, then quantum computers are just analogue computers, so their failure to deliver magical results is unsurprising. In fact, we'd rather see it as evidence that the emergent quantum mechanics research community may be on the right track.
Their "failure to deliver magical results is unsurprising", really?
First of all these results are not magical and of course they are analog. The "failure to deliver magical results" has to do with adequate production processes. No-one has yet said "that's odd" because the machine magically fails (which would be interesting). It isn't even big enough yet to exhibit such an interesting effect.
Not so long ago it was not at all clear that large digital machines could be constructed, because errors due to stray voltages and flaky vacuum tubes may well propagate and swamp the delicate computation of the state machine. Amazingly, it was all solved, and no-one except overclockers gives this problem much thought today.
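For what it's worth, the way the valve-era problem was solved is instructive: von Neumann showed that redundancy plus restoring logic, which in essence means majority voting, can build reliable digital machines from unreliable parts. A toy illustration of the idea, not a model of any real machine:

```python
import random

def noisy_copy(bit: int, flip_prob: float) -> int:
    """A flaky component: returns the bit, occasionally inverted."""
    return bit ^ (random.random() < flip_prob)

def majority_vote(bit: int, flip_prob: float, copies: int = 5) -> int:
    """Replicate the computation and take the majority - von Neumann style."""
    votes = sum(noisy_copy(bit, flip_prob) for _ in range(copies))
    return 1 if votes > copies // 2 else 0

random.seed(0)
trials = 10_000
raw_errors = sum(noisy_copy(1, 0.1) != 1 for _ in range(trials))
voted_errors = sum(majority_vote(1, 0.1) != 1 for _ in range(trials))
print(raw_errors / trials, voted_errors / trials)  # voting cuts the error rate sharply
```

Whether quantum error correction can pull off the analogous trick for qubits is, of course, exactly what's in dispute in this thread.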
Also, Scott Aaronson in Collaborative Refutation.
"Third thought: it’s worth noting that, if (for example) you found Michel Dyakonov’s arguments against QC (discussed on this blog a month ago) persuasive, then you shouldn’t find Anderson’s and Brady’s persuasive, and vice versa. Dyakonov agrees that scalable QC will never work, but he ridicules the idea that we’d need to modify quantum mechanics itself to explain why. Anderson and Brady, by contrast, are so eager to modify QM that they don’t mind contradicting a mountain of existing experiments. Indeed, the question occurs to me of whether there’s any pair of quantum computing skeptics whose arguments for why QC can’t work are compatible with one another’s. (Maybe Alicki and Dyakonov?)
But enough of this. The truth is that, at this point in my life, I find it infinitely more interesting to watch my two-week-old daughter Lily, as she discovers the wonderful world of shapes, colors, sounds, and smells, than to watch Anderson and Brady, as they fail to discover the wonderful world of many-particle quantum mechanics. So I’m issuing an appeal to the quantum computing and information community. Please, in the comments section of this post, explain what you thought of the Anderson-Brady paper. Don’t leave me alone to respond to this stuff; I don’t have the time or the energy. If you get quantum probability, then stand up and be measured!"
Beer to that, Scott.
The analog substrate which supports the computing mesh, or whatever we perceive as quantum mechanics, will be subject to some form of ultraviolet catastrophe, except we probably won't be able to conduct experiments on it, unlike renormalization in Yang-Mills QFT. So the theory may not be falsifiable.
Of course we can always resort to supernatural hand-waving to dismiss such concerns - infinite mind of God, turtles all the way down, etc.
You don’t get to replace the precise predictions of QM by slippery verbal reasons-why-you’re-not-yet-proven-wrong that change from one experiment to the next. Instead, you need to replace QM by an alternate mathematical theory that
(1) also describes anything that could possibly happen to a many-particle quantum system (not just one particular thing),
(2) agrees with all experiments that have already been done, but
(3) unlike QM, does not require an exponentially-large Hilbert space.
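Point (3) is easy to make concrete. A brute-force classical simulation stores one complex amplitude per basis state, so memory doubles with every qubit; back-of-the-envelope, assuming 16 bytes per amplitude (two 64-bit floats):

```python
# Memory needed to hold a full n-qubit state vector classically,
# at 16 bytes per complex amplitude.

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))

# 30 qubits already need ~16 GiB; 50 qubits need ~16 PiB.
```

Any "it's really analogue underneath" proposal has to explain how nature avoids paying that exponential bill.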
YOU HAD ONE JOB. "Quantum computers have failed. So now for the science" doesn't even manage to get the pants up.
Your comment about large digital machines is incorrect: first, noise wasn't the issue, and second, the reliability problem was addressed quite quickly, beginning I think with the EF50 in the UK. Noise became a problem with the first drum storage, but by then the machines worked. There was no argument about whether they were actually computing.
Quantum computing is proving a very intractable problem. It may be proven workable, but the boring old gate-based stuff seems to be winning the race at the moment.
So I suppose that instead of looking for "god in the machine" we should be looking for "god in the virtual machine"........I threw that phrase into google translate to see what the latin would be and got......
"deus in rectum apparatus"
is someone trying to tell us something about experimentation?
we're back to waves travelling in an invisible, undetectable aether? That doesn't sound so hot, but when the other mob have to resort to Dark Matter AND Dark Energy and still wind up with multiverses, the funny old aether doesn't seem so out of order.
Enjoyed the article a lot, but as Homer Simpson said...
There is a simple problem with the aether: it was disproven by the Michelson-Morley experiment, and indeed if it exists, bang goes relativistic physics.
Aether needed to be a vector field to work. That doesn't rule out a scalar field that permeates all space if there is no preferred centre or direction, but it wouldn't be the aether.
Dark matter and dark energy aren't a conceptual problem - the idea that there exists mass and energy that do not interact with the EM field is no odder than the proven existence of neutrinos, which almost don't. If you had a dark room whose floor was covered in red, green and blue balls mixed in with a lot of matt black balls, and were allowed to investigate by shining a narrow beam of light into the room, it might take some time before you discovered the black balls. Neither of them implies a multiverse.
Quantum computing always impressed me as vaporware. However, your throwaway line "...but it would just be an analogue computer, so you couldn't expect any magical speed-up." really has no moxie behind it. I think that analog should take off because we already have experience in it, both as man-made, and our own brains. We only started all that binary digital computing cock-up because it was so EASY.
Animals must decide discrete sequences of actions to perform in response to their immediate environmental situation. They cannot flee, fight and remain motionless at the same time. There must be positive feedback or attractor state mechanisms in the brain to cause switching. Well, switching is a digital term. That is not to say the brain is a digital CPU. Rather it is a digital recall system. Recalling required sequences of actions, remembering which worked out well and which did not and therefore updating itself.
I have reasons to believe the human brain is about 100 Gbytes. Instead of IQ I think it would be better to measure the number of Gbytes a person is. So a person with an IQ of 100 with a lot of experience in the world would have more Gbytes than a person with an IQ of 130 who had spent his or her whole life working the fields or selling tea on the side of the road in a developing country.
To quote the MIT quantum computing expert Scott Aaronson...
" The hydrodynamic models still can’t correctly reproduce the effects even of two-particle entanglement, Bell’s inequality still explains precisely why no such model will ever be able to do that, and the advocates of the models still can’t formulate a way around Bell’s inequality that makes the slightest bit of sense to those who understand it. Which means that there’s no need even to discuss quantum computing; the proposed classical description of our world fails way before we get there. Move along, folks."
Are the people who are voting down saying that the quantum computing MIT professor is wrong about quantum mechanics? If so, why?
My understanding is that Brady and Anderson have been trying to claim they have reinvented quantum mechanics for a few years now and all respectable physicists who have looked at their claims say it is nonsense.
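To put numbers on the Bell point quoted above: in the CHSH test, every local hidden-variable model satisfies |S| <= 2, while the quantum singlet-state correlation E(a,b) = -cos(a-b) reaches 2*sqrt(2) at the standard angle choices. The arithmetic takes a few lines to check:

```python
import math

def E(a: float, b: float) -> float:
    """Singlet-state correlation predicted by QM for analyser angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle choices for the two measurement settings per side.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, beyond the local-realist bound of 2
```

Experiments come out on the quantum side of that bound, which is precisely what any local droplet model has to reproduce and, per Bell, cannot.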
I've been fascinated by those silicone oil droplet experiments too. They make you look again at all the old experiments into the aether. All cool stuff. I too expect that there is something we are missing that will move us to a more classical interpretation.
The flow on effects into quantum computing....
You have to know what information you're missing, and that is the crux of the issue ;-)
As the Schrodinger equation is a wave equation, it is not too remarkable that some of its behaviors can be replicated in other things that involve waves. If, however, you ever try to do the Bell's inequality experiment with your droplets, I rather think that some differences will come out - indeed, I suspect that they will manifest themselves earlier, such as when you try to "observe" those droplets, thus forcing wave packets into eigenstates.
The wave-particle duality is key to much of the strangeness of quantum mechanics. Yes, you can simulate the wave part, but the particle part, that leads to "quantization", is not so easy to match with a classical model.
So if I read the article correctly, the existence of a quantum computer would serve as experimental refutation of this theory. These days it's good to find such a simple experimental test of a new theory in physics - and such a well funded one too!
Personally, I'd have to agree with Richard Feynman's original observation that, essentially, the universe has to be able to compute a lot faster than our classical computers do or it wouldn't be able to work fast enough itself. Quantum computing is just a way of harnessing the computing that goes on around us all the time - and it's staggeringly fast.
How could quantum computers connect qubits using our usual crafting abilities?
To be usable, this has to be built in some extended dimensions which we still cannot craft.
By the way ... did someone try to put half of a very small ball of nuclear fuel waste (especially U238) in a pretty good laser boiler (like the NIF for instance) while recording the tap tap made by a Geiger counter surrounding the other half ? (the ball would be made of a small vaporized/condensed part of a precise point of a used nuclear fuel stick)
Entangled U238 atoms would be interesting I guess.
hi RA think this is dead on but it could take many, many years for the mainstream physics community to accept some of these ideas. it represents a massive structural/Kuhnian paradigm shift. QM "interpretations" have been settled by the mainstream (or rather managed to be effectively/practically pushed aside) for nearly an entire century at this point by the so-called "shut up and calculate" contingent.
recently wrote up a lot of related info/ survey at this link
see esp the very recent paper on Pilot Wave Hydrodynamics by John Bush at MIT cited there. are you in contact with him?
also another really great recent book on this is called "how the hippies saved physics" by kaiser. it shows that bells thm was not even noticed/ accepted much by the mainstream establishment for many years & was given attn by a small band of iconoclasts/ contrarians/ minority etc.
encourage anyone interested to join stackexchange for lively chats on this & related subjects (eg physics or computer science sections etc).
UK government invests in Transputer. That one didn't work out either. It's unfortunate because there is a lot of real science pouring out of UK labs, to the point where the UK is only able to commercialize a fraction of it. Most other countries are not in that situation. You are throwing away your advantages. If you are talking about magic cats and the like then not enough is known about the basics of the subject to do real work with it.