Speed
Computational speed tends to open up areas in computing that were deemed too difficult to implement because of the available processor speeds. Though with most tasks, the limit is often not hardware but wetware.
Two years ago Google and NASA bought a D-Wave 2X quantum computing system and the Chocolate Factory has now pronounced itself very pleased with the results. "We found that for problem instances involving nearly 1000 binary variables, quantum annealing significantly outperforms its classical counterpart, simulated annealing. It …
"Computational speed tends to open up areas in computing that were deemed to difficult to implement because of the available processor speeds."
Indeed. But unfortunately some people seem to believe that quantum computers can magically solve problems that were intractable to von Neumann machines not only in time but in principle. Well, they can't, at least not in this form. They're just very, very fast parallel computers. Also, quantum computers can't help with any problem that is fundamentally serial in nature, so the standard CPU isn't going away anytime soon. I suspect, initially at least, quantum chips will be embedded alongside standard CPUs as co-processors, and then be incorporated inside them like GPUs were. I wouldn't want to be the person who has to code their libraries in whatever weird assembler they'll use, though.
The brass ring in this exercise is to have sufficient quantum computing grunt to attack extremely laborious mathematical tasks, such as the "traveling salesman" problem and factoring very large primes. The second one would render our current worldwide public-key encryption system worthless. I suppose there would be some benefits, though.
Regular silicon cannot handle these problems without an impractically large number of clock cycles. Quantum computing in theory would try vast numbers of possible solutions all at once, in a magical quantum sort of way. Sounds too good to be true, frankly.
The brass ring in this exercise is to have sufficient quantum computing grunt to attack extremely laborious mathematical tasks, such as the "traveling salesman" problem and factoring very large primes
TSP is NP-hard and probably not in BQP. Approximate TSP (ATSP) apparently is in BQP, but then there are all sorts of conventional approaches to ATSP, so there may not be any point at which a QC becomes economically superior to conventional machines for ATSP in practice.
Factoring very large primes is trivial. Factoring the products of a couple of very large primes is another story. The precise complexity class of factoring is unknown, but it is in BQP.
Regular silicon cannot handle these problems without an impractically large number of clock cycles.
Well, no. Factoring on conventional systems has gotten quite a lot easier over the past couple of decades, and no one knows how much easier it might yet get. That's why NIST keeps raising the length requirements for RSA keys.
Obviously you can pose a factoring problem large enough that no conventional computer can ever solve it in reasonable time (based on such fundamental criteria as the number of atoms in our Hubble volume), but then you can do that with all sorts of problems that are in P. Big problems with anything worse than constant-time solutions are big.
Quantum computing in theory would try vast numbers of possible solutions all at once, in a magical quantum sort of way.
Not really. The best-known algorithms in BQP - Shor's and Grover's - aren't that hard to understand, really, and while they make use of superposition it's not a superposition of "possible solutions" per se.
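If you want to see how un-magical it is, here's a toy state-vector simulation of Grover's search in Python (my own illustration, nothing to do with the DWave box; the qubit count and marked index are made up). Note the state is a superposition of indices whose amplitudes get nudged toward the answer, not a bag of candidate solutions being tested in parallel:

    import numpy as np

    n = 10                    # qubits
    N = 2 ** n                # search-space size
    marked = 123              # the index the oracle recognises

    state = np.full(N, 1 / np.sqrt(N))            # uniform superposition
    iterations = int(np.pi / 4 * np.sqrt(N))      # ~(pi/4)*sqrt(N) rounds

    for _ in range(iterations):
        state[marked] *= -1                       # oracle: phase-flip the marked item
        state = 2 * state.mean() - state          # diffusion: inversion about the mean

    print(f"P(marked) after {iterations} rounds: {state[marked] ** 2:.4f}")

Run it and P(marked) comes out at ~0.999 after only 25 rounds over 1024 items - a quadratic speedup, not an exponential free lunch.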
Also, the DWave machine is doing quantum annealing, which isn't the same thing as straight-up quantum computation. It doesn't implement BQP algorithms directly. Instead it makes use of quantum tunneling - as in a tunneling transistor or a scanning tunneling microscope - to take "shortcuts" in escaping from local minima while trying to find the global minimum of an objective function. The Wikipedia page has a useful diagram, for those who don't want to puzzle through the full explanation.
Personally, I remain unconvinced about DWave's claims, despite this latest bit of cheerleading from Google. Other teams have published results showing they were able to outdo DWave equipment on problems that DWave should be particularly suited for.
'In other news, Google's new sausage making machine proven to be able to make sausages 10 billion times faster than a current state of the art espresso machine.'
Given the pressure at which an espresso machine operates, I suspect it would have a real chance of outpacing a traditional sausage maker. So long as something akin to 'meat silly string' meets the definition of sausage.
>Given the pressure at which an espresso machine operates, I suspect it would have a real chance of outpacing a traditional sausage maker. So long as something akin to 'meat silly string' meets the definition of sausage.
I can't be alone in thinking that the Reg Spesh Projects Burro should be taking this as a call to arms!
It's a hundred million times faster at calculating the inputs required for a complex quadratic equation (one of those things you did in school, y = 2x² + 7 but more complicated) to give the smallest possible output.
It does strike me as an odd benchmark to use, almost like they're cherry-picking.
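For what it's worth, the benchmark class is (as I understand it) a QUBO: minimise x^T Q x over binary vectors x. Here's a brute-force toy in Python with a made-up Q matrix; at 4 variables it's instant, but at the ~1000 binary variables Google mentions, the 2^n candidates are hopeless, which is where annealing comes in:

    import itertools
    import numpy as np

    # made-up 4-variable QUBO; real instances have ~1000 variables
    Q = np.array([[-1.0,  2.0,  0.0,  0.0],
                  [ 2.0, -1.0,  2.0,  0.0],
                  [ 0.0,  2.0, -1.0,  2.0],
                  [ 0.0,  0.0,  2.0, -1.0]])

    best = min((np.array(bits) for bits in itertools.product([0, 1], repeat=4)),
               key=lambda x: x @ Q @ x)           # energy of each assignment
    print("best assignment:", best, "energy:", best @ Q @ best)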
It's only fast for a very limited class of problem. We still are not sure what is in the box. It's not an actual quantum computer, but it is clearly different from a conventional CPU or GPU, and possibly from a specialist processor implemented on an FPGA.
Hence Google calling it "Quantum Annealing"
It's only ever been claimed to be a quantum annealing device. The debate has been whether it's actually performing the quantum part or just performing "unsimulated" annealing on an analogue computer. Both should be faster than simulated annealing; maybe this confirms the quantum part, which I believe refers only to quantum tunneling effects.
It still doesn't answer the question: is it faster than the best classical algorithm? Because the test they picked was a guaranteed win if it works at all.
@ WillbeIT
> sorry, not a geek
Don't worry! Quantum physics in general has confused and upset our greatest physicists - though real quantum effects have been used in what is now everyday technology. The jury has been out on this D-Wave machine for a little while, because of the difficulty in proving, or devising tests to prove, that it is faster than the 'classical computers' we use everyday. What D-Wave have never claimed is that their machine can perform Shor's Algorithm.... And this is important. Let's pause here a moment.
Quantum systems can exist such that all their particles are both ON and OFF at the same time (I'm grossly oversimplifying here), so if it has enough 'quantum bits' (qubits) it could calculate every possible answer to a question at the same time, instead of trying one answer at a time like a normal computer would. The implications for breaking encryption are huge.
Our encryption is based on the difficulty of factorising very big numbers, but an algorithm for doing so quickly using (theoretical) quantum computers has existed for years, and it is called Shor's Algorithm.
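To show the flavour, here's the classical number theory that wraps Shor's Algorithm, in Python - with the quantum period-finding replaced by brute force, so there's no speedup here, just the reduction from factoring to period finding (the choice of N = 15 and a = 2 is mine):

    from math import gcd

    def shor_classical(N, a=2):
        if gcd(a, N) != 1:
            return gcd(a, N), N // gcd(a, N)    # lucky: a already shares a factor
        r = 1
        while pow(a, r, N) != 1:                # period finding: the step the
            r += 1                              # quantum computer does fast
        if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
            return None                         # unlucky choice of a; pick another
        p = gcd(pow(a, r // 2) - 1, N)
        return p, N // p

    print(shor_classical(15))                   # -> (3, 5)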
What D-Wave claim, and what Google believe they have a use for, is that their machine can find optimal solutions to certain quadratic equations.
tl;dr A future quantum computer has the potential to massively upset our computer encryption, but not one based on this machine. There has been a lot of debate amongst academics as to what exactly this machine is doing, but Google - the customer - seem happy with it.
versus chucking a model of said surface out in the rain for a few minutes and simply seeing where the water pools...?
You really want to find the global minimum, and your method finds all the local minima as well.
The whole point of quantum annealing is that it uses tunneling to exit local minima faster / easier (i.e. with less perturbation).
Of course, as someone else pointed out above, some critics think the DWave machine is just doing straight-up non-quantum analogue annealing, which means it's doing the electrical equivalent of what you described, more or less.
My two outdated currency tokens:
The qubits in question are basically hyperactive random number generators in this 'quantum' computer. They're way faster than any conventional RNG, but still need serial output to be useful. It's not so much 'both 1 and 0' as it is 'everything between 1 and 0 inclusive at once' until measured.
It's the measurements that no one really understands, but they do produce essentially instantaneous and quantifiable random values that are always either a 1 or 0. (The probability of getting 1 or 0 depends on the state it's in first; the closer it is to 1 or 0, the higher the chance of getting that value.)
Instead of using conventional RNG circuits that need a seed and computations, this makes it much faster to generate random numbers and feed them into a normal computer; generating 01110001 01110101 01100010 01101001 01110100, for example, needs 8 qubits per byte, and the more you have, the more operations you can run at once and with larger numbers.
You still need normal hardware to process it, but (D-Wave's computer) is pretty much a black box that spits out unpredictable values much faster as far as anyone is concerned. That's where the speed comes from.
At least, I'm pretty sure that's what's happening.
...Probably.
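The measurement-statistics bit, at least, is easy to mimic classically. A toy Python sketch (made-up amplitudes, no actual quantum anything): a qubit a|0> + b|1> reads out as 1 with probability |b|², so the closer the state is to |1>, the likelier you get a 1, as above:

    import random

    def measure(amp0, amp1):
        # amplitudes must satisfy amp0**2 + amp1**2 == 1
        return 1 if random.random() < amp1 ** 2 else 0

    # state 0.6|0> + 0.8|1>: expect roughly 64% ones
    samples = [measure(0.6, 0.8) for _ in range(10_000)]
    print("fraction of 1s:", sum(samples) / len(samples))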
I do have limited knowledge in this area, but am struggling to understand how it would not help to defeat encryption in a limited way by massively increasing the speed of any type of brute force attack?
The DWave machine doesn't do that. You can't¹ create an objective function whose gradient points at the correct key for a decryption problem, because the landscape is flat everywhere except at a single point. That is, all incorrect keys are equivalent, so there's no way to move toward the single minimum.
The DWave machine is not a quantum computer. At best, it's a quantum annealing system. You can't implement Grover's algorithm on it, and Grover's algorithm is the best possible QC algorithm for finding a (symmetric) encryption key.
In any case, Grover's only gives you a square-root improvement on the work factor, so doubling the size of your key puts you right back where you were before your adversary fired up his quantum computer.
¹ I await being proven wrong.
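The key-doubling arithmetic from above, spelled out (illustrative figures only):

    # Grover needs ~sqrt(N) oracle calls instead of ~N trials, i.e. it
    # halves the effective key length - so doubling the key undoes it.
    for bits in (128, 256):
        print(f"{bits}-bit key: classical ~2^{bits} trials, "
              f"Grover ~2^{bits // 2} oracle calls")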
What's the trade off?
Probably that you have to phrase your problem in a way that's suitable for simulated annealing or Markov chain Monte Carlo. So it's for difficult optimization problems (do they need to be smooth?).
A bit between chemistry (thermodynamics) and mathematics :-)
Now if you can rewrite the training of a neocortical column as a simulated annealing problem.. how many rat brains is that, on current hardware?
If this works as planned, does this mean that gadgets soon built with this kit will run 100 MIIIILLLLIIIOOONNN times faster?
No. The numbers are being fudged and inflated. The real figure is probably more like 50 million times faster under ideal circumstances. And even at that we're decades, at least, away from consumers having access to quantum processors. This is the equivalent of a Cray in the 80s.
wtf? What's the trade off?
I'm not up on current developments, so here's your grain of salt, but I believe quantum computing still requires superconductors. Last time I looked into it, they still hadn't figured out how to make superconductors that operated above 90 K (which is really, really cold). That being the case, both the quantum processors themselves and their operating costs are going to be astronomical.
"...tests had shown that existing hardware will work for certain tasks much faster than anything else on the market. How broad a range of tasks is something that all the cryptanalysts at all the world's intelligence agencies will be spending a lot of time finding out."
Well, yeah. Of course.
"...How broad a range of tasks is something that all the cryptanalysts at all the world's intelligence agencies will be spending a lot of time finding out."
No, they won't. Quantum-computing cryptanalysis is pretty well understood, and quantum annealing doesn't do much for cryptanalysis.
This remark is the usual bullshit about QC and cryptography.
Real, large-scale quantum computing - which DWave isn't even claiming to be - poses some challenges for cryptography, primarily in asymmetric encryption. It's not a big deal for symmetric encryption, and for the areas where it is troublesome, we have a whole raft of proposals for "post-quantum cryptography". This story is pretty much dead before it even became real.
Simulated annealing is an optimisation technique for computers, so named because it was inspired by exactly what you mentioned: the annealing of metal to soften it.
SA is hill climbing, but with the addition of probabilistic moves in directions that lead to worse results. This allows SA to escape local minima - in a sense it is "softening" the function being optimised, smoothing and blurring it to allow the global optimum to be found.
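A minimal SA sketch in Python (the rugged 1-D objective is my own toy, nothing from the paper). The one line accepting worse moves with probability e^(-delta/T) is the "softening":

    import math
    import random

    def f(x):
        return x * x + 10 * math.sin(3 * x)     # rugged: many local minima

    x, T = random.uniform(-10, 10), 10.0
    while T > 1e-3:
        candidate = x + random.gauss(0, 0.5)    # propose a nearby move
        delta = f(candidate) - f(x)
        # accept improvements always, worse moves with prob e^(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / T):
            x = candidate
        T *= 0.999                              # slow cooling schedule

    print(f"found x = {x:.3f}, f(x) = {f(x):.3f}")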
http://arxiv.org/abs/1504.07991
The tail of the time-to-solution distribution is considerably fatter with quantum annealers, for some reason, versus classical simulated annealing. It's a promising approach, worth being aware of but not yet worth shelling out for on its own merits.
I think the real test they should be running is comparing power consumption. The total power consumption of the D-Wave machine is far greater than that of a single processor chip. Additionally, the D-Wave isn't a general-purpose computer, so perhaps the best comparison would be against the best ASIC for solving the given set of problems. The fight would therefore be: D-Wave vs. a group of ASICs with the same total power consumption. If there's still a benefit to using D-Wave then they are on to something good. If this were Bitcoin, the current comparison would be desktop PC vs. ASIC miner.
I do wonder how much of a speed increase one might achieve by simply designing a completely new architecture based on modern design techniques. As far as I know, most computers we use these days are highly refined versions of the von Neumann architecture. Quantum computers are a different beast, so they have exactly that opportunity, leading me to wonder whether this is part of why they show some impressive jumps in pace.
If you can afford to buy and run one of these super-cooled monsters, I suspect you don't need to worry about mining fake plastic tokens ...
> If so, what would be the currency implications of a breakthrough where someone would suddenly out speed everyone else for a while?
Everyone would realize that the entire cryptocurrency scam ^H approach is a Ponzi scheme designed to make the guys who got in early rich, and go back to using normal money?
"
Everyone would realize that the entire cryptocurrency scam ^H approach is a Ponzi scheme designed to make the guys who got in early rich, and go back to using normal money?
"
On the contrary, it is traditional currency (most of us no longer use money, only currency) that is akin to a Ponzi scheme, with the government controlling the scheme and reaping the rewards. Bitcoin, OTOH, if it were to be widely adopted, fulfils all the criteria needed to be real money (inherently limited, durable, portable, divisible). The downfall of currencies is that they are not inherently limited - governments can create as much of it as they want very quickly and with no effort, and all have done so to some degree or another. Bitcoins, like gold bars, can only be created at a slow rate and with great effort.
Every time the total amount of currency in existence is increased, it transfers a proportion of the value of the currency in your bank account into the pockets of the government - which is the ultimate stealth tax because you do not see any loss in the amount of currency you have, only a loss in the value of the goods it is able to buy (which amounts to the same thing but most people don't see it for what it is).
> And this isn't true for a bitcoin-like mining scheme because?
It is a good idea to become at least minimally familiar with a subject before passing an opinion. As the critic that you are, I would have expected you to know the answer to that question.
The total amount of Bitcoins that can be generated is limited by design. You could of course change the algorithm, but then it would no longer be Bitcoin but some other thing. There are indeed a number of "digital currencies" about, I believe.
For the record, I have no opinion or interest in so-called digital currencies. For that matter, I have no opinion and very little interest in so-called traditional currencies.
Dumb ass question but are these types of quantum computational algorithms applicable to Bitcoin-style computational mining in theory?
Quantum annealing isn't, as far as I can tell, so DWave (or D-Wave or whatever they call themselves) are out. And I don't see any obvious application of any of the algorithms I know of that are in BQP to cryptocurrency mining.
In any case, as other people suggested, the economics don't work. The DWave machine is expensive enough; as far as we know, no one's built a real QC of any size, for any price.
Yes, I read the paper, and it is interesting, but not that exciting. If it were quantum, I would be the first to congratulate them.
However the 10^8 claim is absolute rubbish. We have machines with at least 10^6 computing elements, so why not compare their megafridge against THE BEST traditional machine? Not some weedy PC!
This is the reason why the TOP500 persists. It is too easy to fudge a benchmark that your code does well on and that nobody has heard of.
I am not impressed (yet) and I even sat through their sales pitch....!
Is it possible Google and NASA are just trying to justify buying a very expensive fridge....?
P.
This.
I would also like to know whether it is actually using Quantum Effects in there or whether it is, in the end, just a special-purpose ANALOG COMPUTER (i.e. a physical system that models the problem you want to solve). Because I remember papers that were disputing the presence of Quantum Effects in a D-Wave in particular.
The abstract at least nicely elides the question.
Analog computers being millions of times faster than symbolic processing machines at a given specialized task is not a particularly surprising finding.
Classical simulated annealing is not a particularly good optimization technique, as was acknowledged in the paper:
"It is often quipped that Simulated Annealing is only for the 'ignorant or desperate'."
There are much better classical optimization techniques which would compare much more favourably, but they used it because it was the closest analogue to quantum annealing.
Simulated annealing also doesn't do well with potential energy surfaces which contain deep, narrow wells where it can get trapped. Quantum annealing can tunnel through them very efficiently, and so they chose a problem which suited it. From the paper:
"carefully crafted proof-of-principle problems with rugged energy landscapes that are dominated by large and tall barriers"
A good way of thinking about this machine is trying to open a regular pin padlock.
The traditional approach is to sequentially try a key with every different depth of cut in every position.
The quantum computing approach is a "bump key", where you simultaneously sweep across all cut depths on all positions.
The sequential approach can take a very long time. It's trivially parallelized, although you are just trading hardware for time, one for one.
The quantum computing approach is almost instantaneous, but you are never certain whether you have found a correct solution. You always have to test the result to see if it works, and even then repeat a few times to see if the result is the same. If you consistently come up with a bad result, you need to reformulate the problem to specifically exclude that result and try again -- much like a lock pin always getting stuck on a nick.
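In code, that test-and-retry loop looks something like this (quantum_solve and check are hypothetical stand-ins for the annealer read-out and the classical verification):

    import random

    def quantum_solve():
        # stand-in for a probabilistic solver: right answer ~80% of the time
        return 42 if random.random() < 0.8 else random.randrange(100)

    def check(candidate):
        return candidate == 42                  # cheap classical verification

    # keep sampling until a candidate verifies; a handful of tries suffices
    for attempt in range(1, 11):
        candidate = quantum_solve()
        if check(candidate):
            print(f"verified solution {candidate} on attempt {attempt}")
            break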