Britain might take the lead in quantum computing
And it might not.
Oxford boffins have vowed to have the largest quantum computer ever made up and running within five years and help Blighty regain its place at the top of the tech world. The government has announced the creation of four "Quantum Technology Hubs" which will collaborate to build a small device called the Q20:20. Sadly, this won …
We'll invest, invent, and sell off to the highest bidder, who won't be British, because, let's face it, who in UK industry has the capability to market and manufacture game-changing computer technology?
Whatever happens, the technology will end up in Chinese hands for manufacture; it just remains to be seen who'll buy the IP. Intel? IBM? ARM won't be able to afford it.
As it's not x86, Intel will bury it for sure.
Perhaps in five years' time the many other attempts at quantum computers will show results.
The British experiment might get usurped by a cheaper alternative.
The Q20:20 may just quietly move to Cheltenham and fill a purpose the government would be hard pushed to justify to the public.
"We're investing millions in developing a quantum computer to watch you all" sounds like a step too far.
"We're investing millions in developing a quantum computer to place British ingenuity back on the plinth it should never have fallen off" sounds a lot better, even though the end result will perhaps be the same.
It seems to me that quantum machine learning could build a system like Gordon Way's Reason. I.e. you feed in all the relevant data, you train the quantum computer with the answer you want, and it generates the simplest way to get from the data to the answer...
There are at least three places in that funding proposal where magic is supposed to happen.
But please do go on... and insist more on how exactly the magic quantum sauce will be better than bog-standard classical physics at powering the computing process (which is just in the BQP complexity class, remember?).
It doesn't have to be better than classical physics at actually doing machine learning, just better than humans at finding the model you need to train in order to get from data -> answer.
I'll take some magic quantum sauce on my quantum chips please :) Hopefully, the quantumness will ensure exactly the right amount of sauce per chip. Or at least demonstrate that the amount of sauce on that particular chip is exactly the right amount, with workings.
Quantum computer emulators exist, but you don't get the efficiency.
It's a bit like having to emulate multiplication by repeated addition: a+a+a+...+a. You get the right answer, but it's **slow**.
Quantum computers use superposition, which is a little like white light being made up of different colours all doing their own thing, so the level of parallelism is going to be another slow thing to emulate.
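In code, the multiplication analogy looks something like this (a deliberately naive sketch):

```python
def slow_multiply(a: int, n: int) -> int:
    # Emulating multiplication with repeated addition: the right answer,
    # but n additions instead of one hardware multiply.
    total = 0
    for _ in range(n):
        total += a
    return total

assert slow_multiply(7, 6) == 42
```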
If you think about it, classical computers are basically just NAND gates glued together, it's not surprising that they fundamentally can't do everything...
Technically, they can be emulated but with an exponential slowdown.
A representation of n qubits, for example, demands that you store the complex probability amplitude of each of the 2^n possible classical states (this is the "state of the n-qubit quantum system"). There is a differential equation involving a 2^n x 2^n matrix that you have to integrate over time to find that state at time T. Finally, you have to make the solution classical (this is easily done, though: it consists of projecting the 2^n-element vector onto the various basis axes to get the classical probabilities, then selecting a classical solution according to those probabilities). Overall it's intense on CPU and memory, though.
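Here's that recipe in miniature, as a toy sketch (gate-based rather than integrating the differential equation, and all the names in it are mine): store all 2^n amplitudes, build a 2^n x 2^n operator, then project to classical probabilities and sample.

```python
import numpy as np

n = 3                                   # 3 qubits -> 2^3 = 8 amplitudes
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                          # start in the classical state |000>

# A Hadamard gate on every qubit: build the full 2^n x 2^n operator with
# Kronecker products (this is where the exponential cost bites).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
U = np.array([[1.0]], dtype=complex)
for _ in range(n):
    U = np.kron(U, H)
state = U @ state                       # uniform superposition of all 8 states

# "Making the solution classical": project onto the basis axes for the
# probabilities, then sample one classical outcome.
probs = np.abs(state) ** 2
outcome = np.random.choice(2**n, p=probs)
print(f"measured |{outcome:0{n}b}>")    # any of 000..111, each with probability 1/8
```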
For infinite-dimensional states (e.g. position of a particle in space where each position is one dimension) things are even nastier.
All in all, a good reason to go back to the good old principles of analog computation.
If you think about it, classical computers are basically just NAND gates glued together, it's not surprising that they fundamentally can't do everything...
Please read up on the Theory of Computation, especially the Church-Turing thesis, and on Complexity Theory, especially the classes P, NP, and BQP. The latter can be gainfully learned about here.
"The boffins at Bletchley Park were the first to design a programmable computer"
The first programmable computer was produced by the German pioneer Konrad Zuse. The functional, program-controlled, Turing-complete Z3 became operational in May 1941.
He was also ahead of the curve in the field of digital physics. See "Rechnender Raum" ("Calculating Space" in English).
Cryptography using maths will be easy to crack... perhaps.
but 12,34,162:18,45,25 will reveal all to you if you know which edition of which book we agreed on a while back. Or it could just be random noise in an MP3...
If your key can be cracked mathematically, you need to get out more!
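For anyone who hasn't met a book cipher, here's a toy version of that scheme (the sample text, the page/line/word reading of the triples, and all the numbers are my invention, not a decode of the digits above):

```python
# A toy book cipher: each "page,line,word" triple indexes into an agreed
# edition of a book, standing in for whatever edition the parties agreed on.
book = {  # (page, line) -> line of text
    (1, 1): "the quick brown fox jumps over the lazy dog",
    (1, 2): "pack my box with five dozen liquor jugs",
}

def decode(cipher: str) -> str:
    words = []
    for triple in cipher.split(":"):
        page, line, word = (int(x) for x in triple.split(","))
        words.append(book[(page, line)].split()[word - 1])  # words 1-indexed
    return " ".join(words)

print(decode("1,1,3:1,2,2"))  # -> "brown my"
```

No mathematics to crack; the security lives entirely in which book (and edition) you agreed on.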
Gah.
QC does not ruin cryptography. It doesn't even significantly threaten it. In some cases it offers better time complexity than the best-known classical algorithms, but even if some hypothetical QC machine clocks as fast as the fastest classical computer available, in the worst case you have to double your key lengths to achieve the same work factor.
Might as well worry about sorcery breaking encryption.
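To put numbers on "double your key lengths" (assuming Grover's search is the best generic quantum attack on a symmetric key, which is the standard assumption):

\[ O(2^n) \xrightarrow{\text{Grover}} O(2^{n/2}): \qquad 2^{128} \to 2^{64}, \qquad 2^{256} \to 2^{128}. \]

So doubling a 128-bit key to 256 bits restores the original 2^128 work factor.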
I came here to ask that very question, but not as a joke.
My understanding of quantum computers is that they're somewhat equivalent to massively multicore computers that can generate new cores as needed - although that idea comes from the book Factoring Humanity, which extrapolates the workings of a quantum computer based on the double slit experiment.
If that's right then a quantum graphics card might be better than a quantum CPU, since graphics cards are more transparently multicore to the software that's controlling them.
Here's a (probably over-simplified) outline of quantum computing.
Quantum indeterminacy or "weirdness" means that a particle or small system can exist in two (or more) states at once, until you observe it. That "collapses the wavefunction". Repeat the experiment for a two-state system, and it'll be like tossing a coin: heads half the time, tails the other half of the time. That's a quantum bit or qubit. Doesn't sound very useful, does it?
What a quantum computer does is to provide a quantum register (N qubits) and the means for performing mathematical operations on the contents of that register without observing it. This is equivalent to performing those operations on every possible N-bit number at once, from 0 to 2^N-1.
And then you observe it, and get just one out of the possible results.
However, consider a sequence of operations that will return one of a given number's prime factors. With 4 qubits, this has been done. Perform the sequence of operations that will return a prime factor of 15, then observe the quantum register, and you will get 5 or 3 with probability 1/2 each. All other possible 4-bit numbers have a probability of zero. They are not prime factors of 15 so they won't ever be observed.
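A toy numerical illustration of that last step (the array-of-amplitudes picture is mine, borrowed from how emulators work): put all the amplitude on 3 and 5, and sampling can only ever return a factor.

```python
import numpy as np

# Final state of the toy "factor 15" register: 16 basis states (4 qubits),
# amplitude split evenly between |3> and |5>, zero everywhere else.
amps = np.zeros(16, dtype=complex)
amps[3] = amps[5] = 1 / np.sqrt(2)

probs = np.abs(amps) ** 2                      # Born rule: |amplitude|^2
print(np.random.choice(16, size=8, p=probs))   # only ever prints 3s and 5s
```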
With 32 quantum bits, it's looking rather interesting(*). With 4096 quantum bits, PKI is dead. For a billion quantum bits or a mole (~6x10^23) of quantum bits, you are performing magic(**). I do not believe that quantum computing can work for large numbers of qubits. Prediction: the upper limit of how many quantum bits one can work on will tell us something very interesting about quantum mechanics, physics, and the structure of the universe. Of course, it's possible that the universe really is stranger than I can imagine and there is no upper limit....
Cryptography as we know it may or may not survive the experiment.
(*) AFAIK nobody has yet made even a 32-bit quantum computer. But if they had, would they be telling us?
(**) This is akin to finding a fast algorithm for solving NP-complete problems. (***)
(***) which, assuming you tell anyone else, is akin to signing your own death warrant. If you tell only a small number of people, some intelligence agency will take extreme measures to make it their secret. If you manage to spam it far enough and wide enough ... you probably just accelerate the rate at which a strongly Godlike AI bootstraps itself and takes over the universe. Which, given a universe containing billions of galaxies like ours, would almost certainly have happened already were it possible at all. So I predict that it isn't.
"With 4096 quantum bits, PKI is dead."
Shor's Algorithm threatens RSA, DH, and their EC variants, but there are other asymmetric ciphers. See Bernstein's "Introduction to post-quantum cryptography".
QC will not destroy asymmetric cryptography. It won't ruin PKI.
Even if someone has a sufficiently large QC to crack RSA and EC keys currently in use, how many are they actually going to get to? Even if QCs that large become common (a wildly implausible assumption), generating new RSA and EC key pairs is trivial, so defenders still have the advantage - only high-value traffic would ever get decrypted. And high-value traffic can move to any of the asymmetric ciphers that don't have known algorithms in BQP.
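On "generating new RSA and EC key pairs is trivial": with Python's cryptography package (my choice of library and key sizes), re-keying is two calls:

```python
# Re-keying is cheap for the defender: a fresh key pair in one call each.
# Assumes the 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import ec, rsa

rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
ec_key = ec.generate_private_key(ec.SECP256R1())
```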
"they're somewhat equivalent to massively multicore computers that can generate new cores as needed"
For sufficiently small values of "somewhat". I'd rate this metaphor somewhere between "not very useful" and "completely misleading".
DAM gave a decent explanation above, and there are others in the comments.
"If that's right then a quantum graphics card might be better than a quantum CPU, since graphics cards are more transparently multicore to the software that's controlling them."
That's not a useful distinction. Programming for QC isn't any more like programming for classical GPUs than it is like programming for classical CPUs - since GPUs are CPUs, marketing bullshit aside. Programming for QC is a matter of identifying and implementing algorithms that are in BQP and figuring out how to map your problem domain onto them.
I believe there are free QC emulators around, if you want to see what it's like.
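For example, with Qiskit and its Aer simulator (one free option among several), a Bell-state circuit looks like this:

```python
# A minimal Bell-state run on a free quantum-computer emulator.
# Assumes qiskit and qiskit-aer are installed (pip install qiskit qiskit-aer).
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with it
qc.measure([0, 1], [0, 1])

sim = AerSimulator()
counts = sim.run(transpile(qc, sim), shots=1000).result().get_counts()
print(counts)                # roughly half '00', half '11'
```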