
I for one
Have no idea what this means.
IBM has opened a quantum computing centre in Poughkeepsie, New York, which adds 10 quantum systems to Big Blue's fleet. The company claims its customers have run 14 million experiments using its quantum cloud computers and published 200 research papers since 2016. "The fleet is now composed of five 20-qubit systems, one 14- …
I remember someone from IBM at a big tech conference ~20 years ago talking about how they'd got something like a 4-qubit quantum computer working and were planning to scale it up over time, adding that once it got to a certain size all current encryption techniques would be out of the window. A conspiracy theorist might say that the TLAs have already done this scaling and are paying IBM to move forward very slowly to give the impression that it's very difficult or perhaps impossible.
Not *all* current encryption techniques. Symmetric cryptography (where both sides share a single key) only has to double the key length to achieve the same security. So AES-128 => AES-256, and everything is good.
Asymmetric crypto (where there is a public key which you tell the world, and a corresponding private key which you keep very very secret) is a bit sadder. All the current algorithms are indeed (technical phrase coming here) completely buggered by a sufficiently large quantum computer.
Also depends how long it takes to set up the quantum computer. Is it quick to set up a multi-qubit computer for cracking a 1024-bit key? And to run it a few million or billion times to confirm your probabilities and that there is no noise?
But yes, a perfect and quick-to-set-up quantum computer can crack quite a few types of encryption quickly. IIRC, though, there are some types that take a similar time to unravel as on standard Turing machines. So changing algorithm is possible (but currently not worth the overhead, as there is no risk or need).
Unless you've taken a copy of every email, transmission or communication over many years and can then decrypt that content at your leisure.
Now that the majority of encryption, by volume (i.e. HTTPS to the most prolific websites), is using PFS, you'd better have a lot of leisure time. Even with Shor's algorithm, breaking all those session keys would not be fast. Poly time is bad for cryptography, but it's naive to think that a large GQC (general quantum computer) will decrypt an enormous volume of stored traffic by magic.
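Rough numbers, purely as an illustration (every figure below is an assumption, not a measurement): even granting an hour of quantum-computer time per session key and a sizeable fleet of machines, a stored archive of PFS sessions takes a very long time to unpick.

```python
# Back-of-envelope: decrypting archived PFS traffic means re-running Shor's
# algorithm once per session. Every figure here is an assumption for illustration.
hours_per_shor_run = 1          # assumed GQC time to break one ephemeral session key
captured_sessions = 10**9       # assumed archive: a billion recorded TLS sessions
quantum_computers = 100         # assumed fleet of large GQCs working in parallel

total_hours = captured_sessions * hours_per_shor_run / quantum_computers
print(f"{total_hours / 24 / 365:.0f} years of continuous runtime")  # ~1,142 years
```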
a perfect and quick to setup quantum computer can crack quite a few types of encryption quickly
Not really - at least not the "quite a few types" bit. Basically it's RSA, DH/DHE, and their elliptic-curve counterparts; plus anything with a key that's too short. Those are commonly-used algorithms (well, maybe not the last, these days), but it's a short list.
IIRC there are though some types that take a similar time to unravel as on standard turing machines.
Entire families of algorithms, in fact. Lattice, multivariate, code-based, and supersingular-isogeny are the main ones.
So changing algorithm is possible (but currently not worth the overhead as there is no risk or need).
Actually, it's already underway. NIST and other standards bodies are running competitions for viable "post-quantum" algorithms. ("Viable" means they offer reasonable performance and key sizes, and have been sufficiently analyzed without their security claims being unduly undermined.) Google and others have created test TLS suites using some PQ algorithms and done some testing with them in the field. In fact, it looks like Google and Cloudflare are conducting experiments with PQ suites in TLS (specifically CECPQ2 and CECPQ2b Kx) now.
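The hybrid construction those experiments use is conceptually simple: run a classical key agreement (e.g. X25519) and a post-quantum one side by side, then feed both shared secrets into the key derivation, so the session stays safe unless both are broken. A minimal sketch of just that combining step (the actual exchanges are stubbed out with random placeholders):

```python
# Conceptual sketch of a hybrid classical + post-quantum key agreement, in the
# spirit of the CECPQ2 TLS experiments. The key exchanges themselves are stubbed
# out; only the combining step is shown.
import hashlib
import os

classical_shared_secret = os.urandom(32)  # stand-in for an X25519 shared secret
pq_shared_secret = os.urandom(32)         # stand-in for a post-quantum (e.g. NTRU/HRSS) shared secret

# Both secrets feed the key schedule, so an attacker must break BOTH the classical
# and the post-quantum problem to recover the session key.
session_key = hashlib.sha256(classical_shared_secret + pq_shared_secret).digest()
print(session_key.hex())
```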
We will almost certainly have PQ crypto in widespread use (courtesy of public HTTPS with major volume players like Google participating) long before there are any large general quantum computers known to the public. My belief is that it will happen long before there are any large GQCs, public or secret, in existence. I won't be at all surprised if there are no large GQCs in my lifetime.
All the current algorithms are indeed (technical phrase coming here) completely buggered by a sufficiently large quantum computer.
Not true. All the asymmetric-cryptography algorithms (both for signing and for key agreement) in widespread use rely on problems in BQP - specifically, they can be solved in polynomial time using Shor's algorithm with a sufficiently large general quantum computer. But quantum-resistant asymmetric-crypto algorithms date back to the 1970s (McEliece, Merkle, etc).
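For anyone wondering what Shor's algorithm actually attacks: factoring N reduces to finding the multiplicative order of a base modulo N, and it's only that order-finding step the quantum computer speeds up. A toy classical sketch of the reduction (hopelessly slow for real key sizes, which is rather the point):

```python
# Toy illustration of the reduction Shor's algorithm exploits: factoring N reduces
# to finding the order r of a base a modulo N. The order-finding loop below is the
# only part a quantum computer accelerates; the rest is cheap classical arithmetic.
from math import gcd

def factor_via_order_finding(N, a=2):
    # Find the order r: the smallest r > 0 with a**r == 1 (mod N). Classically this
    # takes exponential time in the size of N; Shor does it in polynomial time.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2 == 0:
        y = pow(a, r // 2, N)
        p = gcd(y - 1, N)
        if 1 < p < N:
            return p, N // p
    return None  # unlucky choice of base; retry with a different a

print(factor_via_order_finding(15))          # (3, 5)
print(factor_via_order_finding(3233, a=3))   # (61, 53): 3233 = 61 * 53
```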
In their original forms, the quantum-resistant algorithm families all imposed performance penalties and/or key sizes that were too high compared with RSA, discrete-log DH, and ECC DH. But this has been a very active area of research, and of course computing resources are somewhat more generous now than they were forty years ago. Google did a trial of some "post-quantum crypto" in real-world use a while back; the algorithms are good enough now to be used by sites that don't need the absolute best performance.
For symmetric cryptography, Grover's algorithm offers a quadratic speedup in brute-forcing - it effectively halves the exponent - so doubling the key length negates the advantage, even if operations on your QC were as fast as they are on your conventional computer. Which they wouldn't be.
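To put numbers on that (a quick sanity check, nothing more):

```python
# Grover's algorithm reduces brute-force search of an n-bit key from ~2**n trial
# decryptions to ~2**(n/2): it halves the exponent, a quadratic speedup.
for n in (128, 256):
    print(f"AES-{n}: classical ~2^{n} operations, with Grover ~2^{n // 2}")
# AES-128 drops to ~2^64 quantum operations, which is uncomfortably low; AES-256
# still needs ~2^128, so doubling the key length restores the original margin.
```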
And, frankly, it's not just a matter of "scaling up"; there are good reasons to doubt that large general QCs are even feasible. If they are, they'd be so wildly expensive and resource-hungry that even state actors wouldn't be able to use them for routine key-breaking.
It doth seem to me that quantum computing is effectively gender-fluid computing, 0's are 1's and 1's are 0's simultaneously. So as above so is below. The harbinger of an AI that will consume humanity.
Such is one path humanity may take here at the threshold to wit is 2019. Democracy, as Mr. Madison warned only matters when it is the policy the elite through Manufactured Consent desire, so as we see with Brexit, and Trump. However, it is just a theory, thus is the burden of the apple, the knowledge of good and evil.
However, perhaps there remain a remnant, those who favor the dignity of the λόγος over the collective, i.e. the "Hive Mind" of the vaunted 'singularity.' Those adherents who are true winter soldiers in the Glorious Cause of Liberty too present a path laid out before us as a choice.
The time to choose draws near.
But I thought useful quantum computers were still some ways off on the horizon, and were currently squarely in the realm of Mad Scientists. No? Or are these IBM machines still considered "experimental" and not really available to the general public? Or is it marketing hype? We had this story back in December https://www.theregister.co.uk/2018/12/06/quantum_computing_slow/ where Uncle Sam was saying quantum was still years away. I admit, I am a tad confused.
Exactly. These machines are "available to the public", for some definition of "public", and have "practical" applications if your practice happens to be "let's see what we can do with toy problems using GQC, so we have a better idea of what we might be able to do with a large GQC". (Though, to be fair, even at 50 qubits you ought to be able to formulate problems that are intractable on classical computers; 2^50 is a respectably large number.)
But you're not going to use these machines to speed up your database searches with Grover's algorithm, or to crack 2048-bit RSA keys.
The big problem with quantum computing is dealing with noise and loss of coherence. The solution to this involves (very) advanced boffinry that I almost completely fail to understand, but *basically* you take thousands of physical qubits and combine them into one logical qubit, which is much more resistant to these problems.
Does anyone know if these counts (for example: 53 qubits) are of logical qubits or physical ones? If it's physical ones, RSA-64 is looking a bit dodgy ...
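For a sense of scale, a heavily hedged back-of-envelope (both figures below are assumptions for illustration; published estimates vary by orders of magnitude depending on error rates and codes):

```python
# Rough scale of the logical-vs-physical gap. Both figures are illustrative
# assumptions; real estimates depend heavily on gate error rates and the
# error-correcting code used.
physical_per_logical = 1000   # assumed error-correction overhead per logical qubit
logical_for_rsa2048 = 4000    # rough ballpark of logical qubits for Shor on RSA-2048

print(f"~{physical_per_logical * logical_for_rsa2048:,} physical qubits needed")  # ~4,000,000
print(f"53 physical qubits ~ {53 / physical_per_logical:.3f} of one logical qubit")
```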
IBM claims a quantum volume of 16 for one of their "20-qubit" machines. That's probably a more meaningful measure of GQC processing power than qubit counts. The advantage of QV is that it's a concrete problem that's sensitive to the sorts of problems that commonly crop up in QCs, so it measures how large a problem can feasibly be solved by an actual machine.
QV is 2 to the power of the length of the edge of the largest square random circuit the QC can model accurately (see link above for details). So the 53-qubit machine, assuming it works as well as the 20-qubit one, might show a QV of 32. Though I note the original QV paper suggests that it's only an appropriate measure up to about 50 qubits.
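Put concretely (a trivial sketch of the relationship):

```python
# Quantum volume: QV = 2**d, where d is the edge of the largest square (d qubits
# wide, d layers deep) random circuit the machine can run reliably.
import math

def effective_circuit_size(quantum_volume):
    return int(math.log2(quantum_volume))

print(effective_circuit_size(16))   # 4 -> IBM's "20-qubit" machines at QV 16
print(effective_circuit_size(32))   # 5 -> what the 53-qubit machine might manage
```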
In any case, the QV of 16 for IBM's five "20-qubit" QCs implies that those aren't 20 fully usable qubits - they can't reliably be used for problems of that size.
You can break RSA-64 by naive brute force, so I'm assuming that last bit was a joke.
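Indeed: a 64-bit RSA modulus falls to plain classical factoring in a fraction of a second, no quantum hardware required. A quick illustration using Pollard's rho (the modulus below is just an example semiprime, not anyone's real key):

```python
# Pollard's rho factors a 64-bit RSA-style modulus on a laptop almost instantly,
# no quantum computer required. The modulus is an example semiprime only.
from math import gcd
import random

def pollards_rho(n):
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        x = y = random.randrange(2, n)
        d = 1
        while d == 1:
            x = (x * x + c) % n   # tortoise: one step
            y = (y * y + c) % n
            y = (y * y + c) % n   # hare: two steps
            d = gcd(abs(x - y), n)
        if d != n:
            return d              # non-trivial factor found

N = 4294967291 * 4294967279   # 64-bit semiprime: (2**32 - 5) * (2**32 - 17)
p = pollards_rho(N)
print(p, N // p)              # recovers the two 32-bit prime factors
```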
I struggled to get my head round this one, and how much of it is theory versus reality. I was hoping a commentator would jump in.
This video helps - an IBMer explaining quantum computing to a child, a teenager, an undergrad, a graduate student and a professional. I got lost halfway through undergrad...
It also shows just how mad the machines look - known as chandeliers apparently.
https://youtu.be/OWJCfOvochA