Perfect...
Only need to do it once, right?
https://m.xkcd.com/221/
A team of physicists claim to have developed a guaranteed random number generator using photons and the laws of quantum mechanics. Random numbers are used to secure communications, and a good random number generator is essential for strong encryption. But ensuring that the numbers are truly random is difficult. Number …
What about the RNG a few years back that had an americium source from an ionization smoke detector firing into a webcam CCD, and used the excitation spots and traces as a random source seed?
I did a quick search and found this project building one:
http://www.inventgeek.com/alpha-radiation-visualizer/
I am one of the scientists working on this project. Our goal is to incorporate a quantum random number generator based on this prototype into the NIST Randomness Beacon, which will allow you to play with it. An even longer term and more ambitious goal is to shrink its size so that it can fit into a mobile phone, but that will take significant advances in the technology and years of work. If you can be patient, there is no need to call your real estate agent.
On the other hand, we hope that other organizations will make public random sources that are compatible with NIST's. Then one can combine output from all sources to create a random source that is more trustworthy than any single source.
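To see why combining helps (just an illustration, not NIST's actual combining mechanism): XOR-ing equal-length outputs from several beacons gives output that is uniform as long as at least one source is uniform and independent of the rest; many designs hash the concatenation instead. A minimal Python sketch with made-up beacon values:

    # Minimal sketch: combine outputs of several independent randomness beacons.
    # Assumes each beacon output is a hex string of the same length (hypothetical data).
    from functools import reduce

    def xor_combine(hex_outputs):
        """XOR equal-length beacon outputs; uniform if any one source is uniform
        and independent of the others."""
        blobs = [bytes.fromhex(h) for h in hex_outputs]
        assert len({len(b) for b in blobs}) == 1, "outputs must be the same length"
        combined = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blobs)
        return combined.hex()

    # Example with made-up beacon values:
    print(xor_combine(["a3f1c2d4", "0badf00d", "deadbeef"]))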
What about radioactive decay? Sure, one can readily know the average rate, but when any particular atom decays is still unpredictable. Similarly, if you create a truly random string of numbers, randomly chosen from 1-10, you will also know the average over time. Thus a random string derived from radioactive decay should be possible.
It's not only got to be genuinely random (as John von Neumann said, "Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin"), but if someone else has generated the randomness you are using for your Certificate Signing Request, you cannot guarantee the security of your website ever after.
Hence the larger and more complex the apparatus, the less likely it is that you've been able to fully verify it doesn't contain any unwelcome secrets or hidden backdoors making the output observable, predictable, or open to manipulation by unwelcome parties. A simple electronic circuit you've built yourself, involving a pair of Zener diodes as a noise source followed by some analogue amplification and digital gates to ensure an even balance between 1s and 0s, might be as good as it gets in this particular space. If you have to buy hardware made by someone else, paying for it in cash, in person, makes it less likely to be replaced somewhere in the delivery chain. IBM used to advise mainframe managers to use dice for system passwords, but we need more entropy for long-term and session secrets nowadays. It's possible the hardware RNG vendor has been fully security audited, but what about the delivery chain?
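For the "even balance of 1s and 0s" stage, one classic trick (not necessarily what the post above has in mind) is von Neumann debiasing: take raw bits in pairs, emit a bit only when the pair differs, and discard the rest. It only works if the raw bits are independent, which is exactly the hard part. A rough Python sketch:

    # Von Neumann debiasing sketch: removes bias from independent raw bits,
    # at the cost of throwing most of them away.
    def von_neumann(raw_bits):
        out = []
        for a, b in zip(raw_bits[0::2], raw_bits[1::2]):
            if a != b:           # (0,1) -> 0, (1,0) -> 1
                out.append(a)
            # equal pairs carry the bias, so they are dropped
        return out

    # Example with a heavily biased toy source (75% ones):
    import random
    raw = [1 if random.random() < 0.75 else 0 for _ in range(10000)]
    clean = von_neumann(raw)
    print(sum(clean) / len(clean))   # should be close to 0.5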
if someone else has generated the randomness you are using for your Certificate Signing Request, you cannot guarantee the security of your website ever after
You cannot "guarantee the security" of any system, ever, under any conditions. That is not a meaningful claim.
There are plenty of physical processes that generate sufficient information entropy to seed cryptographic pseudorandom number generators (CPRNGs) sufficiently for most purposes. CPRNGs only need to raise the cost to the attacker higher than the attacker's evaluation of the value of breaking the CPRNG.
The vast majority of X.509 certificates will never be used to secure sufficient value to justify trying to break the CPRNG used to generate their precursor CSRs.
It's true that attacking CPRNGs has been successful in many prominent historical instances; the original Netscape SSL implementation and the Debian OpenSSL break are two well-known examples. But in the vast majority of cases there are cheaper vulnerabilities, and almost no one has a use case that requires a provably random entropy source.
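For what the "seed a CPRNG, then stretch" idea looks like in practice, here is a toy hash-counter construction in Python. It is only a sketch - not an SP 800-90A DRBG, not production code - and os.urandom merely stands in for whatever physical entropy source you trust:

    # Toy CSPRNG sketch: seed once from a (hopefully) high-entropy source,
    # then stretch deterministically by hashing seed || counter.
    # Simplified for illustration; real systems should use os.urandom / a vetted DRBG.
    import hashlib, os

    class ToyDRBG:
        def __init__(self, seed: bytes):
            self.seed = seed          # e.g. 32+ bytes from a physical entropy source
            self.counter = 0

        def generate(self, n: int) -> bytes:
            out = b""
            while len(out) < n:
                self.counter += 1
                out += hashlib.sha256(self.seed + self.counter.to_bytes(8, "big")).digest()
            return out[:n]

    drbg = ToyDRBG(os.urandom(32))    # os.urandom stands in for the hardware source
    print(drbg.generate(16).hex())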
You can use a Zener diode and a differential amp to null out PSU noise. Then an ADC. The noise is random and due to quantum effects. It might want to be in a temperature-regulated oven... not sure.
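If you do go the Zener-plus-ADC route, a common first pass is to keep only the least significant bit of each sample and measure the raw bias before any whitening. Sketch below; read_adc_sample() is a hypothetical stand-in for whatever driver you actually have:

    # Sketch: harvest LSBs from ADC samples of amplified Zener noise, then
    # check the raw bias. read_adc_sample() is hypothetical; substitute your driver.
    import random

    def read_adc_sample() -> int:
        # Placeholder: pretend this returns a 12-bit ADC reading of the noise.
        return random.getrandbits(12)

    def harvest_bits(n: int):
        return [read_adc_sample() & 1 for _ in range(n)]   # keep only the LSB

    bits = harvest_bits(100_000)
    bias = sum(bits) / len(bits)
    print(f"fraction of ones: {bias:.4f}")   # anything far from 0.5 needs debiasing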
It's actually really hard to remove external periodic interference, which makes the output non-random.
Any approach without a hardware generator is a delusion. It will be deterministic.
...that ansibles work via quantum entanglement and are instantaneous throughout the entire universe, so 187 feet is not enough separation. It is obvious that the fact that they are looking at the second photon while measuring the state of the first is causing errors because of Heisenberg's uncertainty principle and these errors only appear to be random. The problem is that the heat death of the universe may occur before a repetition of data is found
I am one of the "boffins" involved in this research at NIST. Both our and ANU's random sources exploit quantum physics to generate randomness. The important difference between our experiment and theirs is that ours uses entangled photons whose randomness is certified by violation of a Bell Inequality. This allows confidence in the unpredictability of our randomness based on the observed correlations in the data. The proof of the randomness does not require knowledge of the workings of the experimental devices.
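For anyone curious what "certified by violation of a Bell Inequality" looks like numerically, here is a small sketch (my own illustration, not NIST's analysis code) that estimates the CHSH value S from coincidence counts. Any |S| > 2 is impossible for local hidden-variable models, and quantum mechanics caps it at 2√2 ≈ 2.83:

    # Sketch: estimate the CHSH statistic S from coincidence counts.
    # counts[(x, y)] = {"++": n, "+-": n, "-+": n, "--": n} for setting pair (x, y).
    def correlator(c):
        total = sum(c.values())
        return (c["++"] + c["--"] - c["+-"] - c["-+"]) / total

    def chsh(counts):
        # S = E(0,0) + E(0,1) + E(1,0) - E(1,1); |S| <= 2 classically
        return (correlator(counts[(0, 0)]) + correlator(counts[(0, 1)])
                + correlator(counts[(1, 0)]) - correlator(counts[(1, 1)]))

    # Made-up counts roughly matching an ideal entangled-photon experiment:
    example = {
        (0, 0): {"++": 427, "+-": 73, "-+": 73, "--": 427},
        (0, 1): {"++": 427, "+-": 73, "-+": 73, "--": 427},
        (1, 0): {"++": 427, "+-": 73, "-+": 73, "--": 427},
        (1, 1): {"++": 73, "+-": 427, "-+": 427, "--": 73},
    }
    print(chsh(example))   # ~2.83, well above the classical bound of 2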
Doesn't the fact that we observe the value actually determine what it is? In which case, what's to stop someone being paid a lot of money to determine, in advance, that it will be '4' every time?
What we need is a random observer. I'm pretty random and I promise not to accept bribes unless
1. I really want to;
2. I'm confident I'll get away with it.
Now consider a determined state-level attacker, one who might be interested in intercepting encrypted communications at both a targeted and an international level.
Now they just need a satellite picture of the sky over your head at the time the photo was taken to stand a good chance of knowing enough to predict some of your "random" numbers to a certain extent.
Not all use cases are as simple to defend as you think, when you're talking about encryption that you expect a government or military might one day use itself and/or might not want you to use.
Random numbers are hard. Much harder than you might think. And tiny deliberate influences can drastically alter their security. There's a reason there are entire books on the subject, and why most of the current traditional techniques - even on input data we're convinced is pretty random to start with - revolve around hashing, mixing, eliminating higher-order bits, melding into existing pools, preserving historical pools for future mixing, and selecting, analysing the viability of, and plucking numbers from random pools.
Your "random input" might well be considered untrusted external data, in effect. Someone who really wants to corrupt that pool could do so quite easily if they were determined. Hell, just by cutting your CCD and hoping you weren't checking the image wasn't all-black. 90% of handling random numbers (and 90% of coding errors where they are mishandled resulting in a security problem) is about taking only selected parts that are more likely to be random and incorporating them in such a way that their randomness leaks through but not any determined pattern or bias that may be present. The other 10% is actually getting something that looks random enough to use as source data and could probably be trusted.
Hint: the Debian builds of OpenSSL generated millions of atrociously insecure keys and certificates because they failed to use proper random input and nobody noticed (the PRNG was effectively seeded only from the process ID, which varies, but not randomly and only over a tiny range). For years. Once discovered, almost every key ever made on those systems was compromisable. Because all the fancy techniques in the world are for naught if your input isn't truly random or trusted.
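To see why that was fatal, look at the size of the search space: Linux PIDs default to at most 32768, so an attacker simply enumerates every possible seed. A toy Python sketch (random.Random stands in for the broken PRNG; the real attack re-ran OpenSSL key generation for each candidate PID):

    # Toy sketch of why a PID-only seed is hopeless: at most 32768 candidates.
    import random

    def keyish_material(pid: int) -> bytes:
        rng = random.Random(pid)            # "seeded from process ID"
        return bytes(rng.getrandbits(8) for _ in range(16))

    victim = keyish_material(4242)          # secret PID, unknown to the attacker

    for pid in range(1, 32768 + 1):         # brute force the entire seed space
        if keyish_material(pid) == victim:
            print(f"recovered seed: PID {pid}")
            break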
Just the JPG-artifacts in an image could give a serious attacker enough bias to compromise your RNG. Or the resolution of a particular camera. Or the post-processing algorithms in the camera biasing pixels to generate a more natural image. Or the fact that someone knows the seed picture is of the sky might well give them enough.
As a mathematician, I have advice for people who aren't: never think you understand randomness, encryption, statistics or probability. Just don't. Don't write code for them. Don't apply them to work things out. Don't dabble and think you understand everything. You'll make things weaker or more incorrect a billion times over before you make them stronger, no matter how clever or well-intentioned you are.
I'm fairly certain I could sit and derive a public-key encryption/decryption algorithm, a random number generator, etc. from first-principles given enough time and a programming language. I'm also 100% certain it would be useless to the point of utter compromise upon the first serious analysis by someone who understands those fields.
If you haven't read Numerical Recipes, go do so. It's got a maths-and-C-code-heavy description of everything RNG, encryption, probability, etc. And that book is now 30-something years old and was never designed to cover hostile intent. It's currently holding up my coffee table, because it's thicker than my phone is wide.
Also a mathematician here. A fun game at work is to give people a look of horror when they ask me a question about statistics. These fields really are this specialized. A non-trivial part of the graduate education of a mathematician is to teach him just how bad almost all of his ideas are.
Please, please, PLEASE people--do not assume, unless you have gotten an A in the relevant coursework, or been similarly blessed by people who really do understand this stuff, that you can whip up some system that will be "good enough". You just don't know. Neither do I--and I know it.
As with any claim of random number generation, this one is based on underlying assumptions about physical reality and our understanding thereof. You can always postulate hidden variables that would destroy the randomness (once we eventually understand and can predict those hidden variables).
Alternatively, you can trust our current understanding of physics and build much simpler quantum detection devices. Avalanche amplification of tunneling events (cf. http://iank.org/trng.html or similar) is a sensible approach. The devil, of course, is in the details of the implementation.
@Schultz, NIST's experiment proves that if hidden variables are present and determining the output, those hidden variables must be able to communicate with one another faster than light. This is why we write that our random numbers are "certified by the impossibility of superluminal signals".
The important difference between our random source and one built from amplification of tunnelling events (or many other suggestions made in these comments) is that our randomness is certified by analysis of the data stream itself. The proof of randomness does not require knowledge of how the experimental devices are constructed. In fact, the devices could have been prepared by an adversary who wants to predict the random output. Nevertheless, quantum correlations in the data prove that the output is unpredictable.
For us, the devil is not in the details of the implementation. The devil is in the laws of physics (as we currently understand them) and the maths.
We assume that:
* Faster than light communication is impossible.
* The experimental devices maintain no quantum entanglement with potential adversaries trying to predict the random output.
* The distances between components and times between events are measured accurately (necessary for ensuring slower than light communication is not influencing events).
* The computers used for processing the data are reliable.
Easy. White Noise.
Sorry. It won't be perfectly White. Likely a tilt in the spectrum.
Okay then. Compensate the spectrum.
Sorry. Even then, it'll probably be slightly biased towards 0s or 1s, due to a tiny amplitude offset from zero as impacts the comparator trigger point, and correlated to local environmental variables.
Okay then. Perform a delta based on the sequential period to normalize it to exactly 50% each 0s/1s. Ha!
Sorry. It'll still be slightly biased due to tiny fixed timing offsets. You can't win for trying.
Okay then, go nuclear. Wire in a whole series of XOR gates (selectable inverters), each fed from stages of a huge binary counter, so that our signal is randomly inverted at an endless variety of periods. Hundreds of stages covering from the clock period to a thousand human lifetimes.
On paper, it looks good. But then there's some residual power line hum at -65dB, and that's showing up in an FFT of the supposedly random output. Sorry.
...
It's much harder than it might seem to overcome all objections. Close enough is easy.
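That residual mains hum is easy to check for: subtract the mean from the bit stream and look at its spectrum. A small NumPy sketch, using a synthetic bit stream deliberately contaminated with a weak 50 Hz component:

    # Sketch: look for residual mains hum in a "random" bit stream with an FFT.
    # Synthetic data: fair bits whose decision threshold is nudged by a weak 50 Hz signal.
    import numpy as np

    fs = 10_000                                      # bits per second (assumed rate)
    n = 1 << 16
    t = np.arange(n) / fs
    hum = 0.05 * np.sin(2 * np.pi * 50 * t)          # weak 50 Hz contamination
    bits = (np.random.rand(n) + hum > 0.5).astype(float)

    spectrum = np.abs(np.fft.rfft(bits - bits.mean()))   # remove DC, take spectrum
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    peak = freqs[np.argmax(spectrum[1:]) + 1]
    print(f"largest spectral peak at ~{peak:.1f} Hz")    # shows up near 50 Hz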
Worth mentioning that real time stream versus data stored in a file probably doesn't make very much difference. As far as I know, they're precisely equivalent. Except that the file could have some extensive QA applied before shipment.
...as could the real time stream with a big enough buffer. .: Equivalent.
So you base your device on the noise generated by electronic components, like say a transistor or even a neon gas tube. Your numbers would be so random you could use them to run, oh, some kind of national lottery. Call the device Electronic Random Number Indicator Equipment. ERNIE see? Write the name in large, friendly letters on the outside. Heck, it's so obvious that you'd expect people to have thought of it decades ago.
Oh you can call me an optimist. Or you can call me 2AL 838710.
I'm sure that there is a mathematical proof of this somewhere (not that I'd be able to understand the maths) but is it really true? Lots of things have been assumed to be really random that were later found not to be, once we understood or modelled the underlying mechanism.
So, since we don't really understand quantum physics terribly well at the moment, how can we be confident that the statement is true?
@CrazyOldCatMan, there is a mathematical proof in our paper, published by Nature, and we believe that it is "really true".
As a scientific theory, quantum physics is very well understood. It has been thoroughly tested over nearly 100 years in countless experiments. It is the foundation of the technologies that form the backbone of our modern economy. Physicists are beginning to despair because we are unable to find experiments in which quantum theory fails.
Of course, quantum theory might turn out to be wrong close to black holes or in the very early universe. The issue is not that it is poorly understood, but that it has not yet been tested in these regimes.
I always thought that the generator used for ERNIE (the Premium Bonds picker) was supposed to be completely random.
It might have changed in the years since I was up at National Savings, but the generator I saw relied on two metal discs with a pattern of holes drilled in them. These rotated in opposite directions and were held at a high potential difference. The time taken for a spark to jump the gap between them was used as a timing signal for a random seed generator (or something along those lines).
I think the catch here is that it was the best that could be developed at the time, but it doesn't preclude the possibility of an entity with sufficient resources being able to replicate/simulate the setup and predict the sparks. The mathematical principles behind this new machine seem sounder (Bell's theorem being, by definition, a proven statement), unless someone breaks the whole of quantum mechanics.
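One common way to turn that kind of "time until the next spark" signal into bits (the HotBits approach used with radioactive decay, not necessarily what ERNIE did) is to compare successive intervals and emit a 1 or 0 depending on which is longer, discarding ties. A sketch:

    # Sketch: turn event-timing jitter into bits by comparing successive intervals
    # (HotBits-style; illustrative, not how ERNIE actually worked).
    import random

    def intervals_to_bits(intervals):
        bits = []
        for t1, t2 in zip(intervals[0::2], intervals[1::2]):
            if t1 == t2:
                continue              # discard ties to avoid bias
            bits.append(1 if t1 > t2 else 0)
        return bits

    # Fake spark-gap timings (exponentially distributed, like radioactive decay):
    timings = [random.expovariate(1.0) for _ in range(20)]
    print(intervals_to_bits(timings))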
Jolly good stuff but maybe a bit complicated.
Must be 40 years ago that we were using a noisy diode to generate a signal that could be converted into a random bit stream, from which we could peel off arbitrary numbers - total cost in modern terms about five quid. The randomness of that bit stream was driven by the uncertainty of the drift of electrons across the diode junction, which is quite adequate for most purposes.
BTW there's no such thing as a random number. Randomness is a property of series or sequences (and possibly sets) not of individual entities. It describes the independence of entities with respect to each other - which is why I refer above to arbitrary numbers.
I disagree. Randomness can apply to new members of a set (which can be empty OR solitary), describing the likelihood that a new member is in any way related to the existing members; a truly random new entry would have NO relation to the existing members of the set.
"Something like a coin flip may seem random, but its outcome could be predicted if one could see the exact path of the coin as it tumbles."
Sorry, but that's plain wrong. Everything is a quantum process; there are just a zillion zillion quanta in a coin toss, but ultimately the one quantum that determines whether the coin lands heads or tails is unpredictable. (Yes, that only applies to the small fraction of coin tosses that are too close to call from Newtonian mechanics, because of measurement error, but philosophically a coin toss is just as unpredictable as the health of Schroedinger's cat, and for the same reason.)