Something of a point
To elaborate on his stock market example, if you look at prices on a minute-by-minute basis, there's a tremendous amount of random fluctuation (i.e., lots of noise). If you only look at daily closing prices, you have a few orders of magnitude less data to process, and it's just as good for making medium- and long-term predictions. Of course, it's easy to go too far; monthly stock market updates might not provide enough data to extrapolate from with an acceptable degree of confidence. And there are exceptions: an automated arbitrage trading program might be able to make use of price updates as often as every second.
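The downsampling idea is simple enough to sketch in a few lines. This is a minimal illustration, not a trading tool: the tick data is synthetic random noise, and the 390-minute trading day (9:30 to 16:00) is an assumption for the example.

```python
from datetime import datetime, timedelta
import random

def daily_closes(minute_prices):
    """Collapse (timestamp, price) minute ticks into one closing price per day.

    Assumes the input is sorted by timestamp; the last tick seen for each
    calendar date wins, which is exactly the daily closing price.
    """
    closes = {}
    for ts, price in minute_prices:
        closes[ts.date()] = price  # later ticks overwrite earlier ones
    return sorted(closes.items())

# Synthetic example: two trading days of one-minute ticks around $100.
random.seed(0)
ticks = []
for day in range(2):
    session_open = datetime(2024, 1, 2 + day, 9, 30)
    for minute in range(390):  # assumed 6.5-hour trading session
        ticks.append((session_open + timedelta(minutes=minute),
                      100 + random.gauss(0, 0.5)))

closes = daily_closes(ticks)
```

Here 780 noisy data points collapse into two, one per day, and for anything beyond intraday horizons the discarded 778 points were mostly noise anyway.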
As for log files, many of them are useless, and most will never be looked at. But when security breaches happen, they're essential in figuring out how someone got into the system and what they accessed.
The point here is that companies need to work on collecting better-quality data, not simply more of it.