Re: Sub-second accuracy
That's not correct. Google uses this for their NoSQL solution. Update one comes in at 11:42:10.100 and update two comes in at 11:42:10.102 on a different server, so update two wins in the eventually consistent model. Without the time smearing, events can get out of order and applications end up with incorrect data.
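Here's a minimal sketch of the last-write-wins rule being described, assuming a simple timestamped-register model (the Update and merge names are illustrative, not Google's actual API):

    # Minimal last-write-wins (LWW) sketch -- hypothetical, not Google's
    # actual implementation. Each update carries (value, timestamp); on
    # merge, the update with the later timestamp wins.
    from dataclasses import dataclass

    @dataclass
    class Update:
        value: str
        ts: float  # seconds past 11:42:10, e.g. from a smeared NTP clock

    def merge(a: Update, b: Update) -> Update:
        # Later timestamp wins; ties broken arbitrarily here.
        return b if b.ts > a.ts else a

    # Update one at 11:42:10.100, update two at 11:42:10.102 elsewhere.
    u1 = Update("balance=100", 0.100)
    u2 = Update("balance=200", 0.102)
    assert merge(u1, u2).value == "balance=200"  # update two wins

    # If server two's clock steps backward (no smearing), its genuinely
    # later update can carry an earlier timestamp and lose the merge,
    # so the stale value wins -- exactly the out-of-order problem above.
    u2_skewed = Update("balance=200", 0.099)
    assert merge(u1, u2_skewed).value == "balance=100"

The point of smearing is that clocks adjust gradually instead of stepping, so two-millisecond orderings like this stay trustworthy.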
You may not agree with the approach of running smeared NTP across their large fleet of servers, but it works very well for them, and they seem to have managed distributed transactions better than almost anyone else.