The problem with robots.txt is that it gives you, the developer or website operator, zero feedback in the event of a fuck-up.
For example, if you deploy a site from a development environment and forget to remove "Disallow: /" in production, the "feedback" is that your site might drop out of Google's index. If the site makes you money, through ecommerce for example, that could be a huge problem, and at that point your only option is to fix the file and wait for the site to be reindexed.
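Since robots.txt itself won't complain, one option is to fail loudly yourself, for instance with a post-deploy check in CI. Here's a minimal sketch using Python's standard `urllib.robotparser`; the `assert_crawlable` helper and the sample file contents are illustrative, not from any particular tool:

```python
from urllib import robotparser

def assert_crawlable(robots_txt: str, agent: str = "Googlebot") -> None:
    """Raise if this robots.txt content blocks the given crawler
    from the site root -- the classic dev-file-in-production mistake."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    if not rp.can_fetch(agent, "/"):
        raise RuntimeError(f"robots.txt blocks {agent} from the site root")

# A dev-environment file accidentally shipped to production:
bad = "User-agent: *\nDisallow: /\n"
# What production was supposed to have:
good = "User-agent: *\nDisallow: /admin/\n"

assert_crawlable(good)   # passes silently
# assert_crawlable(bad)  # raises RuntimeError
```

Wired into a deploy pipeline, this turns a silent de-indexing into an immediate failed build.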
Equally, if you get it wrong the other way round, you can end up with a development site being indexed, and then you have to deal with getting it removed from Google's index, which is a manual and slow process.
There are external tools, including Google Analytics and Webmaster Tools, that will notify you about problems like this, or about the file being missing entirely. But by default you might not find out anything has happened until it's too late.
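If you'd rather not depend on those tools noticing for you, a small scheduled check can cover the failure modes above: file missing, server erroring, or crawlers blocked from the root. A sketch, again using the standard library; `classify`, `check_robots`, and `example.com` are all placeholders of my own:

```python
from urllib import error, request, robotparser

def classify(status: int, body: str, agent: str = "Googlebot") -> str:
    """Classify a robots.txt response the way a crawler would see it.
    Pure function, so it can be tested without a network connection."""
    if status == 404:
        return "missing"            # no robots.txt at all
    if status != 200:
        return f"http {status}"     # server error etc.
    rp = robotparser.RobotFileParser()
    rp.parse(body.splitlines())
    return "ok" if rp.can_fetch(agent, "/") else "blocked"

def check_robots(base_url: str) -> str:
    """Fetch and classify a live site's robots.txt."""
    url = base_url.rstrip("/") + "/robots.txt"
    try:
        with request.urlopen(url) as resp:
            return classify(resp.status, resp.read().decode("utf-8", "replace"))
    except error.HTTPError as e:
        return classify(e.code, "")

# Run from cron and alert on anything other than "ok":
# print(check_robots("https://example.com"))
```

Anything other than "ok" is exactly the kind of feedback robots.txt never gives you on its own.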