Purchase price vs Power usage
Servers are expensive: roughly the price of an automobile. Their power consumption, by contrast, is fairly moderate these days. The purchase price of a server can far outstrip the cost of the electricity to run it, particularly if you can locate your data centre somewhere with inexpensive electricity and modest cooling needs.
Dell estimates 3 MWh/yr under a heavy workload for their fully kitted-out R740 servers.
Even using the UK average of £0.28 per kWh, that would be just £840/yr, or £4,200 over 5 years. A fully populated new server costs considerably more than that, and that's not even accounting for the much lower electricity rates Microsoft pays. Locating close to cheap electricity is a trick aluminium smelters have been using for decades. A newer server is unlikely to halve your power usage or double your performance, so it will take quite a while to show a return on the investment. And if your server isn't under such a heavy workload, its power consumption will be quite a bit lower and the payback period much longer.
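The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The 3 MWh/yr and £0.28/kWh figures come from the text; the £8,000 replacement-server price is a made-up illustration, not a quoted figure:

```python
# Back-of-the-envelope server power-cost and payback estimate.

def annual_power_cost(energy_mwh_per_year: float, price_per_kwh: float) -> float:
    """Yearly electricity cost, in the same currency as price_per_kwh."""
    return energy_mwh_per_year * 1000 * price_per_kwh

cost = annual_power_cost(3.0, 0.28)  # 3 MWh/yr at £0.28/kWh
print(f"£{cost:.0f}/yr, £{cost * 5:.0f} over 5 years")  # £840/yr, £4200 over 5 years

# If a replacement server somehow halved power draw, the saving would be
# £420/yr; against a hypothetical £8,000 purchase price the payback is:
server_price = 8000.0  # hypothetical example price, not a real quote
print(f"payback: {server_price / (cost / 2):.0f} years")  # ~19 years
```

Even under these generous assumptions the payback stretches well past a typical hardware refresh cycle, which is the point the paragraph above is making.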
And don't chime in about cooling. Data centres don't need the cryogenic temperatures they once did: 30 °C (86 °F) is a common operating temperature for servers these days.
Facebook's data centre tour is a good explanation of the technologies going into data centres these days.