There is a lot that can be done
With Intel servers averaging less than 10% utilisation, UNIX servers around 30%, and over 50% of stored data being redundant according to industry averages, there is an awful lot that can be done to reduce energy usage. Over 60% of the energy consumption of the average data centre is not used by core IT equipment but by cooling systems and power conditioning.
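That 60% overhead figure maps directly onto the industry's PUE (Power Usage Effectiveness) metric, defined as total facility power divided by IT equipment power. A minimal sketch of the arithmetic (the function name is illustrative, not from any standard library):

```python
# Illustrative arithmetic only: if 60% of a data centre's electricity goes to
# cooling and power conditioning, only 40% reaches the core IT equipment.
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.

def pue_from_overhead(overhead_fraction: float) -> float:
    """Return the implied PUE when `overhead_fraction` of total power
    is consumed by non-IT loads (cooling, power conditioning)."""
    it_fraction = 1.0 - overhead_fraction
    return 1.0 / it_fraction

# 60% overhead implies a PUE of roughly 2.5; a well-run facility
# would aim much closer to 1.0.
print(round(pue_from_overhead(0.60), 2))
```

In other words, for every watt the servers themselves draw, the average facility described here burns around one and a half more just keeping them cool and conditioned.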
Virtualisation, information lifecycle management, automated provisioning, water cooling and DC power are all approaches that can be developed. Recovering heat from the data centre for use elsewhere, or even converting it back to electricity, will become increasingly attractive.
It's not just about global warming. Electricity is currently the second highest operating cost after people, and before long it will be the highest as electricity prices rise significantly over the next five years. There are already areas of the UK, including the City of London, where it is physically impossible to get any more electricity down the cables in the street. Before the green issue hit the headlines a year ago, there was already concern about the 'energy gap' in the UK: we are running out of generating capacity. This is what prompted the Labour Government to begin discussions about building new nuclear generating capacity over two years ago.
The necessary changes will cost money and take time. Most of the technology is available, tried and tested; a few more pieces still need to be sorted out, but IT management really does need to get its head together and accept that it can no longer ignore energy usage.