The numbers, in billions of kilowatt-hours, break down as follows:
Data center servers: 45
PCs and monitors: 235
Networking gear: 67
Phone network: 0.4
That amounts to about 350 billion kWh a year (347.4, to be precise), a whopping 9.4% of total US electricity consumption. On a global basis, Sarokin estimates that the computing grid consumes 868 billion kWh a year, or 5.3% of total global consumption.
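For the curious, here's a minimal sketch in Python that checks the arithmetic. The component figures and percentages come from the estimate above; the "implied grid" totals are backed out from the stated percentages and are my derivation, not a number Sarokin gives.

```python
# Check the arithmetic behind Sarokin's estimate.
# Component figures are from the list above, in billions of kWh per year.
components_bkwh = {
    "Data center servers": 45.0,
    "PCs and monitors": 235.0,
    "Networking gear": 67.0,
    "Phone network": 0.4,
}

# Sum the US components: 347.4, i.e. "about 350" as stated.
us_total = sum(components_bkwh.values())
print(f"US computing total: {us_total:.1f} billion kWh")

# Back out the total grid sizes implied by the stated percentages.
# These totals are derived here, not given in the source.
implied_us_grid = us_total / 0.094      # ~3,696 billion kWh
implied_world_grid = 868 / 0.053        # ~16,377 billion kWh
print(f"Implied US grid:    {implied_us_grid:,.0f} billion kWh")
print(f"Implied world grid: {implied_world_grid:,.0f} billion kWh")
```

The implied totals are at least in the right ballpark for annual electricity consumption, which is a quick sanity check that the percentages and the raw figures are mutually consistent.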
I find these kinds of estimates interesting. What stands out is how little the phone network consumes compared to the other categories (assuming the figures are close to accurate).
At a time when some people are concerned about their "carbon footprints", this estimate may get some visibility. How much does computing substitute for other forms of energy consumption (notably transportation of various kinds)? Should computing therefore "count" as an offset, a kind of modern indulgence?