kottke.org. home of fine hypertext products since 1998.

๐Ÿ”  ๐Ÿ’€  ๐Ÿ“ธ  ๐Ÿ˜ญ  ๐Ÿ•ณ๏ธ  ๐Ÿค   ๐ŸŽฌ  ๐Ÿฅ”

Energy efficiency in computing

Moore’s law states that “the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years”. In the IEEE Annals of the History of Computing, Jonathan Koomey states what you could refer to as Koomey’s law: “the electrical efficiency of computation has doubled roughly every year and a half”…which results in crazy stuff like:

Imagine you’ve got a shiny computer that is identical to a Macbook Air, except that it has the energy efficiency of a machine from 20 years ago. That computer would use so much power that you’d get a mere 2.5 seconds of battery life out of the Air’s 50 watt-hour battery instead of the seven hours that the Air actually gets. That is to say, you’d need 10,000 Air batteries to run our hypothetical machine for seven hours.
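As a back-of-the-envelope check (my own arithmetic, not from the article), the numbers hold up: doubling efficiency every 1.5 years for 20 years compounds to roughly a 10,000× gain, which turns seven hours of battery life into a couple of seconds.

```python
# Rough sanity check of the MacBook Air example, assuming Koomey's law:
# computing efficiency doubles every 1.5 years.
years = 20
doubling_period = 1.5  # years per efficiency doubling

# Compound efficiency gain over 20 years: 2^(20/1.5), roughly 10,000x
gain = 2 ** (years / doubling_period)

# A modern Air gets ~7 hours from its 50 Wh battery; a machine with
# 20-year-old efficiency doing the same work drains it ~10,000x faster.
modern_hours = 7
old_seconds = modern_hours * 3600 / gain

print(f"Efficiency gain over {years} years: {gain:,.0f}x")
print(f"Battery life at 20-year-old efficiency: {old_seconds:.1f} seconds")
```

That works out to about 2.4 seconds, which matches the article’s “mere 2.5 seconds” figure.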