Wednesday, 4 October 2006

Will Moore's Law slow down?

I spent time last week with a friend who works in the chip industry. One of the things we talked about was heat, and what it's doing to the world of processor design.

For years there have been warnings that heat dissipation was starting to become a problem for microprocessors. Five and a half years ago, Intel's CTO said, "we have a huge problem to cool these devices, given normal cooling technologies." The industry made do for another five years, which is why I've been tuning out the latest round of warnings -- I figured it was just more Chicken Little ranting. But my friend is worried that the problem is now approaching a critical point, and could have some important effects on the tech industry.

Look inside a modern PC and it's easy to spot the central processor and the graphics processor – they're the things with the big metal heat sinks on them, probably with fans on top to force air over the metal.

There's a limit on how big those sinks can get. Meanwhile, processor chips continue to get larger, which makes it harder to pull the heat out of them. And the metal lines on them continue to get smaller. Smaller lines are more likely to leak electricity, which heats the chip further. Already the average processor produces more heat per square inch than a steam iron, according to IBM technologist Bernie Meyerson. Extrapolate the trends into the future a bit, and things start to look ominous.
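To get a feel for the steam iron comparison, here's a quick back-of-envelope sketch in Python. The figures -- a roughly 90-watt processor on a 1.4 square centimeter die, and a 1,000-watt iron spreading its heat over a 150 square centimeter soleplate -- are my own illustrative guesses, not measurements.

```python
# Back-of-envelope heat flux comparison. All wattage and area figures
# below are illustrative guesses, not measured values.

CM2_PER_IN2 = 6.4516  # square centimeters in a square inch

def watts_per_square_inch(watts, area_cm2):
    """Heat flux: total power spread over the given surface area."""
    return watts / (area_cm2 / CM2_PER_IN2)

cpu  = watts_per_square_inch(90, 1.4)    # ~90 W CPU on a ~1.4 cm^2 die
iron = watts_per_square_inch(1000, 150)  # ~1 kW iron, ~150 cm^2 soleplate

print(f"CPU:  ~{cpu:.0f} W per square inch")   # about 415 W/in^2
print(f"Iron: ~{iron:.0f} W per square inch")  # about 43 W/in^2
```

Even with generous error bars on those guesses, the processor comes out roughly an order of magnitude hotter per unit area than the iron.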

There are, of course, lots of potential ways to cope with the problem. Companies are talking about cutting the big processors into multiple small ones, which would be easier to cool (because you can put heat sinks around all the edges). Startups are proposing exotic cooling technologies like etched microchannels through which water could be circulated. My friend thinks optical computing technologies may play a role.
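The edge-cooling argument for small chips is really just geometry. Here's a toy sketch of it: split one square die into n equal squares, and the total silicon area (and so, roughly, the total heat) stays the same, while the edge length available for heat sinks grows with the square root of n. The 4 cm² starting figure is arbitrary.

```python
import math

def total_edge_cm(total_area_cm2, n_dies):
    """Total perimeter of n equal square dies covering the same area."""
    side = math.sqrt(total_area_cm2 / n_dies)  # side of each small die
    return n_dies * 4 * side

AREA = 4.0  # cm^2 of total silicon; an arbitrary illustrative figure
for n in (1, 4, 16):
    print(f"{n:2d} die(s): {total_edge_cm(AREA, n):.0f} cm of edge")
# 1 die: 8 cm; 4 dies: 16 cm; 16 dies: 32 cm -- edge length scales
# as sqrt(n), so quartering the die doubles the cooling edge.
```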

The thing all of these approaches have in common is that they're experimental. They might work, but they also might not. The implication is not that Moore's Law is coming to an end, but that it's becoming much less reliable. Rather than a predictable increase in computing performance, we may end up with surges of progress alternating with unexpected periods of stagnation.

We may still be able to muddle through the whole situation; this isn't the first time people have predicted an end to Moore's Law. But it's worrisome because over the years, the assumption that Moore's Law will continue has been built into the thinking and structure of the industry. The computer hardware business depends on obsoleting its installed base on a regular basis. If that doesn't continue, sales will slow, which could force cutbacks in the R&D investment that's needed to solve the performance problems.

Instability in Moore's Law would also threaten nerd rapture, the Singularity. Creating transhuman intelligence depends on getting another 15 or 20 years of uninterrupted exponential growth in computer processing power. If that growth slows, the shipment date for our bio-cybernetic brain implants could start to slip seriously. Which would be a bummer – I've been looking forward to getting mine so I could finally understand the last episode of Twin Peaks.
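For a sense of what "uninterrupted exponential growth" cashes out to: the Singularity argument doesn't pin down a doubling period, but plugging in the customary 18-to-24-month Moore's Law figure shows the stakes.

```python
def growth_factor(years, doubling_months):
    """How many times performance multiplies over the given span."""
    return 2 ** (years * 12 / doubling_months)

for years in (15, 20):
    for months in (18, 24):
        print(f"{years} years at a {months}-month doubling: "
              f"~{growth_factor(years, months):,.0f}x")
# Anywhere from ~180x (15 years, 24-month doublings) to ~10,000x
# (20 years, 18-month doublings) -- a wide range, but a lot of
# performance to lose if the curve stalls partway.
```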
