Those Superfast Chips: Too Darn Hot
Intel's (INTC) recent announcement that it plans to produce new "dual-core" processors that amount to two Pentiums on a single chip drew attention mainly from hard-core techies. But it was an admission that the company's strategy for making PCs ever cheaper and faster has hit a wall: The chips are simply getting too hot. Further progress will require new technologies.
All chips generate heat that increases with speed. Keeping processors and other electronic innards cool has long been a challenge with laptops -- simply because cramming everything into a slim box causes heat to build up fast, and there is little room for fans and cooling vents. But cooling today's fastest chips is becoming a challenge in even the biggest desktop towers. Unless someone finds a way to get more speed without overheating, we won't see the kind of chip performance that the next generation of software will demand.
Intel's top-of-the-line processor, the 3.6-gigahertz Pentium 4 Extreme Edition, pumps out a maximum of 115 watts of heat. A high-end graphics card, memory chips, and a disk drive can easily add 100 watts or more. Think of a couple of 100-watt lightbulbs burning in a shoebox, and you get an idea of what the engineers are up against.
ONE INNOVATIVE APPROACH favored by gamers works like an automotive cooling system: A circulating fluid draws heat from the chips and dumps it through a radiator. But liquid cooling is far too expensive for mainstream desktops: The superfast Alienware Area-51 ALX, for example, starts at $4,500. Better mechanical design can help in more conventional PC products, where thermal engineering has always been an afterthought. Computer makers, led by Gateway (GTW), are starting to migrate to a new Intel system design, called BTX, in which the PC is planned from the motherboard up to promote airflow for better cooling. Apple Computer (AAPL) also has done some brilliant engineering to air-cool its G5 desktop Power Macs and iMacs while keeping them very quiet.
Ultimately, though, the problem will have to be solved by reengineering the chips themselves. The main method for getting more performance out of processors is to cram more transistors onto chips and to run them at higher speeds. But the faster you run a processor, the more power it consumes, and the more heat it throws off. Historically there have also been energy benefits as components shrink, because smaller devices require less power. But with the latest Pentium 4s, Intel hit diminishing returns on shrinking the components: The chips actually got hotter.
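The speed-power tradeoff described above is conventionally summarized by the dynamic-power rule for CMOS chips, P = C x V^2 x f: power grows in direct proportion to clock frequency and with the square of supply voltage. A minimal sketch makes the dual-core logic concrete. The capacitance and voltage figures here are hypothetical round numbers chosen for illustration, not published Intel specifications:

```python
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    """Dynamic switching power in watts: P = C * V^2 * f."""
    return capacitance_f * voltage_v ** 2 * frequency_hz

# One fast core: hypothetical 20 nF switched capacitance, 1.4 V, 3.6 GHz.
one_fast_core = dynamic_power(20e-9, 1.4, 3.6e9)

# Two cores at half the clock. Because a slower clock tolerates a lower
# supply voltage (say 1.1 V), and power scales with voltage squared,
# the pair delivers comparable total throughput for much less heat.
two_slow_cores = 2 * dynamic_power(20e-9, 1.1, 1.8e9)

print(round(one_fast_core, 1))   # 141.1 (watts)
print(round(two_slow_cores, 1))  # 87.1 (watts)
```

The squared voltage term is the whole game: splitting the work across two slower, lower-voltage cores cuts heat even though the transistor count doubles, which is why chipmakers see dual-core designs as the way around the thermal wall.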
Ongoing changes in chipmaking technologies will help. In the longer run, however, dual-core processors are the best hope. IBM pioneered the approach with its POWER4 server microprocessors back in 2001, and a version of that chip may be used in Macintoshes next year. Advanced Micro Devices (AMD) announced dual-core plans earlier this year, and Intel says it will offer such chips in 2005.
Since new PCs already offer more speed than most consumers use, why does this matter to anyone but hard-core gamers, engineers, and the like? The answer is that software is demanding much more from hardware these days. Your word processor, Web browser, and e-mail program don't overburden the chip. But by now you also should be running antivirus software, an antispyware program, and a firewall -- all of which soak up computing power. Data encryption is very processor-intensive, too. And the next version of Windows will have a 3-D user interface that will put a huge strain on the hardware.
The upshot is that over the next few years we won't be seeing the kinds of gains in computing speed that everyone has grown accustomed to. Heat problems increasingly will impose speed limits. As for dual-core PC chips, these will probably be widely available by 2006 -- just in time for the "Longhorn" version of Windows to come along and scarf up all that extra processing power.