The Research Is First Class. If Only Development Was, Too

By John Carey
For decades, IBM has prided itself on its world-class research. Big Blue spends $6 billion a year on its sprawling R&D enterprise, with major laboratories in the U.S., Europe, and Japan. And the company's scientists rank among the best anywhere. In 1986, they won the Nobel Prize for inventing the scanning tunneling microscope, which enables researchers to spot individual atoms. In 1987, they won another Nobel Prize -- for achieving high-temperature superconductivity. And in 1990, IBM scientists were the first to build structures only a single atom wide, a crucial step on the road to ultrafast chips.
But the company's struggle to reshape itself raises troubling questions: Will IBM's celebrated R&D labs fade as did other once-great industrial labs, such as RCA's Sarnoff Center and Xerox's Palo Alto Research Center? If the turmoil undermines IBM's prowess -- especially in physics and semiconductor research -- that could deal a serious blow to U.S. competitiveness. "I'm very concerned," says Robert M. White, Under Secretary for Technology at the Commerce Dept.
CRACKING THE CODE. Many IBM-watchers in academia, industry, and government, however, believe that the latest corporate shakeup may be just what the doctor ordered. IBM has been aggressive in developing new technologies for its core business -- such as new generations of memory chips. But critics say Big Blue has long suffered from a common American disease -- it too often fails to bring breakthrough technologies to market successfully. "They just haven't been able to crack the code of how to get worthwhile innovations out of a large company," says Cambridge (Mass.)-based computer communications analyst John McQuillan.
The reorganization could bring research closer to the needs of individual operating units and prevent the champions of existing products from squashing development of threatening new technologies. "Given IBM's history, it may be a healthy move," says a leading computer industry expert.
Take reduced instruction-set computing, or RISC. In the mid-1970s, IBM researchers pioneered this revolutionary technology for designing faster computers. But the advance wasn't rushed into products, largely because it was seen as a menace to Big Blue's core mainframe business. "Everything they do is protecting that mainframe market," says Alan Sims, chief engineer for Cigna Corp., a big IBM customer.
As a result, RISC technology wasn't commercialized until scientists at Stanford University and the University of California at Berkeley, funded by the Pentagon's Defense Advanced Research Projects Agency (DARPA), in effect reinvented it. Then upstarts such as Sun Microsystems Inc. harnessed the technology -- and they now lead in RISC-based workstations.
Not only has fear of undercutting existing products held IBM back, but so has complacency. Berkeley computer scientist Michael A. Harrison recalls a visit by one of IBM's top software scientists, who described newly published research to programmers at an IBM development lab in California. But few of the programmers even knew which journal the scientist was referring to, says Harrison. "To be that ignorant, it's like having been in a cave for 20 years," he says.
Critics concede that even before the reorganization, IBM had been trying to tackle its R&D problems. It became the driving force in two industry research consortia, Sematech and the Microelectronics & Computer Technology Corp. It also funded suppliers of advanced chipmaking technology. And it had begun to slash product-development cycle times.
IBM has no plans to decentralize its research laboratories. But if growing pressures push its R&D priorities too far toward the short term, it may be unable to make the big breakthroughs. Japan, meanwhile, is increasing its investment in basic scientific research. To compete in 21st-century technologies, American computer companies such as IBM will have to speed products to market while keeping their blue-sky science strong.