Fantastic Journeys In Virtual Labs

"Every great advance in science has issued from a new audacity of imagination."

Philosopher John Dewey's observation rings as true today as it did a half-century ago. Trouble is, science has gotten so complex since then that only a handful of geniuses can muster the imagination needed to cope with many of its concepts. Take the notion of cosmic strings. This theory proposes that the fundamental building blocks of matter are infinitesimally thin but enormously long threads of energy. In diameter, they're mere billionths the size of an electron. Yet they stretch across vast cosmological distances and into several "higher dimensions" beyond length, width, depth, and time. The vibrations of these strings supposedly tickle zillions of atomic particles--and thus the universe--into being.

If this weirdness boggles your mind, take heart: Even scientists have trouble conjuring something so alien to everyday life. But mathematics can. An equation describing a 10-dimensional space filled with pulsating spaghetti is no less concrete than 2+2=4. And what math can describe, computers can bring to life on a video screen. That's why "doing science" on computers--call it digital science--is fast becoming as indispensable as imagination.

In fact, digital science marks the most fundamental change in scientific methodology since Isaac Newton laid the foundations some three centuries ago. Michael Warren, a physicist at Los Alamos National Laboratory, terms it "the transcendent technology of our time." Yet that accolade barely hints at its profound implications for business and society. Digital scientists have the temerity to envision a time, not far distant, when computers will help reengineer almost anything in Mother Nature's pantry. If you need a drug or a polymer or a metal that has never existed before, tomorrow's supercomputers would calculate whether electrons, protons, and neutrons could be assembled to yield the desired properties--and sketch on a screen how it might be done.

TIME WARPS. Such potential is rapidly putting computers at the cusp of all scientific endeavors. While the machines long ago became vital research tools, they mainly revealed the secrets of nature in mind-numbing printouts of numbers. Digital science is better partly because sight is by far the best human sense for exploring ideas. Being able to interact visually with a simulated experiment helps amplify intuition and creativity, says Robert Langridge, a biologist at the University of California at San Francisco. As a result, ordinary researchers have a better shot at making extraordinary discoveries.

Even more significant, digital science opens vast new horizons to scientific inquiry--many beyond the scope of any real-world laboratory. "A lot of things are possible now that weren't before," says Marvin L. Cohen, a theoretical physicist at the University of California at Berkeley.

Warping time is one example. Astrophysicists can scoop up enough cosmic dust to create a star, watch it wink to life, burn for billions of years, then flare into a supernova--all in minutes. At the opposite extreme, Scripps Research Institute researcher Charles Brooks has stretched an eyeblink into a workday. To get a handle on how the mysterious "unfolding" activity of proteins affects their function in the body, he split 1.5 billionths of a second of action into 750,000 images--enough for a seven-hour video.
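The arithmetic is easy to check. Here is a minimal sketch, assuming the finished video plays at a standard 30 frames per second (the playback rate is an assumption; the article gives only the frame count and the running time):

```python
# Back-of-the-envelope check of the protein video arithmetic.
# Assumption: the finished video plays at a standard 30 frames per second.
simulated_time = 1.5e-9        # seconds of molecular motion captured
frames = 750_000               # images the simulation was split into
playback_fps = 30              # assumed video frame rate

seconds_per_frame = simulated_time / frames      # simulated time per image
video_hours = frames / playback_fps / 3600       # running time at playback speed

print(f"Each frame spans {seconds_per_frame:.1e} s of molecular motion")
print(f"Playback runs about {video_hours:.1f} hours")   # roughly seven hours
```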

There's almost no end to the new types of experiments digital science can do. At New York University, a three-dimensional model of a human heart--beating inside a Cray C90 supercomputer 365 miles away--uses mathematical muscles to pump digital blood. Before NYU medical researchers Charles S. Peskin and David McQueen had access to the Cray at the Pittsburgh Supercomputing Center, the simulation wasn't feasible. Even now, producing each heartbeat takes hours of number-crunching. But with their heart, which won the 1994 Breakthrough Computational Science award from the Smithsonian Institution and Computerworld magazine, Peskin and McQueen can probe for cardiovascular cures in ways that would otherwise be impossible.

In the same manner, biochemists at the University of North Carolina at Chapel Hill can tug the molecule of a potential anticancer drug to a so-called receptor site on a protein, then twist and turn it to see if there's a tight "fit," or bonding force, which determines whether the drug can be effective. A scientist actually feels the strength of the bonding force in the mechanical arm used to steer the molecule.

At Australia's Monash University, researchers last summer used similar tools to develop a "designer" molecule that seems to disable flu viruses--permanently. The trouble with flu bugs is that they mutate and quickly develop resistance to drugs. So scientists are forever developing new vaccines. But viruses may not be able to overcome the Australian antidote because it attaches itself to a mutation-resistant receptor on the virus and prevents it from reproducing. A perpetual flu shot could be a billion-dollar blockbuster for Glaxo Holdings, the drugmaker that sponsored the research. Glaxo launched human trials early this year.

Such near-omnipotent power is fanning a new spirit of adventure in research and engineering: It's encouraging people to tackle projects that used to be too risky or daunting. The upshot, many scientists believe, is that digital science is crucial to any nation that aspires to do world-class research--and to harvest that knowledge for the seeds of tomorrow's high-tech industries. "It's a universal tool that touches every part of science," says Malvin H. Kalos, director of Cornell University's Theory Center.

CAD CATS. Materials scientists on the trail of future microchips at AT&T Bell Laboratories, for example, can now gain fresh insights by doing a takeoff on Honey, I Shrunk the Kids: They dive into a silicon crystal to watch electrons whizzing through semiconductor circuits. In pursuit of more efficient engines, Detroit researchers can climb inside and inspect how gasoline gets dispersed into tiny droplets before igniting. Adolf Coors Co. uses digital models in its search for lighter beer cans. DuPont Co. even simulates disposable diapers, seeking to make them more absorbent. And Caterpillar Inc. saves hundreds of thousands of dollars by using digital prototypes to test designs for earth-moving equipment. Says Donald R. Krull, Cat's head of product research: "Every industry is going to end up doing this."

That's particularly true now that digital science doesn't always require a supercomputer. At Xerox Corp.'s Palo Alto Research Center (PARC), scientist Ralph Merkle runs his experiments on a workstation. He's exploring nanotechnology--building molecular machines by laying atom on atom like a bricklayer putting up a house. Thanks to desktops with the muscle of yesterday's Crays, he says, "we're seeing a democratization and mass spread" of digital science.

Like many technologies, digital science won its first stripes with the Pentagon. After World War II, Washington plowed billions into computers and software to support code-breakers and designers of high-tech weapons. In the process, the feds spawned the supercomputer industry, high-speed data communications, and computer networking. Many software packages now used by industry also have Pentagon ancestry. For instance, auto makers crash-test cars in computer simulations using a program based on Dyna-3D, which was written to ensure that an Air Force bomber's nuclear eggs wouldn't crack open if the plane crashed.

Until the late 1980s, most scientists outside the defense establishment could only look on with envy. Martin Karplus, a Harvard University theoretical chemist, recalls having to cadge time at night on a hospital's computer. And Larry L. Smarr, an astrophysicist at the University of Illinois, had to hop to Germany and borrow time on a Cray at the Max Planck Institut.

Smarr soon got fed up and joined physicist Kenneth G. Wilson, a 1982 Nobel laureate then at Cornell, to lead a lobbying campaign in Washington. It culminated in 1985, when the National Science Foundation got $200 million to create four national supercomputer centers and a nationwide data-communications system, the NSFnet, to link the centers to three dozen universities. All told, the NSF and its university, state, and 88 industrial collaborators have invested close to $1 billion in these centers. And the money keeps rolling in. In 1991, President Bush signed the High Performance Computing & Communications Act (HPCC), a five-year initiative that is authorized to spend $1 billion in fiscal 1994.

By 1996, the curtain will rise on the next act--a new type of supercomputer dubbed "teraops." At peak speed, these machines will chew through at least a trillion operations per second, or teraops--10 to 50 times as many as today's best number-crunchers. Such fantastic speeds are essential for tackling so-called Grand Challenge problems. These are the knottiest puzzles facing science, such as developing fusion-energy reactors and understanding the global climate (see the table of Grand Challenges below).

To bypass an inherent speed limitation in traditional computers, teraops machines will be based on massively parallel processing (MPP), a technology pioneered by Thinking Machines, nCube, Intel Supercomputer Systems, and others. MPP systems have dozens to thousands of microprocessor chips that each pack the raw speed of an older Cray. "We need MPP speeds because we want to model an oil spill in real time, not get the results after several days and then go check to see if the model was right," says Frank L. Gilfeather, a professor of mathematics at the University of New Mexico and a co-designer of the new Maui High Performance Computing Center. "We want to predict what's going to happen--in time to go out and take preventive action."

At expected prices of up to $50 million, not many teraops machines will be installed before the turn of the century. But scientists attacking priority problems will have no trouble gaining access to one--wherever it may be. The NSFnet has evolved into the backbone of the Internet, and it now has links to 1,000 colleges plus an increasing number of fee-paying corporate sites.

Keeping the data humming to and from teraops computers will require the fast lanes of the Information Superhighway. These will be gigabit links, capable of pumping data by the billions of bits per second. Most of today's data pipelines top out at only 45 million bits per second, but MCI Communications Corp. is already installing a 2.5-gigabit network. That's enough to transmit all the text in the Encyclopaedia Britannica in one second.
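A back-of-the-envelope check of that claim, assuming the encyclopedia's full text runs to roughly 300 megabytes (the size figure is an assumption; the article gives only the line speeds):

```python
# Rough comparison of the bandwidth figures quoted above.
# Assumption: the encyclopedia's full text is on the order of 300 megabytes.
encyclopedia_bits = 300e6 * 8          # assumed text size, in bits
todays_pipe = 45e6                     # typical pipeline today, bits per second
gigabit_pipe = 2.5e9                   # MCI's planned network, bits per second

print(f"At 45 Mbps:  {encyclopedia_bits / todays_pipe:.0f} seconds")     # nearly a minute
print(f"At 2.5 Gbps: {encyclopedia_bits / gigabit_pipe:.1f} seconds")    # about one second
```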

GIGABIT PLAYERS. Such speeds should also boost on-line collaborations, with scientists ganging up on complex problems in virtual labs that exist nowhere yet anywhere. Before the NSFnet was turned on in 1986, it was unusual for multidisciplinary teams to include researchers from more than one institution. Today, it's common--which means that research projects are tapping the best scientific brains in the world. Gigabit links will bring a major new dimension to on-line science. They'll enable biologists, chemists, and biophysicists in Austin, Boston, and Bonn to watch the progress of a digital experiment in all three places at once.

Gigabit networks also will have enough two-way capacity for remote experimentation. Researchers will thus be able to use the best piece of scientific equipment for the task, no matter where it's located. Ultimately, this could "create a distributed laboratory across this country," says Dr. Judith L. Vaitukaitis, director of the National Center for Research Resources at the National Institutes of Health. The outlines of this virtual lab are starting to take shape. The University of California at San Diego's medical school is experimenting with remote access to its high-voltage transmission electron microscope--one of only six available to U.S. biologists. Now, a neuroscientist at the University of Tennessee can dial up the San Diego instrument to peer at 3-D images of brain cells.

At first, the heavy reliance on computers and mathematical models made many scientists nervous. Computer simulations are, after all, approximations. If a scientist really needed a model based on fundamental forces such as quantum chromodynamics and quantum mechanics--which, respectively, describe how quarks assemble into protons and neutrons and how electrons behave when orbiting an atom's nucleus--the simulation would be restricted to 10 or so atoms. Any more and even today's biggest supercomputer gags. So scientists devise clever shortcuts, judiciously omitting certain details. "That's where the art is," says Ralph Z. Roskies, co-director of the Pittsburgh Supercomputing Center. For example, digital imitations of the solar system treat the planets as monolithic bodies, not megamasses of atoms. And climatologists studying how the oceans influence weather patterns may ignore the sunlight reflected by clouds to squeeze in more data on ocean currents.

Juggling this so-called granularity of computer models is a constant headache. Almost always, some degree of accuracy must be sacrificed to get the job done in a reasonable time. Take the National Weather Service's 24-hour forecasts. These used to stem from a model that covered the U.S. with boxes 75 kilometers square. Inside these boxes, the weather was assumed to be uniform. Meteorologists knew this model's granularity was too coarse because thunderstorms and tornadoes could come and go in a cell unnoticed. But a finer grid would have taken so long to run that the weather could have arrived by the time the forecast was done. Earlier this year, the Weather Service upgraded from a Cray Y-MP to a faster Cray C90--and reduced the grid to 35-km-square boxes. Still, weather watchers are eagerly awaiting a teraops system.
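The trade-off shows up quickly in rough numbers. Here is a sketch assuming a forecast domain of about 5,000 by 3,000 kilometers over the continental U.S. and its surroundings (the domain size is an assumption; the article gives only the grid spacings):

```python
# How grid spacing drives the size of a weather model.
# Assumption: a forecast domain roughly 5,000 km x 3,000 km.
domain_x_km, domain_y_km = 5_000, 3_000

def boxes(spacing_km):
    """Number of horizontal grid boxes at a given spacing."""
    return (domain_x_km // spacing_km) * (domain_y_km // spacing_km)

old, new = boxes(75), boxes(35)
print(f"75-km grid: {old:,} boxes")
print(f"35-km grid: {new:,} boxes ({new / old:.1f}x more)")
# Halving the spacing roughly quadruples the boxes -- and the time step
# usually has to shrink as well, so the computing bill grows even faster.
```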

CLINCHER. Similarly, older Crays at NASA's Ames Research Center didn't have enough muscle to simulate the flow of hypersonic air over a plane's entire fuselage as it moved at several times the speed of sound. So researchers had to piece together results computed separately for different sections of the aircraft. With a new Cray C90, however, far more detailed and realistic simulations are possible.

Such successes have gradually persuaded skeptics that digital science is for real. In 1989, Berkeley's Cohen predicted that carbon and nitrogen atoms arranged in a certain crystalline structure would be harder than a diamond. Scientists at Lawrence Berkeley Laboratory, Northwestern University, and Harvard then produced minute amounts of the new carbon-nitride material--though a process for fabricating commercial quantities has yet to be devised.

Another clincher was an IBM experiment that provided the best proof so far of quantum chromodynamics, or QCD. The equations laying out this esoteric theory are so convoluted that IBM built a custom-designed MPP computer with 566 processors to crunch the QCD math at top speed. The machine ran continuously for a year before the results popped out in early 1993. Physicists were cheered when the answers confirmed most interpretations of experiments with particle accelerators.

Just because a digital experiment works doesn't necessarily mean the results are valid. Computers can be used to model "a lousy experiment or a lousy theory," says Robert J. Silbey, head of Massachusetts Institute of Technology's chemistry department. Until other researchers spot the errors, he concedes, "we may go off in the wrong direction for a while." That's accepted within the culture of science. But it can present major problems when simulations are used to formulate public policy. For example, although climate models generally predict global warming, they contradict one another on details that would help decide what to do to counter the greenhouse effect.

HOT TOPICS. Still, the incidence of lousy models should diminish as digital science permeates classrooms and produces more graduates with good simulation skills. Among the more advanced projects is the University of Washington's molecular biology program, headed by Leroy Hood and backed with $12 million from William H. Gates III, chairman of Microsoft Corp. Digital science has become "essential and fundamental to biology," says Hood, for tracking down the connections between the estimated 50,000 to 100,000 human genes or determining how billions of neurons in the human brain interact to create thought and perception.

That's true not just in biology. "Complexity" is one of the hot topics in contemporary science, and proponents believe it will turn reductionist science on its ear. Traditional science has groped for understanding by reducing complex systems to their simplest elements. But this often falls short of a complete explanation, especially when so-called emergent behavior is a factor. Many natural systems made up of simple entities--ant colonies are a good example--exhibit a capacity for self-organization that can't be explained by the characteristics of the individual members. Complexity theory aims to discover how self-organization comes about.
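The flavor of such models can be caught with a toy example (a generic illustration, not any particular research group's code): a row of cells, each following one simple local rule, organizes itself into ordered blocks that no individual cell "knows" about.

```python
# Toy illustration of self-organization: each cell copies the majority of
# itself and its two neighbors.  No cell sees the whole row, yet ordered
# blocks emerge from random noise within a few steps.
import random

random.seed(1)
cells = [random.randint(0, 1) for _ in range(60)]   # random starting row

for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    n = len(cells)
    cells = [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]
```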

If the secret is found, says John H. Holland, a professor of psychology at the University of Michigan, complexity simulations could one day serve as "flight simulators for social policy"--helping government leaders navigate the waters of partisan politics to achieve the greatest good for the majority.

That challenge may elude even teraops computers. But the next generation of supercomputers after them--petaops systems, which will spit out calculations by the quadrillions every second--should arrive early next century. Their work will be waiting.

DIGITAL SCIENCE'S GRAND CHALLENGES

Nine Washington agencies are coordinating their digital science efforts to solve 30-odd Grand Challenges--projects that are beyond the capability of any single researcher or lab. Here's a sampling:

COMPUTATIONAL BIOLOGY Understand the components of human and other genomes, and develop software for simulating biological structures. Various projects are being sponsored by the Energy Dept., the National Science Foundation, G.D. Searle, Carnegie Mellon University, and the universities of Tennessee, Houston, and Illinois.

MATHEMATICAL COMBUSTION MODELING Apply computational fluid dynamics to understand the combustion process, which may point to more fuel-efficient car engines. Managed by the Energy Dept.

QUANTUM CHROMODYNAMICS CALCULATIONS Model the physics of elementary particles in atoms and crystalline lattices. Funded by the Energy Dept. and the National Science Foundation.

NUMERICAL TOKAMAK PROJECT Integrate particle- and fluid-plasma models to guide development of fusion-energy reactors. Headed by the Energy Dept.

FIRST PRINCIPLES SIMULATION OF MATERIALS Develop advanced techniques to help discover new materials, using quantum chromodynamics and quantum mechanics. Funded by the Energy Dept. A related project on atomic-level simulations of materials is funded by the National Science Foundation and headed by the California Institute of Technology.

GLOBAL CLIMATE MODELING Expand the theoretical basis for climate dynamics, especially the role of oceans. Sponsored by the Energy Dept. Related programs, focusing on weather forecasting and modeling the dispersion of pollutants, are run by the National Oceanic & Atmospheric Administration and the National Science Foundation.