Sure, It Can Drive, But How Is It At Changing Tires?

If you spot a vehicle without a driver cruising down a Pittsburgh-area road, it's probably not a Hollywood stunt for a remake of The Invisible Man. Chances are, it's Alvinn, the prototype of a self-driving car created by a band of graduate students and professors at Carnegie Mellon University.

Alvinn (Autonomous Land Vehicle In a Neural Network) is one of the most promising demonstrations of the powers of neural network technology - computer programming that simulates how neurons in the human nervous system relay millions of bits of data to the brain for processing. In Alvinn's case, incoming video signals are fed to a network of computers that constantly matches them against the thousands of video images it already has learned, such as trees, parked cars, and pavement.
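For readers curious what that kind of pattern matching looks like in code, here is a minimal sketch in Python with NumPy. It is an illustration only, not the CMU system: the layer sizes are invented, the weights are random placeholders standing in for what a trained network would have learned, and the three categories are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PIXELS = 30 * 32   # a small "retina" of video pixels (size chosen for this sketch)
N_HIDDEN = 5         # a handful of simulated neurons
N_CLASSES = 3        # hypothetical categories: say, road, tree, parked car

# In a trained network these weights would encode the thousands of video
# images already learned; here they are random stand-ins.
W1 = rng.normal(scale=0.1, size=(N_PIXELS, N_HIDDEN))
W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_CLASSES))

def classify(frame):
    """Forward pass: pixels -> simulated neurons -> a score per category."""
    hidden = np.tanh(frame @ W1)    # each hidden unit relays a summary of the pixels
    scores = hidden @ W2            # scores measure how well the frame matches each category
    return int(np.argmax(scores))   # index of the best-matching category

frame = rng.random(N_PIXELS)        # stand-in for one incoming video frame
category = classify(frame)          # always one of 0, 1, 2
```

The point of the sketch is the flow: incoming pixels are pushed through layers of simple units, and the network's answer is whichever stored pattern the frame most resembles.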

SNOOPING JEEP. A self-driving vehicle has been high on the agenda at the Defense Advanced Research Projects Agency (DARPA) for nearly a decade. Scientists there envision the day when a driverless jeep could scout danger zones, detect land mines, and snoop around behind enemy lines without endangering the lives of soldiers. But writing computer programs that mimic all the coordinated decisions a human driver unconsciously makes has proved harder than first imagined. After seven years and $22 million in research expenses, Martin Marietta Corp. produced a prototype that, guided by conventional computers, chugs along at about 10 mph and has trouble staying on the road for more than a mile or two.

The Carnegie Mellon team, led by research scientist Dean A. Pomerleau, began working on a neural-net chauffeur around the time DARPA launched a $33 million, three-year push into brainlike computer circuits in late 1989. Funding went to several projects, including one at Westinghouse Electric Corp. (table) as well as Alvinn. So far, CMU researchers have spent about $10 million, provided by Digital Equipment Corp. and the National Science Foundation as well as DARPA.

Neural nets are also a big research subject in Japan, where the Ministry of International Trade & Industry (MITI) plans to kick off the Real World Computing Project on Apr. 1. With as much as $1 billion in funding over 10 years, its aim is to explore a range of technologies such as neural nets that are capable of "flexible" computing, which is patterned on the free-flowing thought processes of the brain.

Back in the U.S., a Stanford University professor and medical doctor named Greg Kovacs is working on what he calls a nerve chip. It's a microchip with thousands of tiny holes that could, in theory, reconnect damaged nerves. Bernard Widrow, a Stanford professor of electrical engineering, suggests that it could one day help restore feeling to a hand that was severed in an accident. The idea, he says, is to feed the nerve endings from the hand to the arm through the chip's tiny holes. "The neural net would learn how to remap the pathways," says Widrow. In tests on a monkey, some nerve functions were restored, he says.

EXIT ANXIETY. Alvinn is much closer to proving the practicality of neural-net applications - such as robots that "see." Alvinn learns to drive by watching a human do it for about five minutes. Once its cameras record how a person drives down a two-lane highway or a dirt path, it never has to learn that particular road pattern again.

Currently, it knows about a dozen different road types. A network of onboard computers--Sun Microsystems Inc. workstations named Moe, Larry, Curly, and Shemp--processes the video data. Their programming tells them how to react to each type of object, braking for a pedestrian, say.
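The "learns by watching a human for about five minutes" step is, in machine-learning terms, training from demonstration: the network is nudged until its outputs match the recorded human's. A toy sketch, again in Python with NumPy, with a single linear layer and invented dimensions standing in for Alvinn's real network and training code:

```python
import numpy as np

rng = np.random.default_rng(1)

N_PIXELS = 8                     # tiny stand-in for a real camera frame
frames = rng.random((100, N_PIXELS))   # frames recorded while a human drives
true_w = rng.normal(size=N_PIXELS)
steering = frames @ true_w             # the human's steering, used as the targets

w = np.zeros(N_PIXELS)           # the network's weights, initially untrained
lr = 0.1
for _ in range(2000):            # replay the demonstration repeatedly
    pred = frames @ w
    grad = frames.T @ (pred - steering) / len(frames)
    w -= lr * grad               # nudge the weights toward the human's behavior

err = np.mean((frames @ w - steering) ** 2)   # how far from the human it still is
```

Once the error is small, the network reproduces the demonstrated steering for that road pattern, which is why, as the article notes, Alvinn never has to learn a particular road again.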

By last summer, the first Alvinn, a modified Chevy van, was cruising suburban Pittsburgh, pausing at stop signs, swerving to avoid dogs, and otherwise doing a good job of pretending to be human. By last fall, when the system was transplanted to an Army ambulance, it set an autonomous-land-vehicle speed record of more than 55 mph while covering a distance of 21 miles. (During such tests, a human usually sits behind the wheel, as an emergency backup.)

But Alvinn still has some glitches, admits Charles E. Thorpe, the CMU professor who manages the project. On highways, it has a tendency to exit at every off-ramp: It positions itself in the righthand lane by tracking the edge of the roadway, but it can't always distinguish between an ordinary curve and an off-ramp. Also, when it snows, the video cameras that specialize in perceiving colors lose the ability to see the black road and the yellow divider lines. "It has some of the same troubles that humans experience in a whiteout," he says.

So who needs a self-driving car? Maybe nobody but the Pentagon. Still, Pomerleau foresees commercial applications, including a sort of robo-mailman. Using a robotic arm and computerized mapping software, a self-driving van would travel up and down streets, tossing newspapers and mail packets on front lawns. If the vehicle came up against an obstacle it didn't recognize, it could brake, call a human dispatcher, and get specific instructions.

The system could also be miniaturized and packaged with conventional cars - as the ultimate form of cruise control, Pomerleau predicts. He's afraid, however, that high product-liability insurance costs would keep such a product off the market. "We don't think we'll ever perfect it," he says. "People aren't 100% reliable in their driving--so how could this be?"

These projects are on the frontier of neural networks, in which a computer "recognizes" patterns such as the shapes of objects.

AUTOMATIC AUTO Carnegie Mellon is developing an "autonomous land vehicle" called Alvinn that can cruise around town using video camera "eyes" instead of a human driver.

NERVE MENDING At Stanford University, researchers are testing a computer chip that can help compensate for damaged human nerves. If your hand were severed and reattached, the nerves from your arm could relay signals through the chip to your hand.

JAPANESE FUTURISM MITI's Real World Computing Project, sequel to the disappointing Fifth Generation artificial intelligence project, kicks off in April with as much as $1 billion in funding. The 10-year goal: an intuitive learning machine.

MACHINE VISION Westinghouse is using neural networks to recognize the image of a tank and make a friend-or-foe decision by matching the image against those of different tanks in its memory.

LANGUAGE TRANSLATOR The National Security Agency is creating a program called Tipster that would read stacks of Japanese documents, translate them into English, and, based on content, electronically forward each one to the right