Neural networks are a special breed of computer circuit, loosely patterned after the "wiring" in the brain. What sets them apart is their uncanny ability to learn from experience. For instance, they can be taught by repeated exposure to recognize faces and understand speech--feats that are extraordinarily hard for ordinary computers. But training a neural net is a time-consuming chore, especially when the network must be simulated with a regular computer because of the paucity of neural-net hardware. The snail's pace of learning is crimping progress, says Les E. Atlas, director of the Interactive Systems Laboratory at the University of Washington. He is itching to get his hands on the lightning-fast neurocomputer that will be introduced late this year by Adaptive Solutions Inc., a Beaverton (Ore.) startup.
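The "learning from experience" the article describes can be seen in miniature in the classic perceptron rule: a single artificial neuron nudges its connection weights a little after every example, over many repeated passes through the data. This is a minimal illustrative sketch, not the actual algorithm or code used by any system in the article; the OR-gate task and all names are assumptions for the example.

```python
# A single artificial neuron (perceptron) trained by repeated exposure.
# Illustrative only -- not the method used by any system in the article.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Adjust weights slightly after each example, over many passes."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):              # repeated exposure to the data
        for x, target in samples:
            out = 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0
            err = target - out           # how wrong was the guess?
            w[0] += lr * err * x[0]      # nudge weights toward the answer
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0

# A simple pattern: respond when either input is "on" (logical OR)
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Each pass over the data is cheap, but real tasks like face or speech recognition involve thousands of weights and millions of examples, which is why simulating this update loop on a conventional computer is so slow.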
The guts of ASI's system are a powerful chip with neural-net circuits embedded directly in the silicon. Developed by Inova Microelectronics Inc., the chip sports 64 neural processors that operate in parallel--a total of 11 million transistors, or nearly 10 times the raw power of Intel Corp.'s 486 microprocessor. And each neurocomputer has four such chips. On pattern-recognition tasks, boasts ASI, its $55,000 machine is 100 times faster than any supercomputer. Professor Atlas figures his work with a large database of heart images, which now takes 15 weeks per training run, should shrink to just a few hours.
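A back-of-envelope check of Atlas' estimate: taking the claimed 100-fold speedup at face value, a 15-week training run compresses to roughly a day. The figures below are simple arithmetic on the numbers quoted in the article, not additional reported data.

```python
# Back-of-envelope check of the claimed speedup, using only
# numbers quoted in the article (15 weeks, 100x faster).

weeks_now = 15
hours_now = weeks_now * 7 * 24   # 15 weeks in hours
speedup = 100                    # ASI's claimed advantage

hours_after = hours_now / speedup
print(f"{hours_now} hours / {speedup}x = {hours_after} hours")
```

At a strict 100x, 2,520 hours becomes about 25 hours--closer to a day than "a few hours," so the estimate presumably assumes the speedup on this workload exceeds the quoted figure.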