All Timothy J. Berners-Lee wanted at first was a simple computer program that could tie figurative reminder strings on his fingers to help him remember associations. In 1980, while working at CERN, the European Organization for Nuclear Research near Geneva, the British-born computer scientist was having trouble keeping track of which computers CERN physicists were using to analyze which experiments. Then he decided to extend the virtual memory links hither and yon to CERN's 5,000 researchers -- and then to anyone with a computer anywhere. His simple program was the Big Bang that exploded into the World Wide Web, launching a still-unfinished revolution and touching countless lives.
Until the Web made its debut in 1991, though, most people had a tough time grasping the need for such connectivity. Even CERN management was skeptical. Today, muses Berners-Lee, what's hard to explain is why the concept ever seemed abstruse.
Perhaps the same will prove true for his encore effort. As head of the World Wide Web Consortium, based at Massachusetts Institute of Technology, Berners-Lee is coordinating a global team bent on hatching the Semantic Web. As the name implies, the new Web will help computers understand one another automatically. With it, machines will do at electronic speeds many chores that now require a human. They'll be armed with tools that enable them to follow links independently and extract information from disparate databases, spreadsheets, and digital images.
The Semantic Web works, says Berners-Lee, by describing people, objects, and events in the real world "and then explaining to the computer how the data relate to those things." For example, cancer researchers will be able to dispatch programs that cross-correlate all relevant data on the new Web, including information stored in incompatible chemistry and physics archives that doctors and biotech researchers may have never seen. But Berners-Lee admits that selling this vision has been difficult.
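The pattern Berners-Lee describes -- data recast so a machine can follow links across otherwise incompatible archives -- can be sketched with a toy example. Everything below (the compound, protein, and tumor names, and the two "archives") is hypothetical and invented for illustration; real Semantic Web data uses RDF triples and standard vocabularies, but the subject-predicate-object shape is the same.

```python
# Toy illustration of the Semantic Web's core idea: facts stored as
# subject-predicate-object triples, so data from separate sources can be
# merged and queried mechanically. All names here are made up.

# Hypothetical records from two incompatible archives, recast as triples.
chemistry_archive = [
    ("compound:X17", "inhibits", "protein:p53-mutant"),
    ("compound:X17", "has_formula", "C20H25N3O"),
]
physics_archive = [
    ("protein:p53-mutant", "observed_in", "tumor:lung-adenocarcinoma"),
]

def query(triples, predicate):
    """Return (subject, object) pairs matching a predicate."""
    return [(s, o) for s, p, o in triples if p == predicate]

# Once merged, a program can chain links across the archives:
# which compounds relate, indirectly, to which tumors?
merged = chemistry_archive + physics_archive
hits = []
for compound, protein in query(merged, "inhibits"):
    for prot, tumor in query(merged, "observed_in"):
        if prot == protein:
            hits.append((compound, tumor))
print(hits)
```

The cancer-research scenario in the article is, in miniature, this join: neither archive alone connects the compound to the tumor, but the shared protein identifier lets a program bridge them without human help.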
Discovering what drives Berners-Lee can also be a challenge, in part because he is notoriously protective of his privacy. Clearly some of his choices were idealistic. In 1990, at age 35, he chose not to patent his Web software. (Doing so might have made him -- not Bill Gates -- the world's richest person.) Berners-Lee worried that if the Web were based on proprietary software, it would trigger the development of many rival webs by the likes of Microsoft (MSFT), IBM (IBM), and Apple Computer (AAPL). If they weren't compatible, his vision of connecting everyone everywhere would go nowhere.
Entrepreneurs who cashed in on Berners-Lee's brainchild -- folks such as Jeffrey P. Bezos of Amazon.com (AMZN) -- have long said that their benefactor deserves a juicy reward. He finally got it last June. A week after his 49th birthday, Berners-Lee went to Helsinki to collect 1 million euros, or $1.2 million, as the first winner of the Millennium Technology Prize. Funded by Finland's government and industry, the award honors achievements in technology, since there is no Nobel prize for engineering or computer science. In Helsinki, Berners-Lee modestly explained that the Web is a composite of lots of existing things, to which he contributed only a wee bit. Maybe. But that bit brought the world a lot closer together.
By Otis Port