After getting a Ph.D. from the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Lab in 1997, Timothy Tuttle never imagined it would take nearly 15 years for his education to find real-world relevance. Two years ago, he co-founded San Francisco-based Expect Labs to build software for a time when humans no longer hunt down information by typing queries into a text box on a PC. Instead, Tuttle sees the day coming when searches take place automatically in real time based on things people are saying and doing.
This fall, Expect Labs is releasing MindMeld, a free app that adds the company’s voice-powered search technology to the iPad’s videoconferencing capability. During a conversation between as many as eight people, the app picks up keywords and uses GPS to provide relevant, location-targeted information in an on-screen panel. “We’re attempting to glean insight by seeing where you are, seeing who you’re talking to, and listening to what you say,” Tuttle says. “We then use all that information to try and find stuff in advance of you actually needing to ask for it.”
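The flow Tuttle describes, pulling keywords out of a live conversation and pairing them with the device's GPS position to search before anyone asks, can be sketched in broad strokes. Everything below is a hypothetical illustration: the function names, the stopword filter, and the query shape are invented for this sketch and are not Expect Labs' actual implementation.

```python
# Hypothetical sketch of anticipatory, context-aware search: keywords
# extracted from a conversation transcript, combined with GPS
# coordinates, form a query that can be fired off automatically.
# All names here are illustrative, not MindMeld's real API.

STOPWORDS = {"the", "a", "an", "to", "of", "and", "we", "you", "is", "in"}

def extract_keywords(transcript: str, top_n: int = 3) -> list[str]:
    """Pick the most frequent non-stopword terms from a speech transcript."""
    counts: dict[str, int] = {}
    for word in transcript.lower().split():
        word = word.strip(".,!?")
        if word and word not in STOPWORDS:
            counts[word] = counts.get(word, 0) + 1
    return sorted(counts, key=counts.get, reverse=True)[:top_n]

def build_query(transcript: str, lat: float, lon: float) -> dict:
    """Combine conversational keywords with GPS context into one search query."""
    return {
        "keywords": extract_keywords(transcript),
        "location": {"lat": lat, "lon": lon},
    }

# Example: a repeated term in conversation ("sushi") dominates the
# keywords, and the coordinates localize the results.
query = build_query(
    "We should grab sushi after the meeting, maybe sushi near the office",
    37.7749, -122.4194,  # San Francisco
)
```

In a real system the transcript would come from continuous speech recognition and the query would hit a search backend; the point of the sketch is only the pairing of conversational keywords with location context so results can surface "in advance of you actually needing to ask."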