GE's Billion-Dollar Bet on Big Data
General Electric’s first research laboratory was housed in a barn in upstate New York; its newest is going up in Silicon Valley. In a vivid illustration of how the locus of U.S. innovation has shifted from the East to the West Coast, GE is pouring $1 billion into a facility in San Ramon, Calif., that will be staffed with as many as 400 people.
New hires for the Global Software Center, which is set to open in June, are coming from Oracle, SAP, and Symantec. Bill Ruh, the vice president running the venture, was lured away from Cisco Systems last year. The tech industry veteran says persuading developers to forgo windfalls from initial public offerings to come work at an industrial stalwart is not as difficult as one might think. “They want to be in on the Next Big Thing,” he says.
The big thing Ruh is referring to is called “big data,” the fast-growing market for information technology systems that can sift through massive amounts of data to help companies make better decisions. Just as information on millions of Facebook users is prized by advertisers, the details companies amass from their operations can be used to cut costs and boost profits. Norfolk Southern, which buys diesel locomotives from Fairfield (Conn.)-based GE, uses customized software to monitor rail traffic, reducing congestion and allowing trains to move at higher speeds. The fourth-largest U.S. railroad estimates that making trains run an average of 1 mile per hour faster will save more than $200 million.
The potential for such technologies is so huge that it’s impossible to come up with an estimate of how much the market is worth, according to Michael Chui, a senior fellow at McKinsey. “It’s just too big,” he says. That doesn’t mean there’s room for all comers, according to Ping Li of Accel Partners, a venture capital firm investing in big-data startups. “If you’re not getting in right now, it’s hard to see how you can keep up with the pace of innovation,” he says.
GE’s annual revenue from software is already about $3 billion and on pace to grow to $5 billion in the next couple of years, Chief Executive Officer Jeffrey Immelt told investors in December. Ruh says he wants to marry big data with some of GE’s biggest businesses. He sees an opportunity in helping airlines that buy GE jet engines monitor their performance and anticipate maintenance needs, reducing costly flight cancellations. The technology could also help companies that lease commercial vehicles from GE Capital optimize delivery routes and provide early warning that a truck may need a trip to the repair shop. “If I can begin to see that something is starting to deteriorate and get out there and fix it before it breaks, that’s a foundational change,” Ruh says. “In the end, what everybody wants is predictability.”
When it comes to big data, GE is playing catch-up to IBM. The world’s biggest computer-services company is working with energy companies to extend the lives of oil and gas fields by improving oil recovery through analytics. IBM also is working with Vestas Wind Systems to find better locations for wind farms. Newer entrants are jumping in as well. Splunk, a San Francisco-based startup that just went public, says its customer rolls exceeded 3,700 as of the end of January.
GE is counting on its expertise in making industrial equipment—from gas-fired electrical turbines to locomotives—to give it an advantage over rivals focused exclusively on providing data solutions, says Ruh. “If you don’t have deep expertise in how energy is distributed or generated, if you don’t understand how a power plant runs, you’re not really going to be able to build an analytical model and do much with it,” he says. “We have deep insight into several very specific areas. And that’s where we’re staying focused.”