Inside the Arctic Circle, Where Your Facebook Data Lives
Every year, computing giants including Hewlett-Packard, Dell, and Cisco Systems sell north of $100 billion in hardware. That’s the total for the basic iron—servers, storage, and networking products. Add in specialized security gear, data analytics systems, and related software, and the figure gets much, much larger. So you can understand the concern these companies must feel as they watch Facebook publish more efficient equipment designs that directly threaten their business. For free.
The Dells and HPs of the world exist to sell and configure data-management gear to companies, or rent it out through cloud services. Facebook’s decision to publish its data center designs for anyone to copy could embolden others to bypass U.S. tech players and use low-cost vendors in Asia to supply and bolt together the systems they need.
Instead of buying server racks from the usual suspects, Facebook designs its own systems and outsources the manufacturing work. In April 2011, the social networking company began publishing its hardware blueprints as part of its so-called Open Compute Project, which lets other companies piggyback on the work of its engineers. The project now sits at the heart of the data center industry’s biggest shift in more than a decade. “There is this massive transition taking place toward what the new data center of tomorrow will look like,” says Peter Levine, a partner at venture capital firm Andreessen Horowitz. “We’re talking about hundreds of billions if not trillions of dollars being shifted from the incumbents to new players coming in with Facebook-like technology.” (Bloomberg LP, which owns Bloomberg Businessweek, is an investor in Andreessen Horowitz.)
The heart of Facebook’s experiment lies just south of the Arctic Circle, in the Swedish town of Luleå. In the middle of a forest at the edge of town, the company in June opened its latest megasized data center, a giant building that comprises thousands of rectangular metal panels and looks like a wayward spaceship. By all public measures, it’s the most energy-efficient computing facility ever built, a colossus that helps Facebook process 350 million photographs, 4.5 billion “likes,” and 10 billion messages a day. While an average data center needs 3 watts of energy for power and cooling to produce 1 watt for computing, the Luleå facility runs nearly three times leaner, at a ratio of 1.04 to 1. “What Facebook has done to the hardware market is dramatic,” says Tom Barton, the former chief executive officer of server maker Rackable Systems. “They’re putting pressure on everyone.”
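Those ratios correspond to what the industry calls power usage effectiveness, or PUE: total facility power divided by the power that actually reaches the computing equipment (the article doesn't name the metric, so the mapping is an inference). A minimal sketch of the arithmetic behind the "nearly three times leaner" claim:

```python
def pue(total_facility_watts: float, it_watts: float) -> float:
    """Power usage effectiveness: total power drawn by the facility
    (computing plus cooling, power distribution, etc.) divided by the
    power delivered to the IT equipment itself. 1.0 is the ideal floor."""
    return total_facility_watts / it_watts

# A conventional data center, per the figures above:
# 3 watts in for every 1 watt of computing.
conventional = pue(3.0, 1.0)   # PUE of 3.0

# The Lulea facility: 1.04 watts in for every 1 watt of computing.
lulea = pue(1.04, 1.0)         # PUE of 1.04

# 3.0 / 1.04 is about 2.9 -- "nearly three times leaner."
print(f"conventional PUE: {conventional:.2f}")
print(f"Lulea PUE:        {lulea:.2f}")
print(f"improvement:      {conventional / lulea:.1f}x")
```

At a PUE of 1.04, only about 4 percent of the facility's power goes to anything other than the servers themselves, which is why free outside-air cooling matters so much.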
The location has a lot to do with the system’s efficiency. Sweden has a vast supply of cheap, reliable power produced by its network of hydroelectric dams. Just as important, Facebook has engineered its data center to turn the frigid Swedish climate to its advantage. Instead of relying on enormous air-conditioning units and power systems to cool its tens of thousands of computers, Facebook allows the outside air to enter the building and wash over its servers, after the building’s filters clean it and misters adjust its humidity. Unlike a conventional, warehouse-style server farm, the whole structure functions as one big device.
To simplify its servers, which are used mostly to create Web pages, Facebook’s engineers stripped away typical components such as extra memory slots and cables and protective plastic cases. The servers are basically slimmed-down, exposed motherboards that slide into a fridge-size rack. The engineers say this design means better airflow over each server. The systems also require less cooling, because with fewer components they can function at temperatures as high as 85F. (Most servers are expected to keel over at 75F.)
When Facebook started to outline its ideas, traditional data center experts were skeptical, especially of hotter-running servers. “People run their data centers at 60 or 65 degrees with 35-mile-per-hour wind gusts going through them,” says Frank Frankovsky, Facebook’s vice president of hardware design and supply chain operations, who heads the Open Compute Project. Its more efficient designs have given the company freedom to place its data centers beyond the Arctic. The next one will go online in Iowa, where cheap wind power is plentiful. The company has also begun designing its own storage and networking systems. Frankovsky describes the reaction from hardware suppliers as, “Oh my gosh, you stole my cheese!”
HP has responded by unveiling a server and networking system called Moonshot, which runs on extremely low-power chips and stands as the company’s most radical data center advance in years. HP is also working on servers that increase efficiency through water cooling. The company has no problem saying Facebook’s designs were a major impetus. “I think Open Compute made us get better,” says HP Vice President Paul Santeler. “It’s amazing to see what Facebook has done, but I think we’ve reacted pretty quickly.”
By contrast, Cisco downplays the threat posed by the project’s designs. Few companies will want the hassle of buying specialized systems designed primarily for the needs of consumer Web companies, says Cisco spokesman David McCulloch: “Big picture, this is not a trend we view as detrimental to Cisco.” Dell created a special team six years ago to sell no-frills systems to consumer Web companies, and its revenue has grown by double digits every year since. “It sure doesn’t feel like we’re getting driven out of business,” says Drew Schulke, Dell’s executive director of data center solutions.
The custom hardware designed by Web giants such as Google and Amazon.com has remained closely guarded, but Facebook’s openness has raised interest in its data center models beyond Internet companies. Facebook has provided a road map for any company with enough time and money to build its own state-of-the-art data megafactory. Executives from Intel and Goldman Sachs have joined the board of the Open Compute Project’s foundation, a 501(c)(6) corporation chaired by Facebook’s Frankovsky. Taiwanese hardware makers such as Quanta Computer and Tyan Computer have started selling systems based on Open Compute designs. Facilities on the scale of Luleå, which can cost as much as $300 million to build, will continue to be outliers, but companies of all sizes can take advantage of the cheaper, more power-efficient equipment.
Wall Street has tried to push mainstream hardware makers toward simpler, cheaper systems for years, but its firms didn’t have enough purchasing clout, says George Brady, an executive vice president at Fidelity Investments who’s bought Open Compute-based systems. “We have tens of thousands of servers, while Google and Facebook have hundreds of thousands or millions of servers,” he says. Now, though, Fidelity can afford to build its own data centers in places like Omaha, where it also has found cheap land and power. “Facebook is getting us to these common components,” says Brady. “It’s like the work done 100 years ago on the automotive assembly lines to nail down the key principles behind a big industrial movement.”