Getting Graphics Chips to Think on Their Own

Nvidia’s processors are powering breakthroughs in deep learning.

Jen-Hsun Huang, Nvidia’s chief executive officer, holds an Nvidia Drive PX autopilot computer at the 2015 GPU Technology Conference.

Photographer: David Paul Morris/Bloomberg

Nvidia’s microprocessors have long been the chips of choice for computer game addicts who crave realistic graphics as they chase aliens or battle enemy soldiers. The same powerful semiconductors are now being put to new uses at companies including Alibaba, Facebook, Google, and Microsoft. Nvidia’s graphics chips underpin speech recognition systems, software to develop gene therapies, and programs that transform satellite photos into detailed maps.

Researchers at DeepMind, a Google-owned lab in London, harnessed thousands of Nvidia’s K40 graphics processors, which cost $3,000 apiece, to train a computer to play Go, an ancient board game. In what was praised as a milestone in artificial intelligence, DeepMind’s machine beat a European Go champion in five out of five matches last year. In March it will take on the world’s top-ranked professional player.

Artificial intelligence’s big advance over traditional software is that it can learn and improve without being explicitly reprogrammed by humans: An AI program designed to pick out cars in random images gets better the more pictures it’s exposed to. Graphics processing units, or GPUs, are well suited to this kind of pattern recognition because they can perform thousands of simple calculations at the same time. Standard central processors such as Intel’s, by contrast, race through complex calculations one after another but can do far fewer things in parallel.
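To make that difference concrete, here is a minimal, illustrative sketch (not from the article) of the same simple arithmetic written both ways: the GPU version assigns one thread to each data element, so a million-element operation of the repetitive kind that dominates neural-network workloads runs as a single parallel step, while the CPU version walks the array in a sequential loop.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// GPU version: one thread per element. All elements are processed in
// parallel, which suits the simple, repetitive arithmetic of
// pattern-recognition workloads.
__global__ void scale_gpu(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = 2.0f * in[i];
}

// CPU version of the same work: one element at a time, sequentially.
void scale_cpu(const float* in, float* out, int n) {
    for (int i = 0; i < n; ++i) out[i] = 2.0f * in[i];
}

int main() {
    const int n = 1 << 20;  // about a million elements
    float *in, *out;
    cudaMallocManaged(&in,  n * sizeof(float));  // memory visible to CPU and GPU
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = 1.0f;

    // Launch enough 256-thread blocks to cover all n elements at once.
    scale_gpu<<<(n + 255) / 256, 256>>>(in, out, n);
    cudaDeviceSynchronize();  // wait for the GPU before reading results
    printf("GPU result: out[0] = %.1f\n", out[0]);

    scale_cpu(in, out, n);  // same computation, one loop iteration at a time
    printf("CPU result: out[0] = %.1f\n", out[0]);

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```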

The concept of using graphics chips for AI got a big boost in 2012, when a team of researchers at the University of Toronto used Nvidia’s GPUs to build an award-winning image classification system. The breakthrough was helped by the chipmaker’s support of CUDA, a programming platform that lets developers repurpose GPUs for tasks other than graphics. Rival Advanced Micro Devices hasn’t made a comparable investment, which has hampered the adoption of its graphics chips in this emerging field. Nvidia says about 3,500 businesses and organizations are using its GPUs for AI and data analysis, up from 100 a couple of years ago.
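For a sense of what that repurposing looks like, here is a hedged sketch of a complete CUDA program (illustrative details assumed, not drawn from the article): it runs SAXPY, a generic multiply-and-add over arrays with nothing graphical about it, on the GPU. The `__global__` keyword is what marks an ordinary-looking C++ function as code the GPU executes.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY (y = a*x + y): generic arithmetic with nothing graphical about it,
// the kind of routine CUDA lets developers offload to the GPU.
__global__ void saxpy(float a, const float* x, float* y, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 4096;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(3.0f, x, y, n);  // one thread per element
    cudaDeviceSynchronize();

    printf("y[0] = %.1f (expected 5.0)\n", y[0]);  // 3*1 + 2
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```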

AI plays a role in everything from Google searches to self-driving cars, which is “one reason we’re optimistic on [Nvidia’s] data center business,” says Craig Ellis, an analyst at B. Riley, a boutique investment bank. “Their parallel-processing architecture is just naturally superior on an increasing number of workloads, which includes AI,” he says.

Data centers are a relatively new area for Nvidia, which draws the bulk of its $5 billion annual revenue from its PC graphics business. While it’s eked out growth as computer gamers continue to shell out for more powerful components, the company needs to counteract a four-year slump in PC sales. “Our GPU is now moving from software development into hyperscale data center production. That’s quite exciting,” says Chief Executive Officer Jen-Hsun Huang. Once a company figures out how to apply AI to its business, it tends to buy a lot of GPUs, he says. Still, luring customers away from Intel’s Xeon processors, the heart of more than 99 percent of the world’s servers, may prove difficult.

Nvidia will also face competition from startups, such as Movidius and Nervana, that are building AI-optimized chips. Nvidia’s chief scientist, Bill Dally, says some large companies, which he won’t name, are looking to do the same but don’t pose a threat. “Nvidia really took a bet on this type of computation, and they invested in this field before it was obvious there was a market there,” says Serkan Piantino, director of engineering for AI Research at Facebook, which uses thousands of Nvidia GPUs for AI. Still, Piantino is keeping his eyes peeled for new developments. “There’s a lot of promising stuff that’s going to land in the coming year,” he says.

The bottom line: Nvidia’s chips are being used to teach machines to think like humans, which could provide the company with a new line of business.
