Nvidia Corp. announced new processors Monday to try to embed its products in artificial-intelligence systems that are increasingly becoming part of daily life.
The chipmaker, which dominates graphics for video gaming, rolled out chips for running software that makes the split-second decisions needed when everything from phones to cars to internet search engines responds to inputs such as speech, images and moving objects.
The company said its new Tesla P4 chip is for servers used in massive data centers. Based on its Pascal design, the P4 is more than three times as efficient at processing images as its predecessor and 40 times more efficient than Intel server chips, according to Nvidia. Another new chip, called the P40, is designed for more-powerful single computers, such as supercomputers.
Nvidia is taking aim at its Santa Clara, California-based neighbor Intel Corp., which last month announced its own AI chips and talked about its ambition to muscle in on this nascent but fast-growing market. Both companies also want to ensure that data center operators such as Google not only use their technology but also aren’t tempted to design their own custom chips.
Nvidia has argued its graphics chips, which perform multiple small manipulations of data simultaneously, are the right answer for AI systems, and it has invested in software to make them easier to use. Intel has said its chips, which have less ability to work in parallel but are more capable in general purpose computing, offer the right solutions. Data center owners, whose biggest cost is energy, are focused on components that can get the job done using less power.
It’s still early days in the market for AI chips. In its latest quarter, Nvidia’s data center business more than doubled sales to $151 million, with most of that coming from AI tasks. Intel’s data center unit had sales of $4 billion, but very little of that revenue came from AI projects.