
Artificial Intelligence

Cheaper AI for everyone is the promise of Intel and Facebook’s new chip

Companies hoping to use artificial intelligence should benefit from more efficient chip designs.

Jan 7, 2019
A prototype of Intel's NNP-I chip.
Intel

Intel and Facebook are working together on a chip that should make it cheaper for big companies to use artificial intelligence.

The device promises to run pre-trained machine-learning algorithms more efficiently, meaning less hardware and less energy are required for AI to do useful work.

Intel revealed the new AI chip, as well as the collaboration with Facebook, at the Consumer Electronics Show in Las Vegas today. The announcement shows how intertwined AI software and hardware are becoming as companies look for an edge in developing and deploying AI.

The new “inference” AI chip could help Facebook and others deploy machine learning more efficiently and cheaply. The social network uses AI to do a wide range of things, including tagging people in images, translating posts from one language to another, and catching prohibited content. These tasks are more costly, in terms of time and energy, if run on more generic hardware.

Intel will make the chip available to other companies later in 2019. The company is currently far behind Nvidia, the market leader in AI hardware, and faces competition from a host of chip-making upstarts.

Naveen Rao, vice president of the artificial-intelligence products group at Intel, said ahead of the announcement that the chip would be more efficient than anything available from competitors, although he did not provide specific performance numbers.

Facebook confirmed that it has been working with Intel but declined to provide further details of the arrangement, or to outline its role in the partnership. Facebook is also rumored to be exploring its own AI chip designs.

Rao said the chip will be compatible with all major AI software, but the involvement of Facebook shows how important it is for those designing silicon to work with AI software engineers. Facebook’s AI researchers develop a number of widely used AI software packages. The company also has vast amounts of data for training and testing machine-learning code.

Mike Demler, a senior analyst at the Linley Group, which monitors the semiconductor industry, points out that competitors may have new designs to compare with Intel’s by the time the chip goes into production later this year. He adds that Intel effectively lags several years behind its rivals and needs to demonstrate “a big step up” with the new chip.

Intel was left flat-footed a couple of years ago as demand for AI chips exploded with the rise of deep learning, a powerful machine-learning technique that involves training computers to do useful tasks by feeding them large amounts of data.

With deep learning, data is fed into a very large neural network, and the network’s parameters are tweaked until it provides the desired output. A trained network can then be used for a task like recognizing people in video footage.
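
For readers curious what that looks like in practice, here is a minimal sketch of such a training loop in Python, using the PyTorch library (chosen here purely for illustration; the article does not name a framework). The data, the network, and the learning rate are all toy values.

```python
import torch
import torch.nn as nn

# Toy data: 64 examples with 10 features each, mapped to one target value.
inputs = torch.randn(64, 10)
targets = torch.randn(64, 1)

# A small network; real "very large" networks have millions of parameters.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# Training: feed data through the network, measure the error, and tweak
# the parameters until the output approaches the desired targets.
for step in range(1000):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()   # compute how each parameter should change
    optimizer.step()  # nudge the parameters in that direction
```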

The computations required for deep learning run relatively inefficiently on general-purpose computer chips. They run far better on chips that perform many calculations in parallel, including the kinds of graphics processors Nvidia has long specialized in. As a result, Nvidia got a jump-start on AI chips and still sells the vast majority of high-end hardware for AI.
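
As a concrete illustration of that split, the matrix multiplications that dominate deep learning can run on a CPU or be handed off to a GPU’s thousands of cores. A rough sketch in PyTorch, assuming a CUDA-capable GPU is available:

```python
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On a general-purpose CPU, this multiply runs on a handful of cores.
c_cpu = a @ b

# On a GPU, the same multiply is spread across thousands of cores
# and typically finishes far faster.
if torch.cuda.is_available():
    c_gpu = a.cuda() @ b.cuda()
```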

Intel kick-started its AI chip development by acquiring a startup called Nervana Systems in 2016. A year later, it announced its first AI chip, the Intel Nervana Neural Network Processor (NNP).

Intel’s latest chip is optimized for running algorithms that have already been trained, which should make it more efficient at that task than general-purpose hardware. The new chip is called the NNP-I (the I stands for “inference”).
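
In code, inference amounts to loading a network’s trained parameters and running data forward through it, with no gradient computation, which is part of why it needs less hardware and energy than training. A hypothetical sketch, continuing with PyTorch (the file name and network shape are illustrative, not anything Intel or Facebook has described):

```python
import torch
import torch.nn as nn

# Rebuild the architecture used during training, then load the trained
# parameters. ("weights.pt" is a hypothetical file name.)
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
model.load_state_dict(torch.load("weights.pt"))
model.eval()  # switch layers such as dropout to inference behavior

# No gradients are tracked, so far less memory and compute are needed.
with torch.no_grad():
    prediction = model(torch.randn(1, 10))
```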

The past few years have seen a dramatic uptick in the development of new AI hardware. A host of startups are racing to develop chips optimized for AI. This includes Graphcore, a British company that recently raised $200 million in investment, and an array of Chinese companies such as Cambricon, Horizon Robotics, and Bitmain (see “China has never had a real chip industry. Making AI chips could change that”).

Intel also faces competition from the likes of Google and Amazon, both of which are developing chips to power cloud AI services. Google first revealed it was developing a chip for its TensorFlow deep-learning software in 2016. Amazon announced last December that it had developed its own AI chips, including one dedicated to inference.

Intel may be late to the game, and it may need help from Facebook, but the company has formidable expertise in manufacturing integrated circuits, a key factor in design innovations and performance improvements. “Intel’s expertise is in optimizing silicon,” Rao says. “This is something we do better than anyone.”