All Apologies to GPUs and FPGAs

At CES 2019 in Las Vegas this week, Intel Data Center Group executive vice president Navin Shenoy announced the Intel Nervana Neural Network Processor for Inference, which will go into production this year. At the time, Nervana was also reportedly developing a custom chip for neural network processing that it claimed would outperform GPUs as AI accelerators by a factor of at least ten; the company analyzed a number of deep neural networks and derived what it believed to be the best architecture for their key operations. Intel has not given specific performance or power-consumption figures for the new devices, except to say that power consumption will be in the "hundreds of watts" – which places Nervana squarely in the data center (in contrast to edge-targeted AI devices such as the company's Movidius and Mobileye offerings). Nervana appears to be aimed at the growing foothold of GPUs and FPGAs as AI accelerators in the data center.
View Full Article