

November 26, 2023 · Shahala VP
Industry Shift: GPUs Yield to NPUs as Smartphones Embrace Advanced AI Processing

In the ever-evolving landscape of artificial intelligence (AI) in smartphones, the transition from GPUs to NPUs marks a significant shift. As our recent Google Pixel 8 review showed, the applications of AI have expanded to cover everything from computational photography to on-device language models.

In the early days, GPUs in smartphones were designed primarily to render graphics on the screen. With the rise of AI applications, however, they found a new purpose: their ability to perform many computations simultaneously made them instrumental in running the complex mathematical models behind AI features, from facial recognition to speech recognition in voice assistants.
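To see why this kind of hardware matters, note that most neural-network layers reduce to large matrix multiplications in which every output value is an independent multiply-add, so all of them can be computed at once on parallel hardware. A minimal NumPy sketch, with layer sizes chosen arbitrarily for illustration:

```python
import numpy as np

# A single fully connected layer: every output element is an independent
# dot product, so all of them can be evaluated simultaneously on parallel
# hardware such as a GPU or NPU. The sizes below are arbitrary.
batch = np.random.rand(32, 512).astype(np.float32)      # 32 input vectors
weights = np.random.rand(512, 1024).astype(np.float32)  # layer weights
bias = np.random.rand(1024).astype(np.float32)

# One matrix multiply covers 32 * 1024 independent dot products.
activations = np.maximum(batch @ weights + bias, 0.0)   # ReLU activation
print(activations.shape)  # (32, 1024)
```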

GPUs working alongside CPUs significantly reduced processing times, but their limitations became evident as AI models grew larger and more intricate. The demand for specialized AI hardware became apparent, leading to the emergence of Neural Processing Units (NPUs).

NPUs, designed specifically for neural network machine learning, address the shortcomings of GPUs. Built around a "data-driven parallel computing" architecture, NPUs excel at processing massive multimedia data such as video and images. Unlike GPUs, they focus solely on neural network tasks, offering a more efficient solution for AI computations.
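The data-parallel idea is easiest to see with image data: the same small operation is applied independently at every pixel position, so an accelerator can evaluate many positions at once. A hedged sketch of that pattern in plain NumPy (the 3x3 blur kernel and image size are made up for illustration; a real NPU would execute this across dedicated multiply-accumulate arrays rather than a Python loop):

```python
import numpy as np

# Apply one 3x3 filter across an entire image. Every output pixel depends
# only on its own 3x3 neighbourhood, so all positions are independent --
# exactly the data-parallel structure NPUs are built to exploit.
image = np.random.rand(256, 256).astype(np.float32)    # toy grayscale image
kernel = np.full((3, 3), 1.0 / 9.0, dtype=np.float32)  # simple blur filter

h, w = image.shape
output = np.zeros((h - 2, w - 2), dtype=np.float32)
for y in range(h - 2):
    for x in range(w - 2):
        # On an NPU, thousands of these neighbourhood sums run in parallel;
        # they are serialized here only because Python loops are sequential.
        output[y, x] = np.sum(image[y:y + 3, x:x + 3] * kernel)

print(output.shape)  # (254, 254)
```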

The key advantage of NPUs is that they take complex calculations off the main processor, improving efficiency and preserving battery life in smartphones. Google's TPU and Apple's Neural Engine are prime examples of NPUs making their mark in the industry.
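For a rough sense of what this offloading looks like from an app's point of view, here is a sketch using TensorFlow Lite's delegate mechanism, one common way apps hand supported operations to a vendor's NPU driver. The model file name and delegate library path below are hypothetical placeholders; the actual library name depends on the chipset vendor.

```python
import numpy as np
import tensorflow as tf

# Placeholder paths: substitute a real quantized .tflite model and the
# vendor's NPU delegate library. Ops the delegate supports run on the NPU;
# anything unsupported falls back to the CPU.
NPU_DELEGATE_LIB = "libvendor_npu_delegate.so"  # hypothetical library name
MODEL_PATH = "image_classifier_quant.tflite"    # hypothetical model file

npu_delegate = tf.lite.experimental.load_delegate(NPU_DELEGATE_LIB)
interpreter = tf.lite.Interpreter(
    model_path=MODEL_PATH,
    experimental_delegates=[npu_delegate],
)
interpreter.allocate_tensors()

input_info = interpreter.get_input_details()[0]
output_info = interpreter.get_output_details()[0]

# Feed a dummy input matching the model's declared shape and dtype.
dummy = np.zeros(input_info["shape"], dtype=input_info["dtype"])
interpreter.set_tensor(input_info["index"], dummy)
interpreter.invoke()  # the delegated portion of the graph runs on the NPU

scores = interpreter.get_tensor(output_info["index"])
print("Predicted class:", int(np.argmax(scores)))
```

Because only the supported portion of the model moves to the accelerator, the CPU stays free for the rest of the app, which is where the efficiency and battery savings described above come from.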

While NPUs, GPUs, and CPUs each have unique characteristics, NPUs most closely resemble GPUs in their focus on parallel processing. NPUs, however, are more specialized, dedicating their processing power exclusively to neural network tasks, whereas GPUs handle a broader range of parallel computing workloads.

In terms of core count, NPUs typically feature many cores specialized for AI computations, in contrast to CPUs, which have fewer but more powerful general-purpose cores, and GPUs, which pack numerous cores optimized for parallel tasks.

The adoption of NPU silicon in smartphones has grown rapidly across major vendors. Apple set the trend with the introduction of the Neural Engine NPU in its A11 mobile chipset in 2017, followed by Huawei's Kirin 970 system-on-chip. Qualcomm, a dominant player on the Android platform, integrated its AI Engine into its premium 800-series chipsets.

Recent advancements, such as Qualcomm's focus on on-device generative AI in the Snapdragon 8 Gen 3, are matched by dedicated NPU hardware in the latest offerings from MediaTek and Samsung. This widespread adoption underscores the pivotal role NPUs play in shaping the future of AI in smartphones.