TechTalk Daily

Do NPUs Mean the Death of GPUs for AI?


Rob Enderle for Techspective 

AMD, Intel and Qualcomm are building NPUs (neural processing units) into their smartphone and PC solutions. Microsoft has already created an NPU, and OpenAI is talking about creating its own NPU, as well.

An NPU is built specifically to handle AI workloads. So, does that mean GPUs (graphics processing units) are, or soon will be, obsolete?

Well, just as integrated graphics didn’t kill off discrete graphics, NPUs (with the possible exception of parts created for specific AI engines, like the one OpenAI is thinking about building) are unlikely to displace GPUs as the AI engine of choice.

Let me explain.

NPU Advantage

The advantage an NPU has over a GPU is one of focus and efficiency. An NPU is designed to provide a base level of AI support extremely efficiently but is generally limited to small, persistent workloads like AI assistants. That makes NPUs very useful in smartphones and laptops, which will increasingly rely on AI interfaces but still need long battery life.

Much like integrated graphics, they may be more than adequate for many workloads, and they will always hold an advantage in energy efficiency, but they don’t deliver high performance. Larger models and more advanced AIs will probably run poorly, if at all, on them, much as CAD applications and other graphics-intensive programs require GPUs to operate.

At least for the near future, expect them to be focused mostly on inference applications using relatively small models.

GPU Advantage

GPUs have far more headroom, but they come with energy penalties. Far better suited to training and to large language models, GPUs aren’t going anywhere and will continue to be favored wherever performance matters more than energy use, particularly when you need to train a model in a reasonable period.

You can even imagine both technologies in the same hardware, with the NPU handling light AI loads, particularly those that need to be persistent and always available. The GPU would be called upon, when available, for projects that use larger, more complex models, particularly when you move from inference to training.

In effect, GPUs and NPUs can be used together…
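The division of labor described above can be sketched as a simple workload dispatcher. This is a hypothetical illustration, not any vendor's actual API: the device names, the Workload fields, and the model-size cutoff are all assumptions made for the example.

```python
# Illustrative sketch of routing AI work between an NPU and a GPU.
# All names and thresholds here are assumptions, not a real framework.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    params_billions: float   # rough model size in billions of parameters
    is_training: bool        # training jobs need the GPU's headroom
    always_on: bool          # persistent, assistant-style task

# Assumed cutoff: small inference models fit the NPU's efficiency envelope.
NPU_MAX_PARAMS_B = 7.0

def pick_device(w: Workload) -> str:
    """Route light, persistent inference to the NPU; heavy work to the GPU."""
    if not w.is_training and w.params_billions <= NPU_MAX_PARAMS_B:
        return "npu"
    return "gpu"  # training and large models favor raw performance

jobs = [
    Workload("voice assistant", 3.0, is_training=False, always_on=True),
    Workload("LLM fine-tune", 70.0, is_training=True, always_on=False),
]
for j in jobs:
    print(f"{j.name} -> {pick_device(j)}")
```

In this sketch, the always-on assistant lands on the efficient NPU while the training job goes to the GPU, which matches the hybrid arrangement the article envisions.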


Read the full article to learn more about how GPUs and NPUs can be used together: Do NPUs Mean the Death of GPUs for AI?

Rob Enderle, The Enderle Group
An Internet search of media quotes validates Rob Enderle as one of the most influential technology pundits in the world. Leveraging world-class IT industry analysis skills honed at DataQuest, Giga Information Group, and Forrester Research, Rob seized upon the power of the information channel as a conduit to reach business strategists and deliver valuable, experience-based insight on how to leverage industry advances for maximum business advantage.

Interested in AI? Check here to see what TechTalk AI Impact events are happening in your area.