Nvidia and Microsoft are solving a big problem with Copilot+

The Surface Laptop running local AI models.
Luke Larsen / Digital Trends

When Microsoft announced Copilot+ PCs a few weeks back, one question reigned supreme: Why can’t I just run these AI applications on my GPU? At Computex 2024, Nvidia finally provided an answer.

Nvidia and Microsoft are working together on an Application Programming Interface (API) that will allow developers to run their AI-accelerated apps on RTX graphics cards. This includes the various Small Language Models (SLMs) that are part of the Copilot runtime, which are used as the basis for features like Recall and Live Captions.

With the toolkit, developers can allow apps to run locally on your GPU instead of the NPU. This opens the door not only to more powerful AI applications, since GPUs generally deliver more AI throughput than NPUs, but also to running these features on PCs that don't currently fall under the Copilot+ umbrella.

It’s a great move. Copilot+ PCs currently require a Neural Processing Unit (NPU) capable of at least 40 Tera Operations Per Second (TOPS), and at the moment, only the Snapdragon X Elite satisfies that criterion. GPUs, by contrast, have much higher AI processing capabilities, with even low-end models reaching around 100 TOPS and higher-end cards going well beyond that.

In addition to running on the GPU, the new API adds retrieval-augmented generation (RAG) capabilities to the Copilot runtime. RAG gives the AI model access to specific information locally, allowing it to provide more helpful solutions. We saw RAG on full display with Nvidia’s Chat with RTX earlier this year.
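The core RAG idea described above is simple enough to sketch in a few lines. The following is a toy illustration only, not Nvidia's or Microsoft's implementation: real systems like Chat with RTX use vector embeddings and a local language model, whereas here retrieval is plain keyword overlap and "generation" is just prompt assembly.

```python
# Toy sketch of retrieval-augmented generation (RAG).
# Real RAG pipelines use embedding similarity and an LLM; this
# illustration uses word overlap and string formatting instead.

def retrieve(query: str, documents: list[str], k: int = 1) -> list[str]:
    """Rank local documents by how many words they share with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def answer(query: str, documents: list[str]) -> str:
    """Augment the prompt with retrieved local context before generating."""
    context = retrieve(query, documents)[0]
    return f"Context: {context}\nQuery: {query}"

# Hypothetical local documents standing in for a user's files.
docs = [
    "Recall indexes your screen activity locally.",
    "Live Captions transcribes audio in real time.",
]
print(answer("How does Recall work?", docs))
```

The point is the shape of the pipeline: retrieve relevant local data first, then hand it to the model alongside the question, so the answer can draw on information the model was never trained on.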

Performance comparison with the RTX AI toolkit.
Nvidia

Outside of the API, Nvidia announced the RTX AI Toolkit at Computex. This developer suite, arriving in June, combines various tools and SDKs that allow developers to tune AI models for specific applications. Nvidia says that by using the RTX AI Toolkit, developers can make models four times faster and three times smaller compared to using open-source solutions.

We’re seeing a wave of tools that enable developers to build specific AI applications for end users. Some of that is already showing up in Copilot+ PCs, but I suspect we’ll see far more AI applications at this point next year. We have the hardware to run these apps, after all; now we just need the software.

Jacob Roach
Lead Reporter, PC Hardware