
Intel: Microsoft's Copilot AI Could Run Locally on Future AI-Enabled PCs

By Business Outstanders | Published: March 29, 17:06

According to Intel, future AI-enabled PCs may have the capability to process elements of Microsoft's Copilot AI directly on the machine rather than relying on cloud computing. These PCs would require neural processing units (NPUs) exceeding 40 trillion operations per second, a performance level not currently achieved by any consumer processor.

Intel stated that greater local computing power from NPUs could reduce the delays users experience with Copilot, especially for smaller requests that today rely predominantly on cloud processing. Handling more tasks directly on the PC is expected to improve both Copilot's responsiveness and its privacy.

Recent hardware advances, such as the NPUs built into Intel's upcoming Meteor Lake chips and comparable AMD offerings, have fueled discussion of AI-powered PCs. These dedicated, low-power NPUs are designed to run generative AI models locally, improving the efficiency of AI processing. NPUs are anticipated to become standard in future PCs, allowing generative AI tasks to run seamlessly in the background even on battery power.

For example, MSI's latest laptops recognize what the user is doing and automatically adjust settings such as battery, fans, and screen to optimize performance or conserve resources, depending on whether the task is gaming or document work. The trend toward local AI extends to smartphones as well: devices like Google's upcoming Pixel 8 line include AI chips designed to run generative features on-device, though these are currently limited to smaller models.

One cited benefit of local AI is stronger cybersecurity and data control compared with cloud-based solutions. Cybersecurity consultant John Bambenek noted the risk of data loss or access issues when tasks involving intellectual property route through cloud-based AI. Processing AI locally, keeping data under the organization's own control, could remove one of the largest barriers to adoption for many organizations.