We typically access AI programs, such as chatbots and image generators, through powerful data centers hooked up to the internet. But Nvidia is now working with Microsoft to let consumers run the same programs on Windows PCs equipped with its RTX graphics cards.
The advancements announced today are geared toward software developers who want to build and release AI programs for Windows PCs. But if all goes well, consumers could one day run an AI image generator on a laptop without connecting to the internet, says Manuvir Das, Nvidia’s VP for Enterprise. “This is a paradigm changing model. This is like bringing graphics to the PC,” he adds.
Nvidia and Microsoft are taking advantage of Windows’ ability to run the Linux operating system. The so-called Windows Subsystem for Linux (WSL) used to be a hassle to install, but Microsoft has been streamlining the process.
Nvidia now says it’s been working to deliver “GPU acceleration and support” for the company’s entire AI software stack inside WSL. As a result, developers no longer need a dedicated Linux machine, the favored platform for building artificial intelligence programs, and can instead develop AI software locally on their own Windows PC, as long as it’s equipped with a powerful graphics card.
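If the setup works as described, a developer inside WSL should be able to see the GPU from standard frameworks. Here is a minimal sketch, assuming a CUDA-enabled PyTorch build installed in the WSL environment and an RTX GPU with a current Windows driver:

```python
# Minimal check that GPU acceleration is visible from inside WSL.
# Assumes a CUDA-enabled PyTorch build in the WSL environment and an
# RTX-class GPU with an up-to-date Windows driver on the host.
import torch

if torch.cuda.is_available():
    print(f"CUDA device: {torch.cuda.get_device_name(0)}")
else:
    print("No CUDA device visible; falling back to CPU.")
```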
The other way the companies are helping AI software developers is through the “Microsoft Olive toolchain,” which can help integrate AI models into Windows applications by optimizing PyTorch deep learning models and exporting them to the ONNX format.
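For a rough sense of what that conversion step involves, here is a plain PyTorch-to-ONNX export, the kind of model conversion Olive automates and optimizes; the toy model, file name, and shapes below are made up for illustration and are not Olive’s actual API:

```python
# Illustrative only: exporting a small PyTorch model to ONNX so it can be
# consumed by ONNX Runtime in a Windows application. The model and shapes
# are placeholders for the example.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).eval()
dummy_input = torch.randn(1, 128)

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",                          # output file for ONNX Runtime
    input_names=["input"],
    output_names=["logits"],
    dynamic_axes={"input": {0: "batch"}},  # allow variable batch size
)
```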
Nvidia added: “On May 24, we’ll release our latest optimizations in Release 532.03 drivers that combine with Olive optimized models to deliver big boosts in AI performance.”
As an example, Nvidia showed a benchmark of the new drivers running the AI image generator Stable Diffusion, which can already run on a Windows PC. According to the company, the new drivers more than double that performance.
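For readers who want to try Stable Diffusion locally today, here is a minimal sketch using the Hugging Face diffusers library, one common way to run the model on an RTX-equipped PC; the checkpoint name and prompt are assumptions for illustration, and this is not Nvidia’s benchmark setup:

```python
# Run Stable Diffusion locally on an Nvidia GPU via the diffusers library.
# The checkpoint is an assumption; any Stable Diffusion 1.x checkpoint works.
from diffusers import StableDiffusionPipeline
import torch

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # run on the local RTX GPU

image = pipe("a watercolor painting of a mountain lake").images[0]
image.save("output.png")
```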
On top of all this, Nvidia is also preparing a software optimization that will help AI programs run better on laptops. “Coming soon, Nvidia will introduce new Max-Q low-power inferencing for AI-only workloads on RTX GPUs,” the company said. “It optimizes Tensor Core performance while keeping power consumption of the GPU as low as possible, extending battery life and maintaining a cool, quiet system. The GPU can then dynamically scale up for maximum AI performance when the workload demands it.”
Time will tell whether these software improvements can create a strong foundation for developers to build AI programs for Windows PCs. It’s also unclear what limitations Nvidia RTX GPUs will face when running these AI programs on a PC. Still, Nvidia is bullish about the possibilities.
“Our goal at Nvidia is to bring accelerated computing to everything, everywhere, so that everyone can benefit from it,” Das added.