Saturday, May 24, 2025
The AI Goldrush Spawns a New Breed of PC: All Hail the Personal Supercomputer

AI is everywhere, but for most of us, it lives in the cloud. That’s a problem for businesses and individuals who want to use AI without an internet connection. As AI’s prominence has grown, so has the demand for standalone systems powerful enough to run large language models (LLMs) and other AI models locally. But those models demand levels of performance that most PCs can’t deliver, or they did until Nvidia announced Project DIGITS.

According to Nvidia CEO Jensen Huang, the goal of Project DIGITS is “placing an AI supercomputer on the desks of every data scientist, AI researcher, and student.” This way, individuals can access hardware (Nvidia Grace Blackwell Superchips) to prototype, fine-tune, and run their AI models locally. It’s an AI supercomputer packed into a PC form factor.

Nvidia DGX Spark

(Credit: Nvidia)

Since then, Nvidia has announced the (renamed) DGX Spark, a mini PC-style system initially teased alongside a larger desktop workstation called DGX Station. At Computex 2025, several other manufacturers also introduced their own specialized AI desktops, so let’s take a closer look at this wave of new AI-crunching desktops and see where they’ll take us.


Nvidia’s DGX Spark: A Mini PC With Massive AI Muscle

The most accessible of these new AI-centric PCs is the Nvidia DGX Spark, a compact mini PC outfitted with a GB10 Superchip that combines a Blackwell GPU with a 20-core Arm-based Grace CPU. The GPU portion has 6,144 of Nvidia’s CUDA cores, the same number found in the Nvidia GeForce RTX 5070 graphics card but optimized for AI computing. Nvidia pairs this with 128GB of LPDDR5x system memory, shared between the CPU and GPU, and 4TB of storage. With that hardware, the diminutive DGX Spark can deliver 1,000 TOPS (trillion operations per second) on AI tasks, vastly outperforming the 40-50 TOPS of consumer AI PCs.
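
To put that gap in perspective, a quick back-of-the-envelope comparison helps. The 45 TOPS figure below is an illustrative midpoint of the 40-50 TOPS range quoted for consumer AI PCs, not a spec for any particular machine:

```python
# Rough throughput comparison: DGX Spark vs. a typical consumer AI PC NPU.
# 45 TOPS is an assumed midpoint of the 40-50 TOPS consumer range above.
spark_tops = 1_000        # DGX Spark: ~1,000 trillion operations per second
consumer_npu_tops = 45    # typical consumer AI PC NPU (assumption)

speedup = spark_tops / consumer_npu_tops
print(f"DGX Spark offers roughly {speedup:.0f}x the raw AI throughput of a consumer NPU")
```

That ballpark 22x gap is raw operations per second only; real-world model performance also depends on memory capacity and bandwidth.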

Nvidia DGX Spark exploded diagram

(Credit: Nvidia)

The DGX Spark runs Nvidia’s DGX OS, a version of Ubuntu Linux built specifically for running AI models on these little machines. In addition to drivers for the Nvidia hardware inside, this OS can run all sorts of models (like those found on Hugging Face) and development tools such as Python, PyTorch, and Jupyter notebooks, along with Nvidia’s extensive catalog of AI software and developer tools.

These small systems can also scale up thanks to Nvidia’s NVLink-C2C interconnect technology, which lets you connect a pair of DGX Spark units. Nvidia claims that a single DGX Spark can run models with up to 200 billion parameters, while a pair of Sparks can handle up to 405 billion parameters.
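
A rough memory estimate shows why those parameter counts line up with the hardware. Assuming 4-bit quantized weights (0.5 bytes per parameter, a common deployment format for models this large; this is an illustrative assumption, not an Nvidia-stated figure):

```python
# Why 128GB of unified memory maps to roughly 200B parameters:
# at 4-bit (FP4/INT4) quantization, each parameter occupies 0.5 bytes.
BYTES_PER_PARAM_4BIT = 0.5

def model_size_gb(params_billions: float) -> float:
    """Approximate weight footprint in GB, ignoring KV cache and activations."""
    return params_billions * 1e9 * BYTES_PER_PARAM_4BIT / 1e9

print(f"200B model: ~{model_size_gb(200):.0f} GB (one Spark: 128 GB)")
print(f"405B model: ~{model_size_gb(405):.1f} GB (two linked Sparks: 256 GB)")
```

At that precision, a 200-billion-parameter model needs about 100GB of weights, comfortably inside a single Spark’s 128GB, while a 405-billion-parameter model’s roughly 202.5GB requires the pooled memory of two linked units.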

Nvidia currently has the DGX Spark available for preorder for $3,999, but even that price is up in the air, as specifics are subject to change before an eventual launch later this year. Nvidia is far from the only kid in this new AI mini PC playground, as it has now partnered with several manufacturers to produce their own GB10-powered Blackwell boxes.


“Placing an AI supercomputer on the desks of every data scientist, AI researcher, and student empowers them to engage and shape the age of AI.”

– Nvidia CEO Jensen Huang


Sparks Catching Fire: More Blackwell Mini PCs Incoming

Even though the DGX Spark is an Nvidia product built with Nvidia parts, the company won’t be alone in making these AI mini PCs. Similar GB10-powered models have appeared from Asus, Dell, Gigabyte, HP, Lenovo, and MSI on the Computex 2025 show floor.

Asus Ascent GX10

(Credit: John Burek)

Here’s the rundown of each company’s model. For most of these, the basics are the same: these pint-sized systems feature the same GB10 Superchip, 128GB of unified memory, and ConnectX-7 networking, and they all run Nvidia’s DGX OS.

MSI EdgeXpert MS-C931

(Credit: John Burek)

  • Asus Ascent GX10

  • Dell Pro Max with GB10

  • Gigabyte AI TOP Atom

  • HP ZGX Nano AI Station

  • Lenovo ThinkStation PGX

  • MSI EdgeXpert MS-C931

So far, the differences seem mostly cosmetic, with each manufacturer bringing its own take on the compact NUC-like design. Specifics about storage, port selection, other features, and pricing information are mostly unknown.


Nvidia’s DGX Station: A Desktop-Size AI Powerhouse

Nvidia’s mini PC may be impressive, but for people who want to tackle even more demanding AI workloads, Nvidia has the DGX Station. This is a beefy AI workstation built around an Nvidia GB300 Grace Blackwell Ultra Desktop Superchip, cramming server-grade Grace Blackwell hardware onto a single motherboard. The system has a 72-core Nvidia Grace CPU with 784GB of unified memory, expandable storage with multiple M.2 and PCIe Gen 5 slots, and a ConnectX-8 Super Network Interface Card (SuperNIC) for daisy-chaining additional units.

Nvidia Jensen Huang introducing DGX Station

(Credit: Michael Kan)

In his recent Computex keynote, Nvidia CEO Jensen Huang said that the DGX Station provides “the most performance you can possibly get out of a wall socket.” And it’s a lot of power: the GB300 produces up to 20 petaflops of AI performance, the equivalent of 20,000 TOPS. That’s ample power to run most available models locally, without connecting to the cloud or a remote server.
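
That headline figure also lets you sketch a rough inference ceiling. The estimate below uses the common rule of thumb of about 2 FLOPs per parameter per generated token; it is an illustrative compute-bound ceiling that ignores memory bandwidth, which usually dominates in practice:

```python
# Back-of-the-envelope token throughput at 20 petaFLOPS.
# Rule of thumb (assumption): one forward pass costs ~2 FLOPs per parameter per token.
STATION_FLOPS = 20e15  # 20 petaflops, i.e., 20,000 TOPS

def peak_tokens_per_sec(params_billions: float) -> float:
    """Compute-bound upper limit on tokens/sec; real systems land well below this."""
    flops_per_token = 2 * params_billions * 1e9
    return STATION_FLOPS / flops_per_token

print(f"70B model: ~{peak_tokens_per_sec(70):,.0f} tokens/sec (theoretical ceiling)")
```

Even allowing for the large gap between theoretical and delivered throughput, the arithmetic makes clear why hardware at this scale can serve big models interactively.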

With expansion slots, multi-instance GPU support (up to 7 partitions), and 800Gbps networking, you can beef it up even more. That’s the power needed to build AI tools, train AI and machine learning models, and run them on your desk.


Nvidia hasn’t even hinted at a price for the DGX Station yet (we estimate $10,000 or more) and will only commit to saying it will be available later this year.


Man Your Stations! Multiple DGX Stations On the Way

As it did with the DGX Spark, Nvidia has partnered with several desktop PC makers to provide different versions of the DGX Station. Partners like Asus, Dell, and HP have already announced a batch of systems with identical specs: an Nvidia GB300 Grace Blackwell Ultra Desktop Superchip, 784GB of unified system memory (up to 288GB of HBM3e GPU memory and 496GB of LPDDR5X CPU memory), and a ConnectX-8 SuperNIC for clustering multiple DGX Stations together.

Here are the systems we’ve seen announced so far…

…with more desktops coming from Gigabyte, MSI, Lenovo, Boxx, and Lambda.

Nvidia DGX Spark and DGX Station

(Credit: Nvidia)


The Rise of the Personal Supercomputer

These AI boxes aren’t room-filling systems and won’t match the fastest or most powerful industrial setups running today’s most significant AI projects. However, these new AI workstations have computational power and architecture that used to be limited to large-scale, data center-class supercomputers. 

Recommended by Our Editors

With petascale compute, server-grade hardware, heavy-duty parallel processing, massive amounts of memory, specialized AI development software, and the option for interconnected systems, these AI workstations meet most of the defining elements of a supercomputer. These systems represent the birth of a new category: the personal supercomputer.

Nvidia head Jensen Huang, who has become the face of this new desktop category, says, “Placing an AI supercomputer on the desks of every data scientist, AI researcher, and student empowers them to engage and shape the age of AI.”

That’s what these new systems are all about, and once that vision is achieved, the AI hype might finally pay off. Using AI this way may be a rarefied niche now, but it won’t stay that way. Today’s cutting edge becomes tomorrow’s baseline, and as powerful AI works its way into more devices and apps, the hardware that drives it all will go from niche to mainstream.


Decentralizing AI: Why Local AI Power Matters Now

AI is far more than conversational chatbots like ChatGPT and Microsoft’s Copilot. Those large language models (LLMs) are one form of generative AI that can work with text or even generate images. But AI also extends to other areas: Convolutional Neural Networks (CNNs) for computer vision and image analysis; Large Action Models (LAMs) for robotics and automation; Graph Neural Networks (GNNs) for scientific research and complex systems modeling; generative models that can render video, make music, or generate 3D models for prototyping; and Agentic, action-oriented models that can make decisions and automate workflows. These areas will all grow and evolve as more developers get the hardware to build and fine-tune these models locally.
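
Of those model families, the convolution at the heart of a CNN is simple enough to sketch directly. This minimal NumPy implementation is illustrative only (real frameworks like PyTorch ship heavily optimized versions), but it shows the sliding-window operation that computer-vision models build on:

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 2D convolution (no padding, stride 1): the core CNN operation."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is a weighted sum over a kernel-sized patch.
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A [1, -1] kernel acts as a simple horizontal edge detector.
img = np.zeros((5, 5))
img[:, 2:] = 1.0  # left half dark, right half bright
edges = conv2d(img, np.array([[1.0, -1.0]]))
```

A CNN stacks many such filters, learning their weights from data; the heavy lifting on these workstations is running millions of these patch-wise multiply-accumulates in parallel.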

The frontiers of computing are focused on applying machine learning and AI to some of our most complex problems and on developing features for the apps we use every day. Whether in media, gaming, or daily work, these improvements to the tools we rely on will require heavy-duty processing power. These are the first systems to package the raw power needed for AI development into a single, standalone desktop PC.

But it won’t end there, not by a long shot. Mini PCs already pack this level of hardware, Gigabyte just announced a new line of gaming/AI desktops, and Dell just dropped its first AI workstation laptop. As professional tools scale up, new thresholds of power and capability trickle down into consumer systems. That’s been the story of technology for decades.

The bigger picture here isn’t just that businesses are getting a new class of system capable of working with this technology. We’re getting our first glimpses of the sort of power that will soon be mainstream. It’s so much power that we’re still figuring out how to use it, still working out how to supply the electricity it demands, and still grappling with how to find and ethically use the raw data required to train and fine-tune these models. AI is still largely unexplored territory, but we’re about to get equipment that will democratize access to its tools in short order.

If you’re a professional eyeing anything AI-related, today’s exciting news is these new enterprise-level AI development boxes. But everyone will have access to so-called personal supercomputers sooner rather than later. Once these products come to market, we’ll be testing whatever we can get our hands on.

About Brian Westover

Lead Analyst, Hardware

Brian Westover

If you’re after laptop buying advice, I’m your man. From PC reviews to Starlink testing, I’ve got more than a decade of experience reviewing PCs and technology products. I got my start with PCMag but have also written for Tom’s Guide and LaptopMag.com, and several other tech outlets. With a focus on personal computing (Windows, macOS, and ChromeOS), Starlink satellite internet, and generative AI productivity tools, I’m a professional tech nerd and a power user through and through.

