Intel made its first statement on the metaverse on Tuesday, publicly acknowledging that sometimes-nebulous vision of computing's future, which promises an always-connected virtual world existing in parallel with our physical one. But while the chip company is bullish on the metaverse's possibilities in the abstract, it raises a key issue with realizing any metaverse ambitions: there's not nearly enough processing power to go around.
“The metaverse may be the next major platform in computing after the world wide web and mobile,” begins an editorial from Raja Koduri, senior vice president and head of Intel's Accelerated Computing Systems and Graphics Group. But Koduri quickly pours cold water on the idea that the metaverse is right around the corner: “our computing, storage and networking infrastructure today is simply not enough to enable this vision,” he writes. Crucially, Koduri doesn't think we're even close: he says a 1,000x increase over our current collective computing capacity is needed.
A lot of the metaverse hype has been built around what you'll do there, be it virtual reality meetings, digital concerts, or, of course, blockchain and NFT-based integrations. And there's plenty of excitement about the future of virtual and augmented reality headsets, too, whether it be Meta's Quest products (formerly known as Oculus) or Apple's long-rumored headset.
But the actual building blocks of the metaverse aren't just going to be software and virtual spaces (which is its own fight, given that today's digital worlds are extremely self-contained) or even the headsets and gadgets people wear to “get” there. They'll also be the computers and servers that run the vast shared virtual worlds the metaverse posits as the future of technology. And it's there that Intel delivers its biggest reality check: today's computers are simply not powerful enough to make those dreams a reality. They're not even close.
On the one hand, the statement here is almost laughably obvious. Meta's flagship VR space, Horizon Worlds, maxes out at 20 participants per space, and that's for basic, Roblox-style animated worlds. The state of the art in VR still requires thousands of dollars of PC gaming hardware and comes with plenty of drawbacks (like a tethered headset and graphics that still don't measure up to what 2021's best flatscreen games can offer). And even the biggest traditional video games, like Fortnite or Battlefield 2042, which don't face the added demands of VR, can only handle 100 to 128 players at a time.
As Koduri notes in his editorial, we can’t even put two people in a truly detailed virtual environment with today’s technology. “Consider what is required to put two individuals in a social setting in an entirely virtual environment: convincing and detailed avatars with realistic clothing, hair and skin tones – all rendered in real time and based on sensor data capturing real world 3D objects, gestures, audio and much more; data transfer at super high bandwidths and extremely low latencies; and a persistent model of the environment, which may contain both real and simulated elements.”
And that's just for two people; scaling up to the hundreds of millions of users of a Ready Player One-, Snow Crash-, or Matrix-style metaverse would require much, much more computing infrastructure.
Of course, Intel also has a vested interest in saying that we need more and better computers and servers. After all, Intel makes CPUs (and soon, GPUs) for consumer devices and data centers alike. And if the metaverse, the hottest buzzword technology of the future, needs a literal 1,000x increase in computing capacity, well, that's just good for business. It's no coincidence that Intel explicitly called out both its client compute and cloud processors and its graphics products in its metaverse brief.
The problem, though, is that even Intel doesn't think hardware alone is going to get us to 1,000x. As Koduri explained in an interview with Quartz, “We believe that a standard kind of Moore’s Law curve is only going to get us to about eight or 10x growth over the next five years.” (Moore's Law is generally defined as computational capacity doubling every two years; compounded over five years, that works out to a gain of roughly 5.7x, so the eight to 10x growth Koduri predicts is, if anything, slightly ahead of that pace.)
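For a rough sense of the arithmetic, here's a quick back-of-the-envelope sketch; the 1,000x target, the five-year window, and the 10x hardware estimate come from Intel's public statements, while the clean split into a hardware multiplier and a software multiplier is just an illustration:

```python
# Back-of-the-envelope math behind Koduri's numbers (illustrative only).

TARGET_GAIN = 1000       # Intel's stated 1,000x overall requirement
YEARS = 5                # Koduri's timeframe
DOUBLING_PERIOD = 2      # classic Moore's Law cadence, in years

# Hardware gain if capacity doubles every two years.
moores_law_gain = 2 ** (YEARS / DOUBLING_PERIOD)    # ~5.7x

# Taking Koduri's more optimistic 10x hardware estimate,
# software and algorithms have to supply everything else.
hardware_gain = 10
software_gain_needed = TARGET_GAIN / hardware_gain  # 100x

print(f"Moore's Law alone over {YEARS} years: ~{moores_law_gain:.1f}x")
print(f"Software/algorithms must contribute: ~{software_gain_needed:.0f}x")
```

In other words, even granting a generous 10x from silicon, software has to find two more orders of magnitude on its own.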
Instead, Koduri is optimistically forecasting that algorithms and software improvements will make up the gap: things like neural networks and AI-enhanced computational techniques of the sort Intel is already using in its Deep Link technology or the upcoming XeSS super sampling it plans to debut with its Arc GPUs early next year. It's a big ask, though. Intel is counting on algorithms and AI to deliver a hundredfold (or more) improvement in computing capacity, on top of the growth offered by its existing hardware roadmap.
Koduri notes in the same Quartz interview that improved software and algorithms won't just be necessary to close the gap within the ambitious five-year timeframe he lays out; they'll also be crucial to mitigating the increased energy consumption that brute-forcing the problem with raw hardware would create, something he compares to the energy problems of cryptocurrency mining today.
It's easy to just wave a hand and say that software will fill in whatever gaps hardware leaves behind (especially for a company like Intel, which primarily makes the hardware). Plenty of major tech companies have flocked to the idea that AI and machine learning will solve their computation issues, for everything from making smartphone cameras better to upscaling gaming visuals, and it's appealing to think they might. But it still seems like a tall order to rely on them for a 100x boost on top of hardware that's forecast to deliver only a 10x jump on its own.
The fact that Intel is thinking about all this now, and stating the problem plainly, is an encouraging sign, though. It's easy to ride the hype and start pitching fantastical ideas of selling NFTs that follow you from game to game and virtual world to virtual world. Beefing up server infrastructure and working to reduce latency is less sexy, but as Intel's presentation shows, if the metaverse is ever going to reach its sci-fi ambitions, there's a lot of foundational work to be done in the coming years to pave the way.