2022 in GPUs: The shortage ends, but higher prices seem here to stay

Intel Arc joins the fray, and Nvidia charges big money for big performance.

Andrew Cunningham – Dec 27, 2022 2:25 pm UTC

From left to right and largest to smallest: GeForce RTX 4080 (which is the same physical size as the RTX 4090), Radeon RX 7900 XTX, and Radeon RX 7900 XT. Credit: Andrew Cunningham

In 2021, the biggest story about GPUs was that you mostly just couldn’t buy them, not without paying scalper-inflated prices on eBay or learning to navigate a maze of stock-tracking websites or Discords.

The good news is that the stock situation improved a lot in 2022. A cryptocurrency crash and a falloff in PC sales reduced the demand for GPUs, which in turn made them less profitable for scalpers, which in turn put more cards on store shelves. It’s currently possible to visit an online store and buy many GPUs for an amount that at least gets kind-of-sort-of close to their original list price.

We also saw lots of new GPU launches in 2022. The year started off less-than-great with the launch of 1080p-focused, price-inflated cards like Nvidia’s RTX 3050 and AMD’s inspiringly mediocre RX 6500 XT. But by the end of the year, we received Nvidia’s hugely expensive but hugely powerful RTX 4090 and RTX 4080 cards, AMD’s less-monstrous but still competitive RX 7900 series, and Intel’s flawed but price-conscious Arc A770 and A750 cards.

The bad news is that the aftereffects of the GPU shortage still linger, mainly in the form of inflated prices. We can hope that these come down in 2023, but so far, there is little sign of that happening.

Budget GPUs are in a sad state

You can still find GPUs at and under $200 if you’re looking for basic, better-than-integrated performance in older and lower-end games that you’ll mostly run at or below 1080p.

But performance in this category has moved very little over the last three or four years. Nvidia seems content to serve this low-end slice of the gaming market with the same GeForce GTX 1650 GPU it introduced in 2019, a card that continues to stubbornly hover in the $150-to-$200 price window despite its age. AMD and Intel have both released new cards for the sub-$200 market in the last year, and those cards can sometimes beat the GTX 1650’s performance. But these cards are also flawed in some hard-to-ignore ways.

AMD’s RX 6500 XT was originally designed as a laptop GPU and adapted for desktops. As a result, it supports fewer displays than other GPUs in the RX 6000 series, it’s missing hardware video encoding support, and its performance in older PCI Express 3.0 PCs is poor because the card only provides four lanes of PCIe bandwidth in the first place, and each lane runs at half speed on a PCIe 3.0 link. Intel’s Arc A380 has great video encoding support (including for the AV1 video codec), but like other Arc cards, its drivers are rough around the edges, and performance in older games can be spotty.
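To put the RX 6500 XT’s lane limitation in rough numbers: a four-lane link delivers half its bandwidth when it falls back from PCIe 4.0 to PCIe 3.0 signaling. A quick back-of-the-envelope sketch, using the nominal per-lane transfer rates and 128b/130b encoding overhead from the PCIe specs (figures are approximate):

```python
def pcie_bandwidth_gb_s(transfer_rate_gt_s: float, lanes: int) -> float:
    """Approximate one-direction link bandwidth in GB/s.

    PCIe 3.0 signals at 8 GT/s per lane, PCIe 4.0 at 16 GT/s;
    both use 128b/130b line coding, so ~98.5% of raw bits are payload.
    """
    return transfer_rate_gt_s * lanes * (128 / 130) / 8  # bits -> bytes

x4_gen4 = pcie_bandwidth_gb_s(16, 4)  # ~7.9 GB/s
x4_gen3 = pcie_bandwidth_gb_s(8, 4)   # ~3.9 GB/s
print(f"PCIe 4.0 x4: {x4_gen4:.1f} GB/s; PCIe 3.0 x4: {x4_gen3:.1f} GB/s")
```

So in an older PCIe 3.0 system, the card is working with roughly 3.9 GB/s of link bandwidth instead of about 7.9 GB/s, which is why games that spill out of its small VRAM pool suffer disproportionately there.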

If superior GPUs like Nvidia’s RTX 3050 series and AMD’s RX 6600 series drop down into the $200 range soon, we’ll be feeling a lot better about the state of budget GPUs. But those GPUs still tend to hover around the $300 mark, a relatively small but still significant increase for people trying to put together a budget build.

For the best GPUs, $1,000 and up is the new normal

Nvidia’s hefty RTX 4090 GPU also has a hefty $1,600 price tag, and you’ll pay more than that to actually buy the GPU right now. Credit: Sam Machkovech

Shifting to the other extreme of the market, it used to be that GPUs with four-digit price tags were mostly ignorable by normal people. Halo products like Nvidia’s Titan GPUs performed well, sure, but why pay all that money when cheaper xx80 and xx70-series GPUs could give you a large percentage of that performance for a small percentage of the price?

The story of this generation’s midrange cards has yet to be written, but so far, the high-end cards in the RTX 4000 series and the RX 7000 series have been listed for much more than their predecessors. The $1,200 RTX 4080 is a huge jump from the $700 Nvidia originally advertised for the RTX 3080 and 2080. If the newly rebranded RTX 4070 Ti launches at $900 as is currently rumored (and as was originally planned, back when it was called the “RTX 4080 12GB”), that will be a substantial hike over the $500-to-$600 launch prices of cards like the RTX 2070, RTX 3070, and RTX 3070 Ti.

All of these prices are made much worse by the fact that you can’t find RTX 4080 or 4090 cards anywhere close to their launch MSRPs right now.

Things look marginally less exorbitant on AMD’s side, where the top-tier RX 7900 XTX launched for the same $999 price as last-generation’s top-tier RX 6900 XT did. But where the 6900 XT was accompanied by an RX 6800 XT that provided most of the performance for $649, the 7900 XTX’s little sibling is an $899 RX 7900 XT that actually gives you less performance-per-dollar than the XTX. Both cards still undercut Nvidia’s pricing for the RTX 4000 series, but that’s more about how expensive the 4090 and 4080 are than it is about the value AMD is providing.
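The performance-per-dollar point is simple arithmetic: a card that costs 10 percent less but delivers meaningfully more than 10 percent less performance is the worse value. A quick sketch, using the launch MSRPs above and a hypothetical relative-performance figure for the 7900 XT (the ~85% number is an illustrative assumption, not a benchmark result):

```python
# Prices are the launch MSRPs cited in the article; the relative-performance
# figure for the RX 7900 XT is an assumed, illustrative value.
cards = {
    "RX 7900 XTX": {"price": 999, "relative_perf": 1.00},
    "RX 7900 XT":  {"price": 899, "relative_perf": 0.85},  # assumed ~85% of XTX
}

for name, card in cards.items():
    # Normalize to "relative performance per $1,000 spent" for readability.
    value = card["relative_perf"] / card["price"] * 1000
    print(f"{name}: {value:.2f} relative perf per $1,000")
```

Under that assumption, the XT comes out behind: you save 10 percent on the sticker price but give up roughly 15 percent of the performance, so the cheaper card is actually the worse deal per dollar.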

Nvidia CEO Jensen Huang said that these higher prices are here to stay, courtesy of the increased costs associated with designing and manufacturing these GPUs. Obviously, Huang is not an impartial observer here; he is materially invested in keeping the prices of his GPUs high, especially as Nvidia’s finances have suffered. But there’s some truth to what he’s saying: cutting-edge manufacturing processes are expensive, Nvidia is fighting AMD, Apple, and all kinds of other chip designers for TSMC’s production capacity, and gigantic monolithic chips like the RTX 4000 GPUs are going to have lower yields than smaller, less-complex processor dies.