What happens to PC gaming if, a couple of years from now, NVIDIA decides the opportunity cost of manufacturing gaming GPUs instead of data center GPUs has become so high that it steps out of the gaming hardware business altogether?
NVIDIA and AMD both use TSMC to manufacture their GPUs, so if NVIDIA made that decision, presumably AMD would as well. That basically leaves Intel, but their GPUs are lackluster and the drivers are awful.
Gaming PCs have become ludicrously expensive. Even mid-grade builds are $2,000. You can't really build a decent gaming PC for under $1,500 if you want to play the latest AAA games.
dixie_land · 11 hours ago
I'm a gamer too and I worry that another likely scenario is all gaming moves to the cloud streaming model for maximum rent seeking :(
bigbadfeline · 1 hour ago
> I worry that another likely scenario is all gaming moves to the cloud streaming model for maximum rent seeking
Precisely. What we're seeing is hoarding of hardware to starve the consumer space of advanced computing resources, create a cartelized and monopolized market, and not only extract hyper-profits but also remove the freedom to use or advance technology independently.
trashface · 11 hours ago
Game devs will have to optimize their code more, like in the old days. But GPUs are really powerful nowadays, so even scaled back GPUs are still really strong.
I don't expect them to go away entirely. If AMD or NVIDIA step back from it, there is still a market there; someone will fill it. I really doubt AMD would do that anyway, since they don't have the AI-related sales to replace their gaming revenue.
Sohcahtoa82 · 11 hours ago
> there is still a market there, someone will fill it.
How? The cost to design a GPU from scratch is astronomical. Even if a group of top engineers left NVIDIA to start their own company, I'd be surprised if applying as much of their knowledge as possible didn't result in patent violations.
Even ignoring the design part, you have to deal with actual manufacturing. TSMC is effectively the only game in town that could make it, but their fabs are already occupied by NVIDIA's orders. Building your own fab costs billions of dollars and takes several years.
ElevenLathe · 10 hours ago
I think we've topped out on how good GPUs realistically need to be. The games industry is embroiled in layoffs. AAA games as we knew them are ending because they're just not compatible with our current macroeconomic situation. My current GPU is an RX590 (released in 2018, I bought it new in 2019) and I have no plans to upgrade any time soon.
Realistically, I think that if AMD and NVIDIA both abandon the gaming market entirely (unlikely, IMO) then Intel or some Chinese no-name brand will pick up the slack. Video cards for gaming don't need to be bleeding edge any more. 99.999% of experiences gamedevs want to create don't require that horsepower, and consumers can't afford it anyway (at least not when effectively bidding for fab capacity against billionaires who are convinced they can be immortal Gods if they win the auction). Most likely, no-name chips from non-frontier fabs (or by Chinese fabs trying to claw their way upmarket) will be badge-engineered as AMD/Nvidia/Intel.
Alternatively, we have WW3 and either advanced, PC-gaming civilization ends completely, or we at least have to sacrifice consumer goods like video cards for the duration.
jerlam · 10 hours ago
There isn't as much money as you think in AAA games, especially since most games are multi-platform. Those AAA PC games must also run on PlayStation or Xbox, and people can't upgrade those consoles. The PS5 is five years old already. Are there any popular games with a system requirement greater than a 2060?
Having the latest gaming PC is more of a hobby than a requirement.
Sohcahtoa82 · 7 hours ago
> Are there any popular games with a system requirement greater than a 2060?
As a requirement, probably not. You can run Cyberpunk on minimal settings on a 2060. But at maximum settings in 4K with HDR, even an RTX 5090 requires DLSS frame generation. It does look stunning, though.
I think PC gamers have a higher standard when it comes to graphical quality and performance. Many console gamers have been convinced that 30 fps is fine, while PC gamers are likely on 144 Hz or higher monitors and expect 144+ fps.
alpineman · 8 hours ago
Maybe that's their plan. They launched the GeForce NOW cloud streaming service, after all.
porridgeraisin · 11 hours ago
I have some hope that the Steam Machine will fill the gaps. Yes, it's hit by the cost increases as well, but the scale is completely different when you compare it to the latest beefy NVIDIA GPU. It has a good chance.
Sohcahtoa82 · 7 hours ago
If NVIDIA and AMD aren't making GPUs, who's going to make the GPU for the Steam Machine?
2OEH8eoCRo0 · 11 hours ago
I remember the 90s when a normal PC was $3,000 or more.
Sohcahtoa82 · 10 hours ago
In the mid 90s, sure, but prices were coming down fast.
There was a sweet spot in ~1999 where you could buy an ABIT BH6 motherboard, a 300 MHz Celeron 300A (overclocked to 450 MHz), a 3dfx Banshee, and all the other components to build a PC for only ~$500 and have a respectable gaming machine.
expedition32 · 13 hours ago
At least if you buy a 5080 today, I guarantee it will run AAA games for the next 5 years.
Back in the 90s hardware aged like lettuce.
leosanchez · 13 hours ago
Lettuce might not be a good comparison. There are some Prime Ministers that couldn't outlast a lettuce :)