

Basically every one of them made in the past 4 or 5 years?
Some are better than others - CP2077, for example, will happily use all 16 threads on my 7700x, but something crusty like WoW only uses like, 4. Fortnite is 3 or so, unless you're doing shader compilation, where it'll use all of them, and so on - but it's not 2002 anymore.
The issue is that most games won't use nearly as many cores as Intel is stuffing on a die these days, which means that for gaming, having 32 threads via e-cores or whatever is utterly pointless, while having 8 full-fat cores with 16 threads is very much useful.
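For what it's worth, the usual reason extra cores sit idle is just that the engine's job system was sized around a fixed worker count. Here's a minimal sketch of that pattern (hypothetical, not any actual engine's code - the class, names, and the 8-worker cap are all just illustrative):

```cpp
#include <algorithm>
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// A worker pool with a fixed thread count, decided once at startup.
class JobSystem {
public:
    explicit JobSystem(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { WorkerLoop(); });
    }
    ~JobSystem() {
        { std::lock_guard<std::mutex> lock(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void Submit(std::function<void()> job) {
        { std::lock_guard<std::mutex> lock(m_); jobs_.push(std::move(job)); }
        cv_.notify_one();
    }

private:
    void WorkerLoop() {
        for (;;) {
            std::function<void()> job;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !jobs_.empty(); });
                if (done_ && jobs_.empty()) return;
                job = std::move(jobs_.front());
                jobs_.pop();
            }
            job();
        }
    }

    std::vector<std::thread> threads_;
    std::queue<std::function<void()>> jobs_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    unsigned hw = std::thread::hardware_concurrency();
    if (hw == 0) hw = 8;  // the API may report 0 when it can't tell

    // The CPU may expose 24 or 32 hardware threads, but an engine tuned
    // around ~8 workers never schedules anything onto the rest of them.
    unsigned workers = std::min(hw, 8u);
    std::printf("hardware threads: %u, workers actually used: %u\n", hw, workers);

    JobSystem jobs(workers);
    for (int i = 0; i < 100; ++i)
        jobs.Submit([i] { (void)i; /* stand-in for per-frame work */ });
}   // destructor drains the queue and joins the workers
```

However many e-cores the chip has, a pool like this only ever keeps its fixed set of workers busy.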
It’s not that they don’t work.
Basically what you'll see is kinda like a cache miss, except the stall to go 'oops, don't have that' and fetch the required bits is very slow. So you can see 8GB cards getting 20fps and 16GB ones getting 40 or 60, simply because the path to fetch the missing textures is fucking slow.
And worse, you’ll get big framerate dips and the game will feel like absolute shit because you keep running into hitches loading textures.
It's made worse in games where you can't reasonably predict what texture you'll need next (e.g. Fortnite and other online games that are, you know, played by a lot of people). But even in games where you can reasonably guess, you still run into the simple fact that textures in a modern game are higher quality and thus bigger than the ones you had 5 years ago, so 8GB in 2019 and 8GB in 2025 are not equivalent.
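To make the "cache miss" picture concrete, here's a minimal sketch of what a texture-streaming miss roughly looks like (hypothetical names and structure, not any real engine's streamer): when the wanted mip isn't resident in VRAM, the renderer draws whatever blurry mip is resident and kicks off a slow copy over PCIe in the background, which is exactly the pop-in and hitching described above.

```cpp
#include <cstdint>
#include <cstdio>
#include <future>
#include <mutex>
#include <unordered_map>

constexpr int kLowestMip = 8;  // blurriest fallback, assumed always resident

class TextureStreamer {
public:
    // Returns the mip level the renderer should sample this frame.
    // 0 = full resolution; larger numbers are progressively blurrier.
    int RequestMip(std::uint32_t textureId, int wantedMip) {
        std::lock_guard<std::mutex> lock(m_);

        auto it = residentMip_.find(textureId);
        int resident = (it != residentMip_.end()) ? it->second : kLowestMip;
        if (resident <= wantedMip)
            return wantedMip;  // "hit": the data is already sitting in VRAM

        // "Miss": the full-quality data has to come across PCIe from system
        // RAM (or disk), which is far slower than a VRAM read. Kick that off
        // in the background and draw the blurry mip for now - this fallback
        // is the texture pop-in and hitching you see on 8GB cards.
        if (!pending_.count(textureId)) {
            pending_[textureId] = std::async(std::launch::async,
                [this, textureId, wantedMip] {
                    // stand-in for the slow upload
                    std::lock_guard<std::mutex> lock(m_);
                    residentMip_[textureId] = wantedMip;
                });
        }
        return resident;
    }

private:
    std::unordered_map<std::uint32_t, int> residentMip_;  // best mip per texture in VRAM
    std::unordered_map<std::uint32_t, std::future<void>> pending_;
    std::mutex m_;
};

int main() {
    TextureStreamer streamer;
    // First request misses and falls back to the blurry mip; later frames
    // would get mip 0 once the background upload has landed.
    std::printf("frame 1 samples mip %d\n", streamer.RequestMip(42, 0));
}
```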
Running out of VRAM cripples a GPU that could otherwise perform substantially better, all to save a relatively small amount of BOM cost. They're trash, and should all end up in the trash.