Doesn't make it any less of an issue though.
I tend to side with you on this, although I think the responsibility of the devs versus Nvidia (& AMD) is a nuanced debate.
That’s the games fault.
In addition, developers are prioritising eye candy through heavier textures instead of gameplay mechanics to sway consumers.
I'm referring to 3080 owners (some of whom frequent this very thread) that can't play some games at 1440p without stuttering or turning down settings. Let alone 4K.
As a 3080 owner, I'm curious to know what games stutter a 3080 at 1440p.
For the games the Mrs and I have played on our machine on an ultrawide 3440x1440, the 3080 is chugging along nicely on max settings. I'm guessing they're not GPU heavy? – Forza Horizon 5, Horizon Zero Dawn, Diablo 4, A Plague Tale Requiem, Star Wars: Fallen Order, to name a few.
Quickly reading back, it seems 10gb is small nowadays? Eesh.
However, I see Starfield as a game that'll likely make me think a 3080 ain't good enough lol.
developers are prioritising eye candy through heavier textures
It's a little bit worse than that:
- They're using higher-resolution textures (higher than the console versions in most cases), often just enough to push memory use beyond 12GB but under 16GB. Oddly, the console version of the textures is missing as an LoD, even though those would often fit in 10-12GB of VRAM, given the consoles have 16GB of total RAM which must fit the system, application, game data and textures.
- Doing a poor job of level of detail, both in having too few texture versions (high, then a big jump to potato vision) and in loading excessive texture detail for distant objects that can't possibly display the full resolution.
- Unusually poor texture memory management
- Not using Nvidia and Microsoft technologies to reduce texture memory consumption.
I've suspected for a while that this is not an accident, particularly for some titles, but with the suspected DLSS blocking, I think it's getting harder to pass off as a coincidence. I don't know if developers are agreeing to it like DLSS, or if they're using a vendor library that just "happens" to have these characteristics (hardly the first time).
For the games the Mrs and I have played on our machine on an ultrawide 3440x1440, the 3080 is chugging along nicely on max settings. I'm guessing they're not GPU heavy? – Forza Horizon 5, Horizon Zero Dawn, Diablo 4,
I have a 5800X3D, 32GB RAM and a 10GB 3080 at the same resolution. Diablo 4 definitely runs out of VRAM on max textures and requires them to be turned down to prevent stuttering.
As a 3080 owner, I'm curious to know what games stutter a 3080 at 1440p.
For the games the Mrs and I have played on our machine on an ultrawide 3440x1440, the 3080 is chugging along nicely on max settings. I'm guessing they're not GPU heavy? – Forza Horizon 5, Horizon Zero Dawn, Diablo 4, A Plague Tale Requiem, Star Wars: Fallen Order, to name a few.
I play heaps of games on my 3080 10GB. Forza Horizon 5, Diablo 4, RE4 Remake, FF7 Remake, Guardians of the Galaxy, hundreds of games on Steam and Xbox Game Pass.
Everything prior to 2023 ran perfectly at 1440p, without DLSS. In some games I could even downsample, rendering at 4K and outputting at 1440p for more sharpness. But then in 2023, the two that I played heaps of showed issues with 10GB at 1440p if I maxed them out. RE4 Remake was one.
Diablo 4 is getting even more interesting. I tried turning off DLSS yesterday and setting textures to High instead of Max. It was HORRIBLE. Stuttering here and there, and the worst was 2 seconds of stuttering which ended with my character being killed in a Helltide. I know others have said it could be internet lag, but no, everything stopped, including my mouse cursor and the Chrome window I had open on my other monitor. Turning on DLSS Quality, things improved, but it still happens at a much rarer rate, usually after an hour or two. Looks like for problem-free gaming with D4, I need to try DLSS Quality and medium textures.
That’s the games fault.
Not my problem, all I want to do is play the game.
although I think the responsibility of the devs versus Nvidia (& AMD) is a nuanced debate.
AMD & Nvidia liaise with devs in advance to know what the requirements for things like VRAM amounts are going to be in the near future. Current-gen consoles have 16GB total, and about 13GB of that (give or take 1GB) is usable as VRAM for games, so it's a given that 12-16GB of VRAM is becoming a target over time.
They then take whatever the developers say they think they're going to need and arrange the amounts optimally to ensure the most cash is extracted from the buyer of the GPU at a given performance level (e.g. the x70-class buyers). Basically, it's the devs' job to let the people that make the GPUs know how much they're going to need, and then it's AMD/Nvidia's job to supply that much somewhere in the product stack.
Or you could be like Nvidia, and not care about your x80 class buyers, sell them a 10GB GPU only to recommend they sidegrade to the 12GB version later on. :)
developers are prioritising eye candy through heavier textures
^ Also this.
I'm curious to know what games stutter a 3080 at 1440p.
I don't have one myself, but Sleepycat and other 3080 owners have posted in here various times over the last few months about which games run into issues. I know Diablo 4 is a notorious VRAM hog (though they've stopped it eating 28GB of my system RAM while I'm playing, so there's that).
However, I see Starfield as a game that'll likely make me think a 3080 ain't good enough lol.
I do indeed see that being the case for a lot of people. I think Starfield is going to mean those that invested in 12GB GPUs like the 6700 XT, 3060 12GB and 3080 12GB can get just a bit more out of their purchase than <10GB buyers can.
I have a 5800X3D, 32GB RAM and a 10GB 3080 at the same resolution. Diablo 4 definitely runs out of VRAM on max textures and requires them to be turned down to prevent stuttering.
Diablo 4 is getting even more interesting.
Interesting.
Mrs has everything on ultra, with HD textures and, I believe, DLSS Quality. 99+% flawless.
The only stuttering she ever experiences generally lasts under a second, and only after teleporting to a town. Everywhere else, she says, is perfectly smooth.
- 7950X, 64gb RAM, 10gb 3080.
Interesting.
https://www.youtube.com/watch?v=2Rl6sFoeOSU
Digital Foundry found that only 16GB GPUs can match the visual quality of the PS5 version of D4 (5:15 onwards). Tons of reports all over Reddit from people without 16GB GPUs complaining of VRAM stutter as well.
Perhaps she's not noticing it, in which case it's arguably not an issue.
Digital Foundry found that only 16GB GPUs can match the visual quality of the PS5 version of D4 (5:15 onwards). Tons of reports all over Reddit from people without 16GB GPUs complaining of VRAM stutter as well.
I'll have to give that a watch when I get home, but from my quick skim through, it seems that's for 4K gaming.
Perhaps she's not noticing it, in which case it's arguably not an issue.
Possible reason.
but from my quick skim through, it seems that's for 4k gaming.
The video mostly focuses on 4K; I linked it because ultrawide 1440p is many more pixels than regular 1440p. 4K is around 8.3 million pixels, 1440p is around 3.7 million, and ultrawide 1440p is around 5 million.
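Those counts are easy to sanity-check; a quick sketch (pure arithmetic, no assumptions beyond the standard resolutions):

```python
# Pixel counts for the resolutions being compared.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,            # 3,686,400 (~3.7 million)
    "ultrawide 1440p (3440x1440)": 3440 * 1440,  # 4,953,600 (~5.0 million)
    "4K (3840x2160)": 3840 * 2160,               # 8,294,400 (~8.3 million)
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.1f} million pixels")
```

So ultrawide 1440p pushes roughly 34% more pixels than regular 16:9 1440p, which is why VRAM pressure there sits between plain 1440p and 4K.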
If I Google "Diablo 4 VRAM usage 1440p", I get a ton of results from people posting on Blizzard's forums and Reddit about how 8 and 10GB of VRAM isn't enough for 1440p. The game definitely has some quirks though, given it recommends 32GB of system RAM when you increase the texture quality...
given it recommends 32GB of system RAM when you increase the texture quality...
That's fast becoming the norm for a lot of games now too :)
It used to be that the only game which really needed that much RAM was Minecraft.
Gawd, I haven't had a daily use computer with less than 32GB RAM in over a decade.
Playing certain Minecraft modpacks basically requires allocating 16GB of RAM to Java for buttery-smooth gameplay.
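For reference, that allocation is usually done with JVM heap flags when launching; a typical line looks something like this (the jar name is illustrative, and 16G assumes you actually have the RAM to spare):

```shell
# -Xms sets the initial Java heap, -Xmx the maximum.
# Pinning both to 16G avoids heap-resize hitches mid-session.
java -Xms16G -Xmx16G -jar minecraft_server.jar nogui
```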
AAA games sure demand a lot: 100GB+ file sizes, and VRAM approaching enterprise/data centre GPUs.
Wouldn't be surprised if one day we have consumer 48GB cards like the RTX 6000.
But I'm really curious as to where the VRAM is going in Diablo 4 – I mean, visually it's nothing to write home about – its art style is what holds it up for me personally.
It used to be that the only game which really needed that much RAM was Minecraft.
Good times.
That's fast becoming the norm for a lot of games now too :)
Not really much anyone can do about it either. People have always bought PCs with the specs in mind that the game(s) they play recommend; it's just that those spec requirements are shifting.
8GB of system RAM was plenty for quite a long time: you'd have 4GB for your games and 4GB for Windows, and everything was happy. Then 16GB was "more than enough" as games started to get larger, and now you've got each new CoD eating almost 16GB of system RAM on its own at times, so 32GB is becoming the requirement for some systems.
Wouldn't be surprised that one day we have Consumer 48GB Cards like the RTX 6000.
Probably only a matter of time. We're already halfway there with 24GB GPUs.
But I am really curious as to where the VRAM is going to in Diablo 4 – I mean visually its nothing to write home about – Its Art Style is what holds it up for me personally
It likely has something to do with how much the engine is keeping in RAM (both VRAM and system RAM) instead of unloading and reloading things when you traverse between areas. One game I play, Fallout 76, will unload everything in use from the previous area when you enter a loading screen to go into a new area, e.g. entering a building from the open world, and then starts filling up VRAM and system RAM until it's done (which is when the loading screen finishes). I can watch it do this in real time in Task Manager.
Whereas Diablo 4 seems to keep as much as possible on tap. Not necessarily a bad thing if managed well, but the game definitely isn't managing it well at times. Ultimately there's nothing you can do about this either, other than have a system that performs well enough for your standards.
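If you want to watch this behaviour yourself outside Task Manager, a rough sketch (assuming an Nvidia card with `nvidia-smi` on the PATH) is to poll the driver's reported VRAM usage while the game runs:

```python
import subprocess
import time

def read_vram_mb(query_output: str) -> tuple[int, int]:
    """Parse 'used, total' MiB values from nvidia-smi CSV output."""
    used, total = (int(x) for x in query_output.strip().split(","))
    return used, total

def poll_vram(interval_s: float = 1.0) -> None:
    """Print VRAM usage once per interval, like watching Task Manager."""
    while True:
        out = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        used, total = read_vram_mb(out)
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(interval_s)
```

Run `poll_vram()` in a terminal while playing: a steady climb that never comes back down after area transitions is the "keep everything on tap" pattern described above.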
If I Google "Diablo 4 VRAM usage 1440p", I get a ton of results from people posting on Blizzard's forums and Reddit about how 8 and 10GB of VRAM isn't enough for 1440p
True, but I've also seen people on the Blizzard forums complaining their 4090 runs out of VRAM in that game too.
Interesting.
Mrs has everything on ultra, with HD textures and, I believe, DLSS Quality. 99+% flawless.
The only stuttering she ever experiences generally lasts under a second, and only after teleporting to a town. Everywhere else, she says, is perfectly smooth.
- 7950X, 64gb RAM, 10gb 3080.
Mine's a 5900X, 64GB. The only difference would be that I run two monitors and have Chrome on the other. When just in Windows with my Chrome window open, it uses 1.04GB of VRAM. When Diablo 4 is running, it uses everything, going up to 9.8GB and higher (based on GPU-Z).
I get the teleporting-to-town stutter too, but mine happens when there's loads of action, such as during a Helltide. When I'm in a dungeon, the stutter is less frequent than when in the open world.
True, but I've also seen people on the Blizzard forums complaining their 4090 runs out of VRAM in that game too.
D4 uses less than 8GB with texture settings below "epic". I can't use epic because it crashes all the time, but at that setting it uses about 12GB (4080) for a while, though it seems to creep up a lot (issues freeing memory?).
Frankly, the visual difference is barely apparent, and the issues are more the game's fault than the card's if you ask me. There seem to be a lot of really badly done games lately.