Does Nvidia's RTX 30 series have too little VRAM?

Read more about Godfall using 12GB of VRAM at 4K Ultra settings.

That's a lot of VRAM. Should be a gorgeous game. If this becomes the norm, the 3070/3080 will, as Alien has always said, become inferior products, even if only in absolute terms.
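For a sense of scale on where a 12GB figure at 4K might come from: the resolution itself accounts for a surprisingly small slice, and most of the budget goes to streamed textures and assets. A rough back-of-envelope sketch in Python, using assumed, illustrative buffer formats and counts rather than anything from Godfall's actual pipeline:

```python
# Back-of-envelope VRAM arithmetic. Assumptions: RGBA8 render targets
# and a six-target G-buffer -- illustrative numbers only, not Godfall's
# actual rendering pipeline.
width, height = 3840, 2160              # 4K output
bytes_per_pixel = 4                     # RGBA8

target_mib = width * height * bytes_per_pixel / 2**20
print(f"One 4K render target: {target_mib:.0f} MiB")    # ~32 MiB

gbuffer_mib = target_mib * 6
print(f"Six-target G-buffer:  {gbuffer_mib:.0f} MiB")   # ~190 MiB

# Even adding depth, HDR and post-processing buffers, the
# resolution-dependent memory is a few hundred MiB; a 12GB budget
# is dominated by streamed textures, not the 4K framebuffers.
```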
I wonder if it's a deliberate attempt by AMD to outdo Nvidia.
The problem that AMD has now will be normalising higher levels of VRAM usage.
I believe Godfall's 12GB VRAM requirement for 4K is just an artificial amount. This AMD-sponsored/bundled game is clearly promoting the 16GB RX 6000 cards over the 10GB RTX 3080.
One thing we never see on PCs is optimisation like the consoles get later in their life. Like, if you go back and compare a launch title to one from near the end of a console's life, the launch titles look terrible and don't run much better.
Sadly, GPU technology in PCs is never around long enough.
I don't think that's quite true. If you took, say, a HD7870XT, it would run a lot of 2019 games at 1080p about as well as it ran 2013 games, while the games would look a lot better too. A lot of the progress comes from better techniques generally, and from making better use of the overall philosophy of architectures/APIs, rather than from fully hardware-specific optimisations. Usually, when you give a dev options, they will take them, tbh.
Nvidia making a mistake with VRAM? Noooo, *cough* 970, *cough* 780.
First things first. It's not a mistake. AT ALL. The 970 wasn't a mistake. It was completely deliberate. Everything they do is deliberate. They knew exactly what they were doing with the 970, and they know exactly what they are doing now.
Oh, I know it's deliberate. I'm still mad that they gimped the 780 with 3GiB rather than 4GiB; my card went from top end to irrelevant overnight because of that.
Two questions:
- How much VRAM does a current game at 4K actually use?
- Does Godfall support DLSS?
Since I'm still playing at 1080p and will most likely only upgrade to higher refresh rates rather than a higher resolution, I'd have no problem with Nvidia's offerings, and my 1070 Strix of course doesn't even get to half of its 8GB of VRAM.
Now, if Godfall supports DLSS, then 4K gaming would be no problem even with 10GB of VRAM, right? Or did I get something wrong there? If I didn't, then I see absolutely no problem, especially since I haven't once seen my VRAM completely used up.
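For anyone who wants to check actual usage on their own card, here's a minimal sketch using NVIDIA's NVML bindings (the third-party pynvml package is my assumption here, not something from the thread). Note that this reports memory *allocated* on the GPU, which can be noticeably higher than what a game actively needs:

```python
# Minimal sketch: query total vs. used VRAM via NVML (pynvml package).
# Requires an NVIDIA GPU and driver; "used" means allocated on the
# device, not what the running game strictly requires.
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU
mem = nvmlDeviceGetMemoryInfo(handle)       # sizes in bytes
print(f"used {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
nvmlShutdown()
```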
It doesn't support DLSS. And it depends on the game, TBH. DOOM Eternal can use well over 8GB at 4K max settings.
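On why DLSS would ease the VRAM squeeze: it renders the scene at a lower internal resolution and upscales to the output, so resolution-dependent buffers shrink, although textures are still loaded at full quality. A quick sketch using the commonly cited per-axis scale factors (approximations, not an official spec):

```python
# Approximate DLSS internal render resolutions for a 4K output.
# Scale factors are commonly cited per-axis ratios, not an official API.
out_w, out_h = 3840, 2160
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
for name, scale in modes.items():
    print(f"{name:12s} renders at {round(out_w * scale)}x{round(out_h * scale)}")
# Quality      renders at 2560x1440
# Balanced     renders at 2227x1253
# Performance  renders at 1920x1080
```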