8GB is no longer enough VRAM for High-end 1440p Gaming - AMD Claims

"While AMD's data shows VRAM allocation and not VRAM requirements"

Does it, though? Nowhere in the video, or in anything else I have seen, is that actually claimed.

Looks more like requirements to me, personally. Games just claim all the memory available because nothing else will really be used; gamers run one game at a time anyway. We don't really have good tools to see how much of that memory is actually being used, as opposed to merely allocated.
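For what it's worth, the per-process figure that typical monitoring tools report is the driver's allocation counter, not a measure of what the game actually touches each frame. A minimal sketch of the kind of query they make, assuming an NVIDIA card and the pynvml bindings (my choice of tool, not something from the video):

```python
# Sketch: read the VRAM counters NVML exposes (pip install nvidia-ml-py).
# Note: usedGpuMemory is the driver's per-process *allocation* figure; NVML has
# no notion of how much of that allocation is actively used per frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")

# Per-process graphics allocations (may be unavailable on Windows/WDDM).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    allocated_mib = (proc.usedGpuMemory or 0) / 2**20
    print(f"PID {proc.pid}: ~{allocated_mib:.0f} MiB allocated")

pynvml.nvmlShutdown()
```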
 
More is more?

Seems to me that more VRAM requires more bandwidth, or the total time to read through the VRAM will increase, and that will lead to increased frametimes.

Infinity Cache will only help with repetitive access, so it can only make more efficient use of the bandwidth, not really increase it.
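As a rough back-of-the-envelope illustration of the "time to read through the VRAM" point (my own sketch, with illustrative card specs, not anything from AMD's slide):

```python
# Back-of-the-envelope sketch: how long one full pass over resident VRAM takes.
# Bandwidth figures are the public specs for these cards; the assumption that a
# frame actually touches most of what is resident is the post's premise, not a fact.
def sweep_time_ms(resident_gb: float, bandwidth_gb_per_s: float) -> float:
    """Milliseconds to read `resident_gb` of data once at the given bandwidth."""
    return resident_gb / bandwidth_gb_per_s * 1000.0

cards = [
    ("RTX 3070 (8 GB @ 448 GB/s)", 8, 448),
    ("RX 6800 (16 GB @ 512 GB/s)", 16, 512),
]
for name, vram_gb, bw in cards:
    print(f"{name}: full sweep ~= {sweep_time_ms(vram_gb, bw):.1f} ms")

# Output: ~17.9 ms vs ~31.3 ms. A 60 FPS frame budget is ~16.7 ms, so sweeping a
# larger pool every frame only fits a 30 FPS budget (~33.3 ms), unless most of
# the extra data is cold and rarely touched.
```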

The consoles don't even have Infinity Cache, and the Xbox Series X's "fast" VRAM pool is only 10 GB, so they will be just as limited by bandwidth. It remains to be seen how high VRAM usage will actually climb this generation; the PS5 looks best positioned to use the most VRAM at 30 fps.

So I think AMD's VRAM revolution is really a revolution for 30 fps gaming.
 
Horizon Zero Dawn, 1080p maxed out: 5.5 GB in the menus and 11.5 GB in the in-game benchmark.

Horizon Zero Dawn, 4K maxed out: 5.5 GB in the menus and 13.0 GB in the in-game benchmark.
 
"Seems to me that more VRAM requires more bandwidth, or the total time to read through the VRAM will increase, and that will lead to increased frametimes. Infinity Cache will only help with repetitive access, so it can only make more efficient use of the bandwidth, not really increase it."

This isn't really true. With predictive caching and prefetching, a very mature concept refined alongside modern CPU branch predictors, you DO increase the effective average bandwidth on general access patterns, not just repetitive ones.
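A toy way to see the idea (purely illustrative; this is a generic next-line prefetcher, not a model of how Infinity Cache actually works): even on a streaming access pattern with no reuse at all, prefetching turns most demand requests into cache hits, so the client stops waiting on DRAM and the bandwidth it observes goes up.

```python
# Toy simulation: an LRU cache with and without a naive next-line prefetcher,
# fed a purely streaming (non-repeating) access pattern. Line size, cache size
# and the policy are arbitrary choices for illustration only.
from collections import OrderedDict

def run(addresses, cache_lines=256, prefetch=False):
    cache = OrderedDict()              # maps line tag -> None, kept in LRU order
    hits = dram_fetches = 0

    def install(tag):
        nonlocal dram_fetches
        dram_fetches += 1
        cache[tag] = None
        cache.move_to_end(tag)
        if len(cache) > cache_lines:
            cache.popitem(last=False)  # evict least recently used line

    for addr in addresses:
        tag = addr // 64               # 64-byte cache lines
        if tag in cache:
            hits += 1
            cache.move_to_end(tag)
        else:
            install(tag)               # demand fetch: the client waits on this
        if prefetch and (tag + 1) not in cache:
            install(tag + 1)           # speculative fetch of the next line

    return hits / len(addresses), dram_fetches

stream = range(0, 64 * 100_000, 64)    # every line is new; nothing is "repetitive"
for pf in (False, True):
    hit_rate, fetches = run(stream, prefetch=pf)
    print(f"prefetch={pf}: demand hit rate {hit_rate:.1%}, DRAM line fetches {fetches}")

# Total DRAM traffic is essentially the same either way, but with prefetching
# almost every demand access finds its line already resident, so the fetch
# latency is hidden behind useful work. That latency hiding is the "effective
# bandwidth" gain being described.
```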

The main issue I see with AMD's data is that, at the settings required to hit these VRAM numbers at 1440p, a lot of the games are probably running at around 30 FPS on an RX 6700 / RTX 3070 class card anyway.

There's no way you're actually going to play something like Warzone at those Ultra settings; that only really applies if you stick to single-player games.
 
"The main issue I see with AMD's data is that, at the settings required to hit these VRAM numbers at 1440p, a lot of the games are probably running at around 30 FPS on an RX 6700 / RTX 3070 class card anyway. There's no way you're actually going to play something like Warzone at those Ultra settings; that only really applies if you stick to single-player games."

This is what I think.

AMD marketed the Fury X as a 4K card. It had the bandwidth at the time to support that (and the VRAM amount was holding up at the time), but not the raw GPU horsepower, so the pitch rang hollow. The situation is flipped here: a 3060 Ti with 8GB of VRAM is sufficient because, in most cases, you will run into GPU horsepower limitations before you run into VRAM limitations.

There may be outliers down the line that betray this, but as with all cases of future games on older hardware, if you just turn the settings down a notch or two you regain your performance. I know some refuse to do this, and that's OK; they will have to buy the very best of the best whenever it is available, in other words, every new generation. I personally don't mind turning a setting or two down, because I'd rather spend less and still have a great experience.
 
I disagree from experience.

The Fury X was plenty fast enough for 4K at the time. I know because I had one (and then two, but let's not go there) and it had no problem dishing out games at 4K, which back then meant new titles like Fallout 4, Dying Light and so on. It had plenty of horsepower to get you 40 FPS or more, which was about the expectation for a single GPU at 4K at the time. Times were very different.

What I do disagree with are the comments about it having enough VRAM. It did not, and again, I experienced this first hand quite a few times. The main culprit was BLOPS II: it would run out of VRAM, start crawling and then crash. Like, hard-lock the rig. Activision "fixed" this by introducing a sniffer for the Fury X which disabled the "Extra" detail setting, thus alleviating the issue with the Fury X, if you were happy with sub-par graphics compared to the 980 Ti, which did have enough VRAM to handle it. You could hack it back into action quite easily, but the same problem persisted.

AMD did eventually "fix" this, of course. By "fix" I mean copy Nvidia's earlier trick of streaming the overflow into your paging file. That stopped the crashes, but it would net you single-digit FPS in the affected areas, sometimes as low as 4. Not all areas of the game suffered from this, just certain parts of certain levels.

Now we move on to 8GB, which seems to be the mainstream amount for 1440p cards. It's not enough. This has already been shown conclusively across several games, even at 1440p: the 3070 should remain on par with the 2080 Ti at all times, but it does not, and that means the 2080 Ti (with its 11GB) is still a semi-capable 4K card. How long that will remain true? IDK.

Now, at the suggested prices for these 30-series cards it wouldn't be a huge deal. However, when people are scalping the 3070 for as much as a 2080 Ti would have cost early last year? Yeah, then we run into issues.

As for reducing settings? Sure, everyone can do that. However, even at the retail prices for these cards I would not want to, especially when the reason I'm doing it is that Nvidia were being tight. They still want top dollar for their cards, but they are *never* going to give you the performance for the timeframe you would expect. Spending £600+, you would usually expect to get a whole generation out of your card (so you can skip the next one), which means four years. Can you honestly say that 8GB will last four years, with the new consoles out and next-gen graphics on the way?

It's very easy for someone to sit there and say it is not a problem when they have never experienced the problem themselves. If cancer does not run in your family you might think you have little reason to worry; that does not mean you won't get it yourself.

And yes, I totally agree that surplus VRAM does nothing for you. It doesn't even make your e-peen look bigger, because it does absolutely nothing for performance. However, when it is just not there and you need it, you are totally screwed.

TBH, this should not even be a discussion. At the prices being charged for these GPUs, especially AMD's, which lack features, this should never, ever be an issue; at least in AMD's case they have made sure it won't be. Nvidia, on the other hand, are starting to play the game of trying to make you buy *every* single new refresh, which IMO sucks ass.

If, say, the 3070 soon reaches the point where 8GB is not enough and that becomes gospel among gamers, it won't be worth a toss. Which is exactly what Nvidia want.

For many years now their second-hand GPUs have held onto a reasonable resale value, making them not such a terrible proposition. However, go and look at what the "useless" cards are worth now: the GTX 780, the 3GB 1060, etc. They ain't worth diddly.

And again, that is exactly what Nvidia want.
 