The game still reports that it is using more VRAM than you may actually have, yet nothing happens. You can crank everything up to 11 and nothing happens; it runs fine. Something about the RT implementation just doesn't work well.
That's because you can't measure VRAM usage with any real accuracy. You never could, and you can't now. All you can do is look at it and see that a chunk has been allocated to the OS and a chunk to the game, and cards with more VRAM will automatically allocate more. I used over 11GB on my 2080 Ti playing COD MW (the modern one) once.
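To be concrete about what "looking at it" actually means: nvidia-smi, or a few lines against NVML, will happily give you numbers, but those numbers are allocations, not what the game genuinely needs. A minimal sketch of that kind of query, assuming the pynvml bindings are installed:

```python
# Minimal sketch: querying VRAM through NVML via the pynvml bindings.
# Caveat from above: these figures are allocations carved out by the OS and
# by running processes, not what a game truly *needs* to run well.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total:            {mem.total / 2**30:.1f} GiB")
print(f"used (allocated): {mem.used / 2**30:.1f} GiB")
print(f"free:             {mem.free / 2**30:.1f} GiB")

# Per-process figures are also allocations -- a game on a bigger card will
# simply reserve more. usedGpuMemory can be None on some platforms.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    print(proc.pid, proc.usedGpuMemory)

pynvml.nvmlShutdown()
```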
Nvidia knows this too. If there were a way to scientifically prove it and call them out, they would change things. But they know there isn't. They also know how to see it for themselves, and they absolutely knew that putting 8GB on a card would force you to buy a new one sooner rather than later.
The problem is that every time this issue pops up on other forums I read, one doubter turns up. Then another. All of them defending their big green boss.
This is absolutely nothing new. It's just how ATI, and then AMD, sold people multiples of the same GPU and conned them into thinking it actually worked. It was exactly the same as this issue: every guy who said "Is it just me or is this a stuttering pile of ass?" was hit with ten people saying "No issues here, it must be your computer" and so on. What it came down to in the end was that some people were simply less likely to notice it, or, having spent a lot of their hard-earned money on it, were prepared to put up with it and live with it.
Hilariously, it was Nvidia themselves who did the calling out. They made a piece of software that could indeed prove, 100%, that Crossfire was a broken mess. It was called FCAT, and they sent a copy of it to Ryan Shrout, who now works in Intel's GPU division.
And what it showed, in reality, was runt frames, dropped frames and so on, at which point AMD was forced to change. They did not fix Crossfire at all until the 7990 came out, which means that not only had they sold MANY multi-GPU cards (6990, 5970, 4870X2, 4850X2, 3870X2 and so on), they had been ripping customers off all of that time.
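For anyone curious how FCAT-style analysis works: each rendered frame gets a coloured tag, the display output is captured, and you count how many scanlines of the captured output every rendered frame actually occupied. I don't have Nvidia's code, but a toy classifier along those lines might look like this (the 21-scanline runt cutoff is the commonly quoted figure, so treat it as an assumption):

```python
# Rough sketch of FCAT-style frame classification -- NOT Nvidia's code.
# Input: how many scanlines of the captured output each rendered frame
# occupied. A frame that never appears is "dropped"; one that fills only a
# sliver of the screen is a "runt" -- it bumps the FPS counter but adds
# nothing you can actually see.
RUNT_THRESHOLD = 21  # assumed scanline cutoff, per the usual FCAT write-ups

def classify(scanlines_per_frame):
    """Label each rendered frame as 'dropped', 'runt' or 'full'."""
    labels = []
    for lines in scanlines_per_frame:
        if lines == 0:
            labels.append("dropped")
        elif lines < RUNT_THRESHOLD:
            labels.append("runt")
        else:
            labels.append("full")
    return labels

# Hypothetical capture data: half the "frames" are runts, so the raw FPS
# number looks great while the animation you actually see does not.
sample = [540, 8, 520, 12, 0, 600, 5, 515]
print(classify(sample))
visible = sum(1 for lines in sample if lines >= RUNT_THRESHOLD)
print(f"{visible} visible frames out of {len(sample)} reported")
```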
They fixed it, the fix worked, and quelle surprise, the FPS tanked. That is because they removed all of the fake frames they had added in to make Crossfire look better than SLI, which, unlike Crossfire, actually did work properly.
So until someone actually comes up with a way to measure this and put it in front of people, Nvidia will continue doing this. And believe me, just like AMD knew Crossfire was a bust, Nvidia knows EXACTLY what cards are coming, how much VRAM they are going to put on them, what games are coming, how much VRAM those games are going to use and so on. To believe that a company like that would not know is extremely naive, and full of excuses.
Again, I personally feel right now that anyone saying this is not Nvidia's fault is in absolute denial. No one I have come across on the internet who has enough VRAM has had any major problems with this game whatsoever. In fact, the very fact that people are blaming the game's creator for this issue irks me. We all want better-looking games that will push our PCs. All of us. Yet when someone releases one, we whine and call it a broken mess.
I will repeat: all the developers can do is what some other companies have done. Work out roughly what each setting will use in VRAM and create a slider that gives you a very rough idea of what it will use on the GPU. Doom had one, Doom Eternal has one, GTA IV had one and so on. However, it doesn't change the fact that if you overcook it, your game will run like hot ass and/or crash. There is no stopping that, because VRAM is completely different to system RAM: it is faster, but it is also much cruder, with crap latency traded away for clock speed and bandwidth. Neither does the other's job well, basically, so spilling VRAM allocations over into system RAM was always a bad idea and something to be avoided at all costs. The very fact Nvidia invented and wrote a technology to work around it says it all. Texture streaming - look it up. It is very real.
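Just to show how crude such a meter is, here is a toy sketch of the idea. Every number in it is invented for illustration, not taken from any real game; a real engine would derive the figures from its actual asset pools:

```python
# Toy sketch of an in-game VRAM meter. The per-setting costs are made up
# purely for illustration; a real engine sums the sizes of its real pools.
GPU_VRAM_GIB = 8.0

TEXTURE_POOL = {"low": 1.5, "medium": 2.5, "high": 4.0, "ultra": 6.0}
SHADOW_MAPS  = {"low": 0.3, "medium": 0.6, "high": 1.0}
RT_BUFFERS   = {"off": 0.0, "on": 1.5}
FRAMEBUFFERS = {"1080p": 0.5, "1440p": 0.9, "4k": 2.0}

def estimate(textures, shadows, rt, resolution):
    """Sum the rough per-setting costs, the way an in-game meter does."""
    return (TEXTURE_POOL[textures] + SHADOW_MAPS[shadows]
            + RT_BUFFERS[rt] + FRAMEBUFFERS[resolution])

used = estimate("ultra", "high", "on", "4k")
print(f"estimated: {used:.1f} GiB of {GPU_VRAM_GIB:.1f} GiB")
if used > GPU_VRAM_GIB:
    # All the meter can do is warn you. Push past the budget anyway and
    # allocations spill into system RAM, at which point performance tanks.
    print("over budget -- expect stutter, pop-in or a crash")
```

The point of the sketch is the limitation, not the maths: the slider warns you, but it cannot stop an overcooked setting from spilling into system RAM.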