The Last of Us Part 1 has received a new PC patch

Warchild has a 3090Ti, so quite beefy.

I'm personally waiting to get this until after I move in 8 weeks. There should be some more updates out by then.

Oh OK, brill. Last time I remember he had a 2080 Ti Trio, was it? The one that did 2100.

The needed amount NOW is 10GB. The Xbox has that much to access, and the PS5 isn't far off. So you can immediately tell this is a PS5 conversion, and it will use 8GB with hardly any eye candy. Crank up just the textures for the PC version? Yeah, goodbye 8GB.

It won't get better, quite obviously, because we are now getting to the point in the dev cycle where PC games are converted from the latest console games (about three years into the "new" console's life, i.e. about right now).
 
I get 120fps maxed out in RE4. It's a shame that an otherwise capable card just doesn't have the VRAM, because otherwise it'd be fine. That's going to become more common in newer games, though.
 
The game runs fine even if you exceed the amount of VRAM that you have, as long as RT reflections are off. They are broken.

No game runs fine if you exceed the VRAM you have. It falls back onto your system RAM, which is slower in all of the wrong ways.

I would guess that with RT reflections on it just makes matters even worse.
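
For anyone curious, here's roughly how you can watch that spill happen. This is a minimal sketch using the Windows DXGI budget query (my own toy code, nothing from the game, and it has to run inside the process you care about): it reports how much of the app's video memory sits in dedicated VRAM versus shared system RAM.

```cpp
// Minimal sketch (Windows/DXGI): ask the OS how much of this process's video
// memory budget sits in dedicated VRAM ("local") versus shared system RAM
// ("non-local"). Once the non-local usage starts climbing, textures are being
// paged over PCIe - that's the "slower in all of the wrong ways" part.
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    IDXGIAdapter1* adapter1 = nullptr;
    IDXGIAdapter3* adapter3 = nullptr;   // IDXGIAdapter3 exposes QueryVideoMemoryInfo
    if (FAILED(factory->EnumAdapters1(0, &adapter1)) ||
        FAILED(adapter1->QueryInterface(IID_PPV_ARGS(&adapter3)))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, nonLocal = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    printf("Dedicated VRAM : %llu MB used of a %llu MB budget\n",
           local.CurrentUsage >> 20, local.Budget >> 20);
    printf("Shared sys RAM : %llu MB used of a %llu MB budget\n",
           nonLocal.CurrentUsage >> 20, nonLocal.Budget >> 20);

    adapter3->Release(); adapter1->Release(); factory->Release();
    return 0;
}
```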
 
Oh OK, brill. Last time I remember he had a 2080 Ti Trio, was it? The one that did 2100.

The needed amount NOW is 10GB. The Xbox has that much to access, and the PS5 isn't far off. So you can immediately tell this is a PS5 conversion, and it will use 8GB with hardly any eye candy. Crank up just the textures for the PC version? Yeah, goodbye 8GB.

It won't get better, quite obviously, because we are now getting to the point in the dev cycle where PC games are converted from the latest console games (about three years into the "new" console's life, i.e. about right now).


IMO all GPUs from the lower mid-range up should have no less than 12GB, what with memory getting cheaper and cheaper to produce, especially if you go with the more standard memory, i.e. the non-X/HBM types.
 
IMO all GPUs from the lower mid-range up should have no less than 12GB, what with memory getting cheaper and cheaper to produce, especially if you go with the more standard memory, i.e. the non-X/HBM types.

The memory cost was never the issue, bro. Never, even if Nvidia tried to make it sound that way by saying GDDR6X was expensive.

The issue is they want you to buy another GPU. There is no other way they can limit the lifetime of the card, and as many have piped up and said in the videos I have watched, "I am still running a 1080 Ti, nice that the memory finally comes in handy".

If the card isn't good at RT? People can disable that and get tons of extra performance. Or they can use DLSS and so on for more performance. How, as a company, do you then limit the lifetime with something they can't do much about? Easy: VRAM.
 
Well, there will come a time when 16GB is the needed amount, but I'm hoping that AMD's next round of cards pushes that up to something higher than their current 20-24GB.

If I'm going to upgrade I'll want it to last, so give me as much VRAM as you can.

System RAM is another issue coming: 16GB isn't going to be enough, and in some cases 32GB will become needed.

It's the normal way of things over the years, no different to the amount of storage needed for some games. UE5 is going to push systems hard sooner or later.

Still not sold on AM5 yet though, so I'm going to hold out on that for a while, but I do feel that when I upgrade it's going to be needed.

16GB can easily be used now. Easily. A guy demonstrated it on OCUK yesterday using UE5.

I have absolutely and utterly no doubt AMD will not skimp on VRAM. They have no reason to.

The problem here? Think game modding. As an example, and an old one: when Fallout 3 came out, Bethesda released it with very low-res textures. They did this because they cared enough to want the game to run properly on lower-tier GPUs. See also Fallout NV. At the time? Most people had 256MB of texture RAM, some had 512MB, and the odd lucky sod at the elite tier had 1GB. As such the game looked like ass.

About two months after launch they unofficially released a texture pack with the game looking how they wanted. And boy, at the time it was stunning. It then used 512MB easily. As time went on the real unofficial mod scene kicked in, and at one point you could use 2GB of VRAM easily.

It is no different here. If a game can use, say, I don't know, 10GB on a console? Everything you experience beyond that on a PC uses more. And it is nothing to do with it being a "console port". That term was always wrong, but it's even more wrong now. These are x86 games. All they are doing is getting them running on PC wrappers (DX, Vulkan or whatnot) and making them look better by adding VRAM-crunching textures. End of.

"Higher res textures never make a game look better" said absolutely no one ever.

And ray tracing? Just adds to that 10x. Not specifically in making a game look better, but in chewing up your VRAM like it was nothing.

Nvidia know all of this too well. As do AMD. So to me the last crumb of respect I have for AMD is that at least they are not making single gen disposable cards, so there is that.
 
I didn't say it was to do with porting, just that VRAM wasn't the only issue.

Nvidia know full well what they've done. They work with devs closely enough to know exactly what they've done, so a 3070, while capable, will have a short lifespan, as will any 8GB card.

I expect Nvidia will sort this out by the 5000 series. They should do, and most likely will, as they want people to upgrade. But these issues are going to become commonplace, and if those buyers of older cards don't see the proof in the pudding they are not going to be buying 8GB and 12GB cards; a 2080 Ti from a generation back has more legs on it.

The cost even with GDDR6X isn't massive, not when they are charging the prices that they do. Can't say I fully blame them, as the fan base lets them.

Still, my 6800 XT will last me long enough to decide what I want when the time comes, but for people that paid good money it's going to start hurting. Depends on your mindset really; some like low effects with high fps and so on, it just depends.

Me, I crank everything, always have, so I'm looking at 32GB of VRAM or basically double everything on my current card. If you're going to upgrade, make it worth it, not a 30% bump.
 
I see you using the term "port" a lot. And it is wrong, and always was, dude.

And yeah, of course Nvidia know what they are doing. Let's just hope it bites them on the bum hard. Sales are still terrible apparently, so there is hope yet.
 
"Port" is just a term, but as much as consoles are more like PCs than ever, the hardware variation is still massive. There isn't an excuse for such a release IMHO; hold it back and sort it.

Nvidia have always done these things and always will. No one is forcing them to change tactics; people keep buying them.

Still, my 6800 XT will be fine VRAM-wise for a while yet, which is more than can be said for some cards. I do hope to see 32GB of VRAM from AMD next time around, but I don't know; my focus is to save for a while, as it's going to cost me £1000 minimum.
 
The problem here? Think game modding.


With some texture mods I can easily get Cyberpunk to bump up to the 24GB on the 4090, and the stutter that happens is very jarring... Does it look much better? Nope. You need to pixel peep to see any real difference ^_^
 
With Cyberpunk on insane settings I get like 50fps; I don't know how much VRAM it's using, though, but some games do take the whole lot. When it comes to mods, I expect I'm pushing the limits on GTA V with the vast amount of mods I have, and maybe something like Skyrim can be pushed to the max and then some. TLOU and Resi 4 aren't modded, though, and there will be more games like them where the cards on the lower end VRAM-wise are going to struggle.

I think Stalker 2 is the other game this year where people are going to get angry if it has similar issues, though. Pretty much any UE5 game needs more of everything; my 5800X is going to get hit by those types of games, so AM5 is on the cards at some point.

I expected more issues with Hogwarts than I got; that seemed demanding but runs smoothly for me.
 
With some texture mods I can easily get Cyberpunk to bump up to the 24GB on the 4090, and the stutter that happens is very jarring... Does it look much better? Nope. You need to pixel peep to see any real difference ^_^

There will always come a point where it makes no difference. Thing is? Getting to that point? It clearly does. TLOU on ultra looks better than on lower settings. If it didn't, they wouldn't have called that one ultra and would not have included those textures.
 
No game runs fine if you exceed the VRAM you have. It falls back onto your system RAM, which is slower in all of the wrong ways.

I would guess that with RT reflections on it just makes matters even worse.
Dude, you don't get it. If you turn off RT, the game literally runs without a single hitch. I don't know if it doesn't report the amount of VRAM correctly, or if something else is happening. It is irrelevant. The game runs absolutely fine, like it's got another 10GB of VRAM to spare. RE4 is not one of the games that requires more than 8GB of VRAM. It requires a patch that fixes RT.
 
Dude, you don't get it. If you turn off RT, the game literally runs without a single hitch. I don't know if it doesn't report the amount of VRAM correctly, or if something else is happening. It is irrelevant. The game runs absolutely fine, like it's got another 10GB of VRAM to spare. RE4 is not one of the games that requires more than 8GB of VRAM. It requires a patch that fixes RT.

The game uses a lot of VRAM.
With RT it uses even more. Like, far more.

What's so hard to understand about that?

RE4 also uses a lot of VRAM. So does Hogwarts. The reason is simple: we are now nearly three years into the dev cycle of the modern consoles, meaning more games are going to come out that use what those consoles have to offer, and anything on top (higher-res textures, RTX etc.) is going to use that and then some.

Maybe if they fix it, it can use less VRAM, but I can tell you that is the cause, mostly because the forums I go on and read are full of people with very high-end GPUs and none of them have had any issues with the game.
 
The game uses a lot of VRAM.
With RT it uses even more. Like, far more.

What's so hard to understand about that?
The game still reports that it uses more VRAM than you might have, but nothing happens. You can crank it up to 11 and nothing happens. It runs fine. Something about the RT implementation doesn't work well.
 
The game still reports that it uses more VRAM than you might have, but nothing happens. You can crank it up to 11 and nothing happens. It runs fine. Something about the RT implementation doesn't work well.

That's because you can't measure VRAM usage with any real accuracy. You never could, and you can't now. All you can do is look at it and see that a chunk has gone to the OS and a chunk to the game. Cards with more will automatically use more, too; I used over 11GB on my 2080 Ti playing COD MW (the modern one) once.
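
To illustrate what I mean about only seeing rough chunks, here's a quick sketch using Nvidia's NVML library (my own toy code, Nvidia cards only, needs the NVML headers and library). It dumps what the whole card reports as "used" plus what each process has resident, and those are allocation numbers, not what a game strictly needs on any given frame, which is exactly why they prove nothing on their own.

```cpp
// Rough sketch (NVML, Nvidia only): whole-card "used" memory versus what each
// process has resident. These are allocations, not hard requirements.
#include <nvml.h>
#include <cstdio>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);

    nvmlMemory_t mem = {};
    nvmlDeviceGetMemoryInfo(dev, &mem);          // whole card: OS + every app + caches
    printf("Card total used: %llu / %llu MB\n", mem.used >> 20, mem.total >> 20);

    unsigned int count = 64;
    nvmlProcessInfo_t procs[64];
    if (nvmlDeviceGetGraphicsRunningProcesses(dev, &count, procs) == NVML_SUCCESS) {
        for (unsigned int i = 0; i < count; ++i)
            printf("  pid %u: %llu MB resident\n",   // may be unreported on some driver models
                   procs[i].pid, procs[i].usedGpuMemory >> 20);
    }

    nvmlShutdown();
    return 0;
}
```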

Nvidia know this too. If there was a science to totally prove it and call them out? They would change things. But they know there isn't. They also know how to see it for themselves, and they absolutely knew that putting 8GB on a card would make you have to buy a new one sooner rather than later.

The problem is that every time this issue pops up on other forums I read? One doubter turns up. Then another. All defending their big green boss.

This is absolutely nothing new. It's just like how ATI, and then AMD, sold people multiples of the same GPU and then conned them into thinking it actually worked. It was exactly the same as this issue. Every guy who said "Is it just me or is this a stuttering pile of ass?" was hit with ten people saying "No issues here, it must be your computer" and so on. What it came down to in the end was that some people were just less likely to notice it, OR, because they had spent a lot of their hard-earned money on it, they were prepared to put up with it and live with it.

Hilariously, it was Nvidia themselves who did the calling out. They made a piece of software that could indeed 100% prove that Crossfire was broken ass. It was called FCAT, and they sent a copy of it to Ryan Shrout, who now works in Intel's GPU division.

In reality it showed runt frames, dropped frames and so on, at which point AMD was forced to change. They did not fix Crossfire AT ALL until the 7990 came out. So that means that not only had they sold MANY multi-GPU cards (6990, 5970, 4870X2, 4850X2, 3870X2 and so on), they had been ripping customers off all of that time.

They fixed it, the fix worked, and quelle surprise, the FPS tanked. That is because they removed all of the fake frames they had added in to make Crossfire look better than SLI, which actually did work properly.

So until someone actually comes up with a way to measure this and put it into reality? Nvidia will continue doing this. And believe me, just like AMD knew Crossfire was a bust, they know EXACTLY what cards are coming, how much VRAM they are going to put on them, what games are coming, how much VRAM those games are going to use and so on. To believe that a company like that would not know is extremely naive. And full of excuses.

Again, I personally feel right now that anyone saying it is not Nvidia's fault is just in absolute denial. No one I have come across on the internet who has enough VRAM has had any major problems with this game whatsoever. In fact, the very fact that people are blaming the game creator for this issue irks me. We all want better-looking games that will push our PCs. All of us. Yet when someone releases one we whine and call it a broken mess.

I will repeat: all they can do is what some other companies have done. Work out what each setting will use in VRAM and create a slider to give you a very rough idea of what it will use on the GPU. Doom had one, Doom Eternal has one, GTA IV had one and so on. However, it doesn't change the fact that if you overcook it your game will run like hot ass and/or crash. There is no stopping that, because VRAM is completely different to system RAM. It is faster, but it is much cruder too; the latency is poor, traded away for clock speed. Neither does the other's job well, basically, so spilling VRAM allocations into system RAM was always a bad idea, and something to be avoided at all costs. The very fact Nvidia invented and wrote a technology for it says it all. Texture streaming - look it up. It is very real.
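
To give a feel for the back-of-the-envelope maths behind those sliders, here's a tiny sketch. The numbers are purely hypothetical and not from any engine: one texture is roughly width x height x bytes per texel, plus about a third extra for the mip chain, and it adds up fast once you stack 4K materials, which is exactly why texture streaming exists.

```cpp
// Hypothetical per-setting VRAM estimate, the kind of sum a settings slider does.
#include <cstdio>
#include <cstdint>

// bytesPerTexel: ~4.0 uncompressed RGBA8, ~1.0 BC7/BC5, ~0.5 BC1.
static uint64_t textureBytes(uint32_t w, uint32_t h, double bytesPerTexel)
{
    double base = double(w) * double(h) * bytesPerTexel;
    return uint64_t(base * 4.0 / 3.0);   // full mip chain adds roughly a third
}

int main()
{
    // Made-up "ultra" material: 4K albedo + normal + roughness maps, BC7 at 1 byte/texel.
    uint64_t perMaterial = 3 * textureBytes(4096, 4096, 1.0);
    uint64_t resident    = 150 * perMaterial;    // 150 such materials resident at once

    printf("One 4K material        : ~%llu MB\n", (unsigned long long)(perMaterial >> 20));
    printf("150 materials resident : ~%llu MB\n", (unsigned long long)(resident >> 20));
    // Roughly 64 MB each, ~9.4 GB for 150 - before meshes, buffers, render
    // targets and whatever RT adds on top.
    return 0;
}
```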
 
The problem is that every time this issue pops up on other forums I read? One doubter turns up. Then another. All defending their big green boss.
I'm not defending them, though. I'm literally saying that you don't necessarily have to spend your money on a new GPU just because your current GPU has 8GB of VRAM.
 
I'm not defending them, though. I'm literally saying that you don't necessarily have to spend your money on a new GPU just because your current GPU has 8GB of VRAM.

No, of course not :) You can lower the settings. Like I said, I do find them guilty of that; in-house testing should have shown them how much it uses.
 