Nvidia DLSS 4 is here, and it improves everything

WYP

News Guru

Nvidia has revealed DLSS 4, and it transforms every aspect of DLSS.


Read more about Nvidia's DLSS 4 technology.
 
I see this as not factually correct. You're getting higher frames, sure, but compare it to DLSS 4 turned off for clarity and accuracy.

Ain't no one going to convince me otherwise until I see facts, because sure, AI has its uses these days; I use it in different ways myself.

But they market DLSS 4, or any DLSS, as if it reflects real hardware performance, and that is 100% rubbish. None of the 5000 series cards are 2x the performance; they are 2x the fps with DLSS 4, but in terms of raw performance they are 30% faster at most, just like every generation since the dawn of the first GPU.
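
To put rough numbers on that (a back-of-envelope sketch; the 30% raw uplift and the frame generation multiplier are illustrative assumptions, not measured data):

```python
# Back-of-envelope: how a ~30% raw uplift becomes a "2x fps" headline.
# All numbers are illustrative assumptions, not benchmarks.

old_card_fps = 60.0          # hypothetical raw fps on the previous card
raw_uplift = 1.30            # assumed ~30% generational raw gain
frame_gen_multiplier = 2.0   # one AI-generated frame per rendered frame

rendered_fps = old_card_fps * raw_uplift             # 78 fps actually rendered
displayed_fps = rendered_fps * frame_gen_multiplier  # 156 fps on screen

print(f"Raw gain:      {rendered_fps / old_card_fps:.2f}x")   # 1.30x
print(f"Marketed gain: {displayed_fps / old_card_fps:.2f}x")  # 2.60x
```

Both numbers are real in a sense; they just measure different things, which is exactly why the comparison should be against frame generation turned off.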

Be happy with the placebo marketing if you want, but I don't agree. Sure, AI has its place in different ways, but I would honestly prefer they stick to the facts of the hardware and not the latest software gimmick.

So when you get round to testing, and Nvidia is begging you to put out DLSS 4 data and market it in your reviews and videos, well, we all know the real uplift is around 30%. I'd at least like to see that factual data, even if you need to do a separate video.

Because we all know the realities of marketing in this tech world. They have all been caught fudging the numbers before; that's why we look for good reviewers with hard data and not the latest marketing slide, from any company, not just Nvidia.

To be clear, I'm not attacking anyone; I'm simply stating things as I see them. Overall, though, this is not the worst pricing it could have been.
 
With Nvidia’s new DLSS Frame Generation model, they have boosted speed by 40% and reduced VRAM usage by 30%. For Warhammer 40K: Darktide, this results in a 10% boost in framerate and a 400MB reduction in VRAM usage. In other words, DLSS Frame Generation just got better.

40% boosted speed? When using frame generation, does that mean improved latency as well? I wish GPU manufacturers had more focus on that, and not just on getting more frames. Also, I wish Nvidia would stop locking new gaming-related software tech to next-gen GPUs.
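
For what it's worth, plain frame interpolation can't reduce latency: the in-between frame can only be built once the next real frame exists, so the pipeline has to hold that frame back. A very rough model (the numbers are my own assumptions, not Nvidia's):

```python
# Very rough latency model for interpolated frame generation (illustrative).
# The generated frame sits between two real frames, so the newest real frame
# must be held back (plus the generation cost) before anything is displayed.

render_ms = 1000 / 60   # assumed real render time: 60 fps ~= 16.7 ms/frame
gen_ms = 2.0            # assumed cost of generating the in-between frame

# Each (real + generated) pair takes one render interval plus the generation
# cost and yields two displayed frames.
displayed_fps = 2 * 1000 / (render_ms + gen_ms)

# Extra delay versus running without frame generation: roughly one held-back
# frame plus the generation time.
added_latency_ms = render_ms + gen_ms

print(f"Displayed: ~{displayed_fps:.0f} fps, added latency: ~{added_latency_ms:.1f} ms")
```

Nvidia pairs frame generation with Reflex to claw some of that back, but the displayed fps number by itself says nothing about responsiveness.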
 
The only areas I find interesting are the improved DLSS and the improved DLAA. Honestly, I couldn't give a crap about frame gen.
 
That bit when he said the 5070 is as fast as a 4090.

LMFAO.

I turned it off after that.
Who said that? It's not in the article.

EDIT: Ah, right, the CES keynote.
He says it with such pride as well. Even if it were actually true, what a huge slap in the face to everyone who bought a 4090 at that price.
 
It's never going to be that fast. Ever.

Maybe in one game, with all of the DLSS features on, and it being a new version that won't work on 4000-series cards, etc., etc.

It's also hobbled with 12GB. So basically, if it were as fast as a 4090 (it isn't), that would put it in the 4K bracket. But 12GB is nowhere near enough for 4K, so it's caveat emptor.

I'll never forget the 3070 hype. Faster than a 2080 Ti (it wasn't; it was level at best) and all of the people gloating about how stupid 2080 Ti owners were for paying that much, etc.

Yet ONE year later their cards were hobbled and being replaced due to not having enough VRAM even for 1440p.

Just give me AMD. Every time.

I watch lots of PUBG streamers (Halifax, Hollywood Bob, Choco Taco, SimplyMatthius etc) and their games crash several times during a stream. And they're all on Nvidia. Me? I can play for 5 hours without a single crash.

Besides, here we are again thinking all this performance they are boasting about will make games better (it won't) and give us better quality games (it also won't).

I bought one of the very few good games of 2024 (Indy Jones) and my 6950XT demolishes it at 1440p ultra. Which makes you wonder: what's the frigging point of dumping two grand on a 5090? None. Sure, it may run Alan Wake 2 very well, but that isn't even a good game. It's tiny in scale, confusing when you're trying to find where you need to go, and, most importantly, boring once you've seen the graphics for a few hours.
 
Unfortunately, AMD is following in their footsteps: the move to FSR4, with its improved features, will not apply to all FSR3-supported cards (as of now). Although understandable, it's a shame IMO.

Never used FSR. It ruins BR games and I'd rather lower the settings if and when I have to.

I don't frame count any more, dude. It's a mug's game. As long as a game is smooth enough to play, I couldn't give a fudge.

The AMD 6000 series was super focused on pure raster, so that's what you got. FSR didn't even exist when it first launched; it was just talked about.

I don't buy things based on hot air. And I won't either.

I also duly noted that the RT cores are apparently unchanged since the 4000 cards, which just backs up my belief that Nvidia has moved on from gaming and cares even less about gamers than it ever did (which was always very little).

The only card I'd ever consider is the 5070, and it's automatically off the buy list due to the 12GB. It's just more contempt, like the 3070 and all of their other mainstream cards. There's always a catch with those to make them useless after a year or two, which for over £600 is just a joke, and is why I now use a 6800XT, 6900XT and 6950XT.
 
I have a fairly senior role in marketing, and even as a marketeer I hate this s**t. We should be comparing apples to apples, not counting "fake" frames and calling it a performance boost, especially when it comes with a small jump in latency as well. Sure, the game might appear smoother, but it's not representative of raw horsepower, as far as I understand it.

Also... as the leader in GPUs, they should be making cards for 1440p and 4K. Sacrificing VRAM just to squeeze out that extra margin is typical. I know they are beholden to shareholders' interests, but I wish growth could go hand in hand with not taking the mick out of consumers.
 
I don't think 12GB at $550 is too, too bad. I mean, yeah, it sucks, and I wouldn't buy it or recommend it unless you were happy to turn down settings you shouldn't have to, but it's not as bad as other cards have been, and it's a little overblown, in my opinion, by the media and others. And realistically, a 5070 with a 256-bit bus and 16GB of VRAM would likely have been $650.

The issue at this stage of development is that the amount of VRAM is decided early on by the bus width, and the bus width is an integral part of the design and die size. Maybe Nvidia could have done a weird bus width for the 5070 and put 14GB in it, but the difference between 12 and 14 is not always going to be the difference between 'game not work' and 'game work'. As in many instances, adjusting a few settings can give you the performance you need with 12GB of VRAM at 1440p. That sucks, and I don't think gamers should have to do it, so I won't be recommending the 5070 to those who want to avoid tinkering and tweaking.

But still, I don't think it's 'DOA' or anything extreme like that. I just think Nvidia had a hard decision to make and chose to make a 'cheaper' card with compromises. We know well that if Nvidia hadn't compromised, they would have charged for it.
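
For context on the bus-width point (a sketch of how GDDR capacities are commonly derived; the chip densities here are illustrative assumptions): each GDDR chip sits on a 32-bit slice of the bus, so the bus width fixes the chip count, and total VRAM comes in steps of the per-chip density.

```python
# How bus width pins down VRAM options (standard GDDR layout; illustrative).
# Each GDDR chip uses a 32-bit interface, so chip count = bus width / 32,
# and total capacity = chip count * per-chip density.

def vram_options(bus_width_bits, densities_gb=(2, 3)):
    chips = bus_width_bits // 32
    return {f"{d}GB chips": f"{chips * d}GB total" for d in densities_gb}

for bus in (192, 224, 256):
    print(f"{bus}-bit bus -> {vram_options(bus)}")

# 192-bit -> 12GB or 18GB
# 224-bit -> 14GB or 21GB (the 'weird bus width' option)
# 256-bit -> 16GB or 24GB
```

That's why 12GB, a 'weird' 14GB, and 16GB map to 192-, 224- and 256-bit buses, and why the capacity question is really a die-design question.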

What I think Nvidia should have done, based on what I understand to be realistic and not just the classic 'I'm a consumer so I should always get what I want and will complain like a Karen when I don't', is this:

5060 - 12GB at around $400
5070 - 16GB at around $550
5070ti - 16GB at around $700
5080 - 24GB at around $900
 
It's not going to be 550.

You'll be looking at 600 minimum so same as last time. And no doubt the same performance uplift as last time too.

FE cards are almost a myth and for 600 you'll get le plastic fantastic from PNY or something.

The most fake thing about Nvidia are their RRPs. Surely you've learned that by now?
 

Yeah, if $550 is an illusory price, then sure, $600 for 12GB is just not worth it. If this compression technique of theirs were available in all games, maybe, but seemingly it's not.
 
It won't be. Once again they are putting the onus on devs, and why should they bother?

See also Indy Jones. RT or FO. Lots of vram or FO.

It's just yet more smoke and mirrors, isn't it? Just download more VRAM /rolls eyes.

It's deliberate as always. They want you queuing up for the 6070.

What puzzles me the most, right... the people who will buy the 5090 would have bought the 4090. What are they going to do with it? Play Alan Wake again?

I don't get it. Like, why do people actually buy these cards? I mean, that was a loaded question; I do get it. Their lives are so empty and meaningless that their egos need propping up due to their fragility. But like I said, it doesn't make games magically appear, does it? You're just stuck playing through the same ones *scratches head*.
 
BTW, going back to the AI VRAM thing.

If it was that good, why does the 5090 have so much?

I reckon it's a trap, tbh.
 