Huge RTX 5090 performance gains teased by reliable leaker

I shudder at the thought of the size of these colossal GPUs now. I know the picture is a placeholder, but if these cards keep growing, we'll soon be plugging the motherboard into the GPU, not the other way around :D
 
All aboard the hype train, choo choo!! 70% uplift in one generation? So is that Nvidia throwing cores at the problem? Have they found a nicer architecture? Or is it that the 2x through 4x generations were pretty crap and they've decided to let the 'A' design team have a play on the gamer designs this gen as a reward for the stellar Enterprise work they've done? :D
 
Well, we've got roughly a year and a few months until they release the new cards. A lot can happen in that time.

From what I've read, they're using advanced AI to help them design GPU architectures, which should let them get the most optimized designs.
 
Advanced AI?? AI needs to be led, needs to be trained. Here you go AI, design me a future-state GPU - go to the reddit/nvidia forum for specs...

Edit - I'm sorry, that sounded harsh, but I'm seeing AI tagged onto everything and anything. My initial premise still holds though - 70% is a big lift, but where? In raster? In RT? At what resolution?
 
Nvidia's AI is already fault-finding code and developing its own code to be more efficient. If it's 70% because of AI coding, I would actually believe it. AI has been advancing rapidly over the last two years, maybe more than I would like.
 
This isn't anything new. Ubisoft has done this for years. Still hasn't helped much. Not saying they aren't as advanced as Nvidia, but fixing code for a 70% improvement seems nigh-on impossible right now.
 
IMHO it has nothing to do with AI, it's to do with going to 3nm; the only reason they can't at the moment is that Apple has taken all of TSMC's 3nm supply for the first year.

The chip won't be any bigger, 900mm² at most, as beyond that you get tons more defects and power issues (rough numbers in the sketch below). They can only go so big, and in time they'll have no choice but to go chiplet.

70% seems possible, but I expect lower, and if/when Nvidia market it and say some BS like it'll be 2x a 4090, then expect much lower. They don't half talk some BS at times.

But yeah, 3nm is why.
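To put rough numbers on that defect point, here's a quick sketch using a simple Poisson yield model. The defect density is assumed purely for illustration (it is not a real foundry figure); the point is just how fast yield falls as the die grows:

```python
# Rough illustration of why huge monolithic dies get expensive fast.
# Simple Poisson yield model: yield = exp(-defect_density * die_area).
# The defect density below is an assumption for illustration only.
import math

DEFECT_DENSITY = 0.1  # defects per cm^2 (assumed, not a real foundry figure)

def poisson_yield(die_area_mm2: float, defect_density: float = DEFECT_DENSITY) -> float:
    """Fraction of dies expected to come out defect-free."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defect_density * area_cm2)

for area_mm2 in (300, 600, 900):  # mid-size die, 4090-class die, the ~900mm^2 ceiling above
    print(f"{area_mm2} mm^2 -> ~{poisson_yield(area_mm2) * 100:.0f}% defect-free dies")
```

Even with a fairly kind defect density, a ~900mm² die throws away a much bigger slice of the wafer than a mid-sized one, which is why the chiplet route eventually becomes hard to avoid.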
 
I wonder how much of that 70% is from using DLSS and Frame Gen; after all, nVidia, AMD and Intel these days are only interested in using Fake Frames to sell cards, not native rendering.
 
You may be speaking in hyperbole, but these companies can't keep cramming cores into tiny spaces. They have to use intelligent means to push performance. A frame is already a 'fake' frame. It's just digital information. I want FSR and DLSS and XeSS to succeed. I see no drawbacks that can't be resolved via continuous innovations in the technology. If we gain performance through upscaling or other 'fake' means, what difference does it make? DLSS 3 has issues, but what if those issues were resolved? Would you still consider it to be 'fake', or an inferior version of the real thing?
 
DLSS is not the same as Frame Generation, though. DLSS outputs near-native resolution; it's not new, it's a long-established upscaling approach that modern hardware makes even more effective.

Frame Generation, OTOH, is completely made-up data that makes the fps figure look bigger without actually improving the game, beyond reducing the time between natively rendered frames so it appears smoother visually. As far as I'm aware it does nothing to bring input latency and the like in line with what you'd get at that actual rendered fps. In fact, input latency is actually increased by using it.
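To illustrate the latency point with a deliberately crude model (the numbers and the model itself are assumptions, not measurements): if the generated frame is interpolated between two real frames, the newest real frame has to be held back until the next one exists before anything can be shown.

```python
# Back-of-the-envelope sketch of why interpolation-style frame generation can
# double the fps counter while adding input latency. Simplified assumption:
# real frame N is held until frame N+1 exists, plus a small generation cost.

def with_frame_gen(native_fps: float, gen_overhead_ms: float = 1.0):
    """Return (displayed fps, extra latency in ms) under this naive model."""
    native_frame_ms = 1000.0 / native_fps
    displayed_fps = 2 * native_fps                        # the counter doubles
    added_latency_ms = native_frame_ms + gen_overhead_ms  # but frames arrive later
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = with_frame_gen(fps)
    print(f"native {fps:>3} fps -> counter shows ~{shown:.0f} fps, ~{extra:.1f} ms added latency")
```

Real pipelines are more complicated (Reflex claws some of it back, and the hold time isn't exactly one full frame), but the gap between what the counter says and how the game responds is the point.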
 
Sorry, but to me, whilst DLSS is good at allowing older cards to stay usable, it also deceives people into thinking a newer graphics card can perform better than it actually can.

nVidia, and I believe AMD, have been marketing cards as "4K Triple A Ultra Quality Gaming Capable" when they are only capable of doing it with DLSS/FSR, so they're not really "Capable" of it. Just like the 3090 (I think it was) being marketed as "The world's first 8K Gaming GPU", but only when rendering a potato-quality game at 4K natively and upscaling to 8K, because as soon as you render at 8K natively you drop to single-digit FPS.

That to me is "Fake Performance", because if the card cannot render the game at 4K natively then it shouldn't be marketed as such, but they do get marketed that way because of DLSS/FSR.
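As a rough sanity check on the 8K claim (with the naive assumption that performance scales with pixel count, which real games don't do exactly):

```python
# Quick pixel-count sanity check on "8K gaming" marketing.
# Naive assumption: frame rate scales inversely with pixels rendered.
RES = {"4K": (3840, 2160), "8K": (7680, 4320)}

px_4k = RES["4K"][0] * RES["4K"][1]
px_8k = RES["8K"][0] * RES["8K"][1]
print(f"8K pushes {px_8k / px_4k:.0f}x the pixels of 4K")  # -> 4x

native_4k_fps = 35  # illustrative figure for a heavy AAA title at native 4K
print(f"~{native_4k_fps} fps at native 4K -> ~{native_4k_fps * px_4k / px_8k:.0f} fps at native 8K")
```

Four times the pixels with no extra silicon lands you right around that single-digit FPS territory before any upscaling gets involved.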

Then you have "Frame Gen"; well, NBD said why that is a problem.

I know the likes of nVidia, AMD and Intel cannot just keep "adding cores", but they also cannot keep leaning on "upscaling will solve everything" while increasing pricing, giving us effectively lower-performing cards for two or three times the money thanks to "Upscaling".

I just don't think people should be happy about having to rely on DLSS/FSR to make a game playable, especially on "High End" cards.
 
You still need raw power to make DLSS work at its best. Not everyone is content with performance-mode DLSS, where graphical immersion is sacrificed just to get the average fps count up.
 
Yep. I used DLSS as a broad description just like Nvidia have done. Which is kind of annoying really because like you said they're different methods. But I used both in the same context because they both push frame rates higher, which is the whole point of bigger and faster architectures. And that's my point. However companies push frame rates, I'm down for that. If frame generation has its downsides, and it does, maybe that can be improved upon in future generations? This to me is the same as any other form of progression, 'fake' or 'real'. Like DLSS when it first dropped, frame generation could improve a lot over the next few years.


Deception is down to marketing and consumer ignorance, not technological advancements. I totally understand your frustration with the marketing of it; I agree with you. But I still believe a brighter future is in optimising upscaling and frame generation technologies over brute forcing cores and die sizes. That's really what I was trying to say.

I wouldn't be happy to rely solely on DLSS/FSR/XeSS or frame generation of any kind either—but we're not. The standard 'core for core' performance uptick of the 4000 series is well in line with previous generations. The issue is the pricing and the marketing. Again, that's not architectural. DLSS, frame gen, they are on top of an improved performance metric. The issue with Nvidia's 4000 series is the pricing, VRAM capacity and bandwidth, and marketing; it's not raw performance or upscaling/frame gen.
 
My frustration is down to them marketing cards as being capable of more than they are. It's deception, even if it's mainly fallen for by those who lack the understanding, or the willingness, to check the marketing claims.

Frankly, if there were any other manufacturer in the industry, I wouldn't use AMD, Intel or nVidia because of it.

My problem is also game devs relying on these features instead of making the games run smoothly.

It's one of the reasons I gave up on Elite Dangerous: a 3080Ti getting less than 30fps on a planet surface because Frontier refuse to fix the performance issues and instead want you to use the "Upscaler".

I get your point though: if nVidia, AMD and Intel, together with game devs, actually worked on these types of features and advertised the cards and games as such, so as not to intentionally deceive people, then they would be received better!
 
Nvidia marketing is like a circumcision.

“We’re just going to snip it a bit so it looks bigger”
 
The point is that it isn't actually higher. It's just a marketing technology; it has nothing to do with performance other than making it worse. Generating fake frames does not improve anything, which makes half your point weak. Sure, it can be improved, but until the Tensor cores can push frames out as fast as the regular rasterization cores, it's purely marketing. They have a long way to go.
 
Right now, no, I wouldn't use 'fake frames'. Right now, yes, it's mostly for marketing reasons. But again, I don't want Nvidia or others to stop working on projects like this. I want them to continue to improve them to see if they are genuinely viable as an aid to rasterization for those that would benefit from it.
 
I did a test a few weeks back with a friend of mine. He is quite the PC novice, but he's been on various PC gaming social media pages long enough to get sucked into the cult that irrationally hates frame generation, or "fAkE fRaMeS".

He came round and really wanted to play Spiderman Remastered, as his rig isn't up to the task at decent settings. He was really enjoying himself, but I had frame generation disabled.

He went to the toilet and I enabled frame generation. When he came back I said to him -

"I applied a preset overclock I have to the card, should run a little smoother now."

He sat down, resumed playing and said -

"Damn, this is smoother. Didn't know overclocking your card could make such a big difference."

I haven't told him yet.... :D
 
Hey, there you go! :D I think more blind tests are in order to keep ourselves honest. Audio nerds occasionally do it. They'll go to a guitar shop, pick up a Korean PRS for £1000 and a USA-made PRS for £4000, and blind test them. Invariably they walk away humbled and reassess their obsession with buying a £4k USA PRS.
 