70% performance gains over Nvidia's RTX 4090 are expected.

Read more about Nvidia's alleged RTX 5090 specifications.

All aboard the hype train, choo choo!! 70% uplift in one generation? So is that Nvidia throwing cores at it as a solution? Have they found a nicer architecture? Or is it that the 20- through 40-series were pretty crap, and they've decided to let the 'A' design team have a play on the gamer designs this gen as a reward for the stellar enterprise work they've done?
Well, we've got roughly a year and a few months until they release the new cards, and a lot can happen in that time.
From what I've read, they're using advanced AI to help design GPU architectures, which should let them get the most optimized designs.
Advanced AI?? AI needs to be led, it needs to be trained. Here you go AI, design me a future-state GPU - go to the reddit/nvidia forum for specs...
Edit - I'm sorry, that sounded harsh, but I'm seeing AI tagged onto anything and everything. My initial premise still holds though - 70% is a big lift, but where? In raster, in RT? At what resolution?
Nvidia's AI is already fault-finding code and developing its own code to be more efficient. If it's 70% because of AI coding, I would actually believe it. AI has been advancing rapidly over the last two years, maybe more than I would like.
I wonder how much of that 70% comes from DLSS and Frame Gen; after all, nVidia, AMD and Intel these days are only interested in using fake frames to sell cards, not native rendering.
You may be speaking in hyperbole, but these companies can't keep cramming cores into tiny spaces. They have to use intelligent means to push performance. A frame is already a 'fake' frame. It's just digital information. I want FSR and DLSS and XeSS to succeed. I see no drawbacks that can't be resolved via continuous innovations in the technology. If we gain performance through upscaling or other 'fake' means, what difference does it make? DLSS 3 has issues, but what if those issues were resolved? Would you still consider it to be 'fake', or an inferior version of the real thing?
Sorry, but to me, whilst DLSS is good for allowing older cards to stay usable, it also deceives people into thinking a newer graphics card can perform better than it actually can.
DLSS is not the same as Frame Generation though. DLSS upscales a lower render resolution to near-native quality; the idea isn't new, it's a long-standing method that modern technology has made much more effective.
Frame Generation, on the other hand, is completely made-up data. It makes the fps number look bigger, but it doesn't actually improve the game beyond reducing the time between natively rendered frames, so it looks smoother visually while, as far as I'm aware, doing nothing for input latency compared to what you'd get at that actual rendered fps. In fact, input latency actually increases when you use it.
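To put rough numbers on that: here's a purely illustrative back-of-envelope sketch (the function name and the simple model are mine, not anything measured or taken from Nvidia's documentation). It assumes one interpolated frame per native frame, input sampled only on native frames, and the next native frame being held back so there's something to interpolate against.

```python
# Illustrative sketch only: a simple model of frame generation that interpolates
# one extra frame between every pair of natively rendered frames.
def frame_gen_estimate(native_fps: float) -> dict:
    native_frame_time_ms = 1000.0 / native_fps

    displayed_fps = native_fps * 2        # one generated frame per native frame
    input_sample_rate_hz = native_fps     # input is still read only on native frames
    # Interpolation needs the *next* native frame before anything can be shown,
    # so latency grows by roughly one native frame time (ignoring other overhead).
    approx_added_latency_ms = native_frame_time_ms

    return {
        "displayed_fps": displayed_fps,
        "input_sample_rate_hz": input_sample_rate_hz,
        "approx_added_latency_ms": round(approx_added_latency_ms, 1),
    }

print(frame_gen_estimate(60))
# {'displayed_fps': 120, 'input_sample_rate_hz': 60, 'approx_added_latency_ms': 16.7}
```

In other words, under this toy model the counter doubles but the game still responds at the native rate, plus roughly an extra frame of delay.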
nVidia, and I believe AMD, have been marketing cards as "4K Triple-A Ultra Quality Gaming Capable" when they can only manage it with DLSS/FSR, so they're not really "capable" of doing it. It's just like the 3090 (was it?) being sold as "The world's first 8K Gaming GPU", but only when rendering a potato-quality game at 4K natively and upscaling to 8K - because 8K is four times the pixel count of 4K, as soon as you render at 8K natively you drop to single-digit FPS.
That, to me, is "fake performance": if the card cannot render the game at 4K natively, then it shouldn't be marketed as such, but they do get marketed that way because of DLSS/FSR.
Then you have "Frame Gen"; well, NBD has already said why that is a problem.
I know that the likes of nVidia, AMD and Intel cannot just keep "adding cores", but they also cannot keep leaning on "upscaling will solve everything" while increasing pricing, effectively giving us lower-performing cards for two or three times the money thanks to "upscaling".
I just don't think people should be happy about having to rely on DLSS/FSR to make a game playable, especially on "high-end" cards.
Deception is down to marketing and consumer ignorance, not technological advancements. I totally understand your frustration with the marketing of it; I agree with you. But I still believe a brighter future is in optimising upscaling and frame generation technologies over brute forcing cores and die sizes. That's really what I was trying to say.
I wouldn't be happy to rely solely on DLSS/FSR/XeSS or frame generation of any kind either—but we're not. The standard 'core for core' performance uptick of the 4000 series is well in line with previous generations. The issue is the pricing and the marketing. Again, that's not architectural. DLSS, frame gen, they are on top of an improved performance metric. The issue with Nvidia's 4000 series is the pricing, VRAM capacity and bandwidth, and marketing; it's not raw performance or upscaling/frame gen.
Yep. I used DLSS as a broad description, just like Nvidia have done. Which is kind of annoying really, because like you said they're different methods. But I used both in the same context because they both push frame rates higher, which is the whole point of bigger and faster architectures. And that's my point. However companies push frame rates higher, I'm down for that. If frame generation has its downsides, and it does, maybe that can be improved upon in future generations? To me this is the same as any other form of progression, 'fake' or 'real'. Like DLSS when it first dropped, frame generation could improve a lot over the next few years.
The point is that it isn't actually higher. It's just a marketing technology; it has nothing to do with performance other than making it worse. Generating fake frames does not improve anything, which makes half your point weak. Sure, it can be improved, but until their Tensor cores can push frames out as fast as the regular rasterization cores, it's purely marketing. They have a long way to go.
Right now, no, I wouldn't use 'fake frames'. Right now, yes, it's mostly for marketing reasons. But again, I don't want Nvidia or others to stop working on projects like this. I want them to continue to improve them to see if they are genuinely viable as an aid to rasterization for those that would benefit from it.
I did a test a few weeks back with a friend of mine. He's quite the PC novice, but he's been on various PC gaming social media pages long enough to get sucked into the cult that irrationally hates frame generation, or "fAkE fRaMeS".
He came round and really wanted to play Spider-Man Remastered, as his rig isn't up to the task at decent settings. He was really enjoying himself, but I had frame generation disabled.
He went to the toilet and I enabled frame generation. When he came back I said to him -
"I applied a preset overclock I have to the card, should run a little smoother now."
He sat down, resumed playing and said -
"Damn, this is smoother. Didn't know overclocking your card could make such a big difference."
I haven't told him yet....