RTX 5080 3DMark Benchmark Scores Leak

WYP

News Guru

3DMark Time Spy scores for Nvidia's RTX 5080 GPU have leaked.


Read more about Nvidia's leaked RTX 5080 benchmark scores.
 
Seeing as you're unlikely to find a 5080 at $1,000, the performance uplift over the 4080 Super is pretty bad if these tests are indicative of gaming benchmarks. And if you managed to get a 4090 in the early days at its original price, the 5080 at its likely real-world launch price will offer the same performance per dollar.
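To put rough numbers on that performance-per-dollar point, here's a quick back-of-the-envelope sketch; every figure in it is a made-up placeholder for illustration, not a leaked score or a real price:

```python
# Back-of-the-envelope performance-per-dollar comparison.
# Every number below is a made-up placeholder, not a leaked score or real price.

def perf_per_dollar(score: float, price: float) -> float:
    """Performance points per dollar spent."""
    return score / price

# Hypothetical figures: a 4090 bought early at MSRP vs a 5080 at an
# inflated street price.
rtx_4090_score, rtx_4090_price = 100.0, 1600.0
rtx_5080_score, rtx_5080_price = 85.0, 1350.0

print(f"4090: {perf_per_dollar(rtx_4090_score, rtx_4090_price):.4f} pts/$")
print(f"5080: {perf_per_dollar(rtx_5080_score, rtx_5080_price):.4f} pts/$")
# With numbers in this ballpark, the two cards land at almost the same
# performance per dollar, which is the point being made above.
```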
 
I think the shader count increase is incremental across the board, which is why the 5090 in raster is also pretty poor compared to the 4090. Whatever % of uplift you get, you pay for it.

Nothing surprises me anymore, tbh. This is what happens when one company competes with itself. It will be 5% next gen, like when Intel used to dominate the CPU space, only for more and more money each time.
 

In fairness, Intel weren't just deliberately holding back to milk consumers, though there probably was some of that going on. They were failing at silicon manufacturing and chip design. Nvidia aren't. They are still making incredible devices. I don't mind these performance uplifts, especially given they're on roughly the same node. What I resent is the VRAM capacity and the prices people are expected to pay for such modest uplifts. DLSS 4 and all these new optional extras should be just that: optional extras that enhance rather than replace. That's how I see it.
 
It's the same silicon on what is now an older node, and that means more profit. These aren't decent value at all. The 5090 is a better design, but outside of that, if you were on a 4090, there is no real gain that would make you need to upgrade.

Sure, upgrade if you are on a 20 series, but if you own a 3090, yet again: wait.

That's why Nvidia lock software features to generations, to try to get you to upgrade for zero reason.

So that 20% from a 4080 to a 5080 in actual game fps, at exactly the same settings and not with 4x frame gen, isn't going to be 20%; maybe 15%.
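To illustrate why the 4x frame-gen figures aren't comparable to a raw same-settings uplift, here's a toy calculation; all the numbers are invented for illustration only:

```python
# Toy illustration of why "4x frame gen" fps isn't comparable to a raw
# same-settings uplift. All figures are invented for illustration only.

base_fps_4080 = 60.0   # hypothetical native fps on a 4080
raw_uplift = 0.15      # the ~15% same-settings uplift suggested above

native_fps_5080 = base_fps_4080 * (1 + raw_uplift)

# With 4x multi-frame generation, three generated frames are inserted per
# rendered frame, so the presented frame rate is roughly quadrupled even
# though only native_fps_5080 frames per second are actually rendered.
presented_fps_5080 = native_fps_5080 * 4

print(f"4080 native:      {base_fps_4080:.0f} fps")
print(f"5080 native:      {native_fps_5080:.0f} fps (+{raw_uplift:.0%})")
print(f"5080 with 4x MFG: {presented_fps_5080:.0f} fps (mostly generated frames)")
```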

The 9070 XT is of interest depending on price, but tbh my general feeling is that the whole generation is a skip.

The issue here is silicon nodes: they are hitting the brick wall, and in time the prices must, and I do mean must, come down.

The reason is that unless new nodes using new materials are created, they have no means to progress further outside of software.

That is fact, and if they try to justify otherwise, then when that wall is hit it'll drastically crash the market.

Given that Nvidia makes up nearly the whole value of the market, you can all but guarantee that the markets are soon to crash, and that is why the whole market and inflation are at an all-time high: because it's going to go pop.

Very, very simple to understand, hence why they are prepping hard for AI and digital currency, because when the market crashes, harder even than it should have in 2008, it's going to be chaos.

That's why you have resource wars and them pumping the markets to bleed them, and yeah, that huge investment into AI, underground supercomputers and power, to get the systems in place for the collapse.

That is fact, Jack.

So nah, soon there won't be anything to upgrade to, unless they manage to find a way to make bigger dies, more chiplets, or new materials.

So the technology is getting old, and atm, outside of graphene, I don't see much going on beyond silicon, because it's pretty much at its limits.
 
What are the chances of that: I then start catching up on my YT and Nvidia's stock has dropped a lot. You'd all best be more aware of why, because I've been calling March of this year for over a year now, for many reasons, and be sure I've got my facts right, because I know a few folks in some strange places.

Definitely not what I wanted to be seeing this morning 😕

Sometimes when you're right, what you actually want is to be completely wrong 💯
 
In fairness, Intel weren't just deliberately holding back to milk consumers, though there probably was some of that going on. They were failing at silicon manufacturing and chip design. Nvidia aren't. They are still making incredible devices. I don't mind these performance uplifts, especially given they're on roughly the same node. What I resent is the VRAM capacity and the prices people are expected to pay for such modest uplifts. DLSS 4 and all these new optional extras should be just that: optional extras that enhance rather than replace. That's how I see it.

Intel were failing to shrink. That was literally it. Bear in mind you could get Haswell CPUs with 18 cores, yet they were selling two cores with HT for £185.

Same goes for Nvidia. The 80 Super cards last gen were literally made just to keep the prices the same as the older 80 cards, with barely any uplift at all, and there was no 90 Super.

It will just get worse. BTW, this stagnation in the GPU market is why they are leaving you teetering on VRAM. When GPUs stagnate, so do games, as devs can no longer rely on brute force to push them. So Nvidia need a way to get you coming back in two years rather than holding onto your GPU for five, and skimping on VRAM is the best way to do that.

I sometimes wonder if that is why they sponsor and support game devs: so that they know what is coming, which helps with their marketing and with knowing how to screw us over.

BTW, also note how in 8 years AMD have stayed on the same core counts since day one. TR was more of a HEDT thing back then than a server chip, so yeah. They have gone higher, but look at what that costs now. I think you will start to see the CPU space stagnate soon. If Intel flop again? Yeah, expect to be paying £500 for an 8-core CPU.
 
That's really quite pathetic. It seems that Nvidia has hit the limit of what their architecture can really do, and they're now dependent on higher TDP and gimmicks like frame generation.
 

They could do better, but they are now an AI company, so everything is going in that direction, sadly.
 

IDK how long many people have been on here. I know you have been here for years.

I said, cor, about 13 years ago, that Nvidia were struggling badly. Not on gaming GPUs. The problem was that gaming GPUs don't make them enough money, nor could they grow them as a company. Look at all of the things they tried.

PhysX. Failed.
DX10. Failed.
DX11. Failed (and what I mean by failed is them trying to grow their market).
Their mobile chip. Brain fart, forget the name. Tegra? Failed.
G-Sync. Failed. AMD made sure of that (and by failed I mean AMD released the same sort of thing for free, without their rip-off module).
Tried to buy numerous companies. Failed.

And so on. So many times they have bought things (like SLI) and made absolutely nothing from them, or they died out, etc.

As soon as they became a shareholder company they needed to grow. Shareholders want to see growth, and gaming GPUs are a one-trick pony.

So as I said before, RTX is all smoke and mirrors. It was never about ray tracing AT ALL. That is just the pipe dream they sold to gamers so they could charge much more for something people don't need (see the things above) whilst investing that money into the real thing they developed those cores for: AI. They did not just wake up one morning and go "Oooh, we can do AI with these tensor cores!". They knew all along what those cores were going to be for.

How long has RT been a thing now? Ages, right? How many killer apps do we have for RT? One, lmfao: Cyberpunk. That is literally it, and it is as old as the hills.
 