AMD's next-gen RDNA 3 GPUs will be more power hungry, expect Nvidia's to be worse

This goes to show two important things:

1st: Nvidia and AMD built very efficient GPUs in the past, so miners would see these GPUs as really profitable in terms of mining calculations per watt of power used. This took all those RTX 3060/Ti and 3070/Ti cards to the top of mining profitability, along with AMD's 6600/XT and even 6700/XT.
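The "profit per watt" metric miners used can be sketched in a few lines. This is a minimal illustration of how cards get ranked by mining efficiency; the hashrate and power figures below are made-up placeholder numbers, not actual measurements for these cards:

```python
# Rank GPUs by mining efficiency (hashrate per watt).
# The hashrate (MH/s) and power (W) figures are illustrative
# assumptions only, not real benchmark data.

cards = {
    "RTX 3060 Ti": {"mhs": 60, "watts": 120},
    "RX 6600 XT":  {"mhs": 32, "watts": 65},
    "RTX 3090":    {"mhs": 120, "watts": 300},
}

def efficiency(card):
    """Mining efficiency in MH/s per watt of board power."""
    return card["mhs"] / card["watts"]

# Sort from most to least efficient -- the ordering miners care about.
ranked = sorted(cards, key=lambda name: efficiency(cards[name]), reverse=True)
for name in ranked:
    print(f"{name}: {efficiency(cards[name]):.2f} MH/s per W")
```

The point is that mining profitability rewards performance per watt, not raw performance, which is exactly the opposite of what the new flagship cards are optimised for.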

2nd: There is a point in the technology where you cannot increase performance while also reducing power consumption. In the past this was done to favour miners, but now that GPU mining is almost dead, there is no incentive for efficiency, just for raw performance, which is, in the end, what most gamers want.

This is why all the reports about power consumption point to less efficiency and more performance for both AMD and NVIDIA. They just don't care about miners anymore and have decided to push the hardware to its limits at the expense of higher power consumption.
 

Fancy ourselves conspiracy theorists, aren't we?

1st: Unless you're an Alaskan gamer, I'm pretty sure you'd rather your PC components not heat your room above equatorial summer temperatures every time you want to play a game. Not to mention that even with big heatsinks, localized heat still stresses solder joints and causes premature component deaths from cracked solder. Also, remember the exploding 3090s? Yeah, that was a sight to behold, and expect it again with the RTX 40 series, probably affecting more SKUs. Oh, and the new power supply, and that electricity bill?
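The electricity-bill point is easy to put a rough number on. A quick sketch using illustrative assumptions (a hypothetical 450 W next-gen card versus a 250 W card, 3 hours of gaming a day, $0.15/kWh; your wattage, habits, and rates will differ):

```python
# Rough monthly electricity cost of GPU gaming.
# All figures (wattage, hours, price per kWh) are illustrative
# assumptions, not measurements or real tariffs.

def monthly_gpu_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Estimate the monthly electricity cost of a GPU at a given draw."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

high = monthly_gpu_cost(450, 3, 0.15)  # hypothetical 450 W card
low = monthly_gpu_cost(250, 3, 0.15)   # hypothetical 250 W card
print(f"450 W card: ${high:.2f}/month, 250 W card: ${low:.2f}/month")
```

Not ruinous on its own, but the gap scales directly with the power draw, and that's before counting the extra watts the rest of the system and your air conditioning burn to deal with the heat.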

2nd: This is not new, and it was not caused by mining... GPU mining was not a big thing before the current generation; they couldn't just predict it would boom because they made efficient GPUs. They could hope so, but not predict it... Efficient GPUs have been a thing for ages, and sometimes that warrants exclusive products, like the GTX 750 Ti that NVIDIA launched literally a year before its architecture was officially released (it used a more modern architecture than any other GPU in the 700 lineup). Why? Well...

If you do a quick Google search for the GTX 480, you'll see many complaints about heat and high power usage. It was just another moment where NVIDIA fell behind AMD and couldn't do anything aside from increasing power usage and bumping up everything they could. Look at the GTX 200 series: some of those GPUs had a 512-bit memory bus and still couldn't beat AMD's offerings. The GTX 400 series was NVIDIA's answer, and it launched early and incredibly power hungry by the standards of the time.

After mitigating the power issues in the later GTX 500 series and finally fixing them with the 600 series, NVIDIA said their top-of-the-line card would never exceed 250W... Well. See how that's working out for them?

The point is, they've been working on power efficiency since the GTX 400 fiasco (actually since before, but much harder after the GTX 400 series) because those cards died, and oh how they died; look on eBay and see how many GTX 480s you can find. Expect the same with the RTX 40 series and also the 3090. This is not a stable solution, nor a long-term one. NVIDIA simply could not finalise their chiplet architecture in time to compete against AMD; they felt the pressure in the 30 series and had already raised the power draw to combat it. The 40 series will be worse because it's pretty much the same story, but do expect GPUs in the RTX 50 range to come with much lower power draw than expected, and by the RTX 60 series they will probably be remarkably efficient.

But yes, to continue pushing the envelope and developing ever more realistic games, we have already resorted to all sorts of software trickery: temporal AA, GI, reflections, etc. And no, I don't expect to see an RTX 6090 with a 250W TDP. The way GPUs need to be designed to deliver the performance we expect, including 4K and 144Hz+ refresh rates, all while pushing graphical fidelity to ever greater heights, is taking its toll, driving up both the power draw and the cost of PC components. That's a big part of why many companies are investing so much in cloud gaming: it's getting ever more prohibitive to buy hardware, including consoles, so instead, if a big company takes on the upfront investment and we just pay for the right to use it, that's a solution. But do expect the unbelievable power draw of current and next-generation GPUs to slow down and drop quite a lot in the next few years.

So basically this is a cycle. NVIDIA would much rather keep pushing smaller but still meaningful improvements every year; it's less costly for them and for their image, and, of course, it avoids exploding and early-dying products. But AMD was losing the game and NVIDIA had to do something about it. AMD disrupted the market so much that consoles basically caught up with PCs, and NVIDIA couldn't have that: they didn't supply the chips in those consoles, so they stood to lose a lot of customers. This is it, a desperate attempt to remain relevant, one that I hope will pass. And who knows, maybe it takes a few years and AMD comes out on top of NVIDIA for a while, but they'll eventually figure it out, and we will ultimately benefit from their clash.
 