RDNA 3 GPUs will not use the 12VHPWR power connector - AMD confirms

This whole thing could be solved if the cables were properly built and flexible. The ones you get in the 3090Ti/4000 series boxes are badly designed.
 
Insane to have 600 watt cards to begin with.

I personally like that Nvidia have pushed the envelope that high. It gives consumers a choice. The 4090 can be quite efficient if you're willing to spend a bit of time with it. I've said this before, but overclocking is kinda dead now; it's all about undervolting. Companies are pushing their cards way past their sweet spot, but that doesn't mean you're stuck with that. We can bring them back into that sweet spot if we want to. I mean, Ada Lovelace is going to be in mobile devices as well. That tells you that 600W is just a bragging-rights claim. Many have been running 4090/12900K systems on 850W PSUs by spending 10 minutes adjusting a few sliders.
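As a rough sanity check on the 850W claim, a power-limited build can be budgeted like this. Note that all the component wattages below are illustrative assumptions on my part, not measured figures:

```python
# Rough PSU budget for a power-limited 4090 + 12900K build.
# All wattages are illustrative assumptions, not measured values.
components = {
    "RTX 4090 (power-limited to ~70%)": 320,  # down from a 450 W stock limit
    "i9-12900K (gaming load)": 150,
    "motherboard, RAM, SSDs": 60,
    "fans, pump, peripherals": 30,
}

total = sum(components.values())
psu_watts = 850

print(f"Estimated draw: {total} W of {psu_watts} W "
      f"({total / psu_watts:.0%} load)")
```

With those assumed numbers the system sits comfortably under the usual ~80% sustained-load guideline for a PSU, which is why the "850W is enough" claim is plausible once the card is reined in.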
 
Well, yes, they are pushing them past their limits, but if the goal is to undervolt and hold stable lower clocks, that reduces performance, when people are buying these cards precisely for performance "over" the sweet spot.

Catch 22. Some cards might reach those clocks at low voltage. Most won't.
 

It's the same with cars: you wouldn't buy a supercar just to lower its performance. At that point, you can just buy the lower-tier car from the same manufacturer for cheaper lol.
 

In the vast majority of cases, undervolting/underclocking GPUs only gives a small performance hit of a couple percent.

I undervolted my 3090Ti and saw a power drop of 100W but a perf drop of maybe 2-3 FPS.
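Plugging in numbers like those shows why the trade is so lopsided. The stock baseline figures here (~450 W draw, ~100 FPS) are my own assumptions, not the poster's measurements; only the -100 W / -3 FPS deltas come from the post above:

```python
# Perf-per-watt before and after an undervolt, using assumed
# baseline figures (~450 W stock, ~100 FPS) plus the quoted deltas.
stock_watts, stock_fps = 450, 100
uv_watts, uv_fps = stock_watts - 100, stock_fps - 3  # -100 W, -3 FPS

stock_eff = stock_fps / stock_watts   # FPS per watt at stock
uv_eff = uv_fps / uv_watts            # FPS per watt undervolted

print(f"Stock:      {stock_eff:.3f} FPS/W")
print(f"Undervolt:  {uv_eff:.3f} FPS/W "
      f"(+{uv_eff / stock_eff - 1:.0%} efficiency, "
      f"-{1 - uv_fps / stock_fps:.0%} performance)")
```

Under those assumptions you gain roughly a quarter more FPS per watt for a ~3% frame-rate loss.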
 

Yeah this. Remember, they are wringing the card's neck to get every single FPS out of it. So even that 2-3 matters to Nvidia in benchmarks.
 

I was more referring to his text about overclocking the cards and them being at the limit, etc.
 

They are not at their absolute limits, no. That is what the 600W is for. However, as usual you need to think about whether it is worth it. And it is not.

At stock my 2080Ti (the Kingpin) uses little power. However, it has three 8-pin connectors for a reason: once you start going mad, and/or shunt mod it, it can use a lot more power. The problem is that for every 50MHz more you ask of it, it will guzzle more and more power, because the design has gone out of the window. I.e., for around 400MHz more than stock under LN2, my 2080Ti will consume over 500W.
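A common first-order explanation for why power ramps up so fast past the sweet spot is the dynamic-power relation P ∝ f·V²: every extra clock step also demands more voltage, so power grows much faster than frequency. A toy sketch of that effect (the voltage/frequency points and the constant k are made-up illustrative values, not real 2080 Ti telemetry):

```python
# Toy dynamic-power model: P = k * f * V^2.
# Frequency in MHz, voltage in volts; k chosen so "stock" lands near 300 W.
# These V/f points are illustrative, not real 2080 Ti measurements.
def power(f_mhz, volts, k=0.185):
    return k * f_mhz * volts ** 2

stock = power(2000, 0.90)      # stock-ish operating point
pushed = power(2400, 1.15)     # +20% clock needs a big voltage bump

print(f"stock:  {stock:.0f} W")
print(f"pushed: {pushed:.0f} W "
      f"({pushed / stock - 1:+.0%} power for +20% clock)")
```

In this toy model, a 20% clock increase that needs a large voltage bump roughly doubles power draw, which is the same shape as the "400MHz more costs 500W+" anecdote above.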

No company would be stupid enough to push things to their utter limits... Would they? Well, yeah, they have before. A lot of 30-series cards were running 50MHz too high and were crashing. The fix was a new firmware that reduced the stock boost clocks. They were never in any real danger, though; any company doing that would be buried in RMAs. But once again, that is the sort of cheating you get when competition is around. I.e., don't ever expect your card to be as good as what you see in a review, because you can bet that Nvidia/AMD bent all of the rules for those reviews.

Like with any other computer product, overclocking is no more. TBH? I do understand why. Those companies were giving away free performance for the touch of a few buttons. Why not do it yourself, look better in sales benchmarks, and charge for it?

So these days the tweaking is not about more performance; they have already done that. The tweaks are about finding out how much less power you can make it use. And that will depend on bins and everything overclocking used to depend on: chip quality, etc.
 
Sorry, I thought he had written something along the lines of them being at their limit, when he actually said "Companies are pushing their cards way past their sweet spot." So that's what I meant to say, not "that they are at their limit etc".
 
It's the same with cars: you wouldn't buy a supercar just to lower its performance. At that point, you can just buy the lower-tier car from the same manufacturer for cheaper lol.

Yeah, that's true. But that's not quite what I meant.

A 4090 doesn't have to draw as much power while performing effectively the same.

If you bought a supercar with 600hp and found out that it could do 580hp at a 20% reduction in fuel by changing a setting in the car's computer and swapping the second turbo's air filter, many enthusiasts would do it. In fact, some garages would offer it as an aftermarket option.

I'd like to see AIB partners do that: create smaller, more efficient versions of graphics cards. They wouldn't sell, of course, because... graphz. But that's what I'd like to see.
 

Both my 3070 and 3080 12GB are running overclocked and undervolted: less heat, less noise, and a huge drop in power consumption.

You certainly don't notice any loss in performance from either.
 
My 3070Ti, at about 50MHz less clock than auto, dropped 10°C in temps and, according to software, saved me 50 watts in power draw. Performance is identical.

Big deal in an SFF build with limited airflow and large PSU options (at the time of build).
 