I was more referring to his text about overclocking the cards and the claim that they are at the limit, etc.
They are not at their absolute limits, no. That is what the 600 W connector is for. However, as usual, you need to ask whether it is worth it. And it is not.
At stock my 2080 Ti (the Kingpin) uses little power. However, it has three 8-pin connectors for a reason: once you start going mad, and/or shunt mod it, it can draw a lot more. The problem is that every extra 50 MHz you ask of it guzzles disproportionately more power, because you are past the design point. For example, at roughly 400 MHz over stock under LN2 my 2080 Ti will pull over 500 W.
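The reason power balloons like that is the usual dynamic-power relation: draw scales roughly with clock times voltage squared, and every clock step past the efficiency knee needs a disproportionate voltage bump. A minimal sketch, with all the numbers below being illustrative rather than measured 2080 Ti values:

```python
# Rough model of why power guzzles past the design point:
# dynamic power scales roughly as P ~ C * V^2 * f.
# The voltage/clock pairs here are made up for illustration.

def dynamic_power(base_power_w, base_clock_mhz, base_v, clock_mhz, v):
    """Scale a known (power, clock, voltage) point by P ~ f * V^2."""
    return base_power_w * (clock_mhz / base_clock_mhz) * (v / base_v) ** 2

# Hypothetical stock point: 260 W at 1950 MHz on 1.050 V.
points = [
    (1950, 1.050),  # stock
    (2100, 1.093),  # mild OC, modest voltage bump
    (2250, 1.200),  # aggressive OC, big voltage bump
]
for clock, volts in points:
    watts = dynamic_power(260, 1950, 1.050, clock, volts)
    print(f"{clock} MHz @ {volts:.3f} V -> ~{watts:.0f} W")
```

Note how the last 150 MHz costs far more than the first 150 MHz: the clock term is linear, but the voltage needed to hold those clocks stable enters squared.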
No company would be stupid enough to push things to their utter, utter limits... would they? Well yeah, they have before. A lot of 30-series cards were running 50 MHz too high and crashing; the fix was new firmware that reduced the stock boost clocks. They were never in any real danger, though. Any company doing that for long would be buried in RMAs. But once again, that is the sort of cheating you get when competition is around. In other words, don't ever expect your card to be quite as good as what you see in a review, because you can bet that Nvidia/AMD bent all of the rules for those reviews.
Like any other computer product, overclocking headroom is no more. TBH? I do understand why. Those companies were giving away free performance for the touch of a few buttons. Why not do it yourself at the factory, look better in sales benchmarks and charge for it?
So these days the tweaking is not about more performance. They have already done that for you. The tweak is finding out how much less power you can make it use for the same clocks. And that will depend on bins and everything overclocking used to depend on: chip quality, etc.
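That hunt for the lowest stable voltage is just a downward search: hold the stock clock, drop the voltage a step, stress test, repeat until it falls over. A minimal sketch of that loop, where `is_stable` is a hypothetical stand-in for actually running your stress tool and watching for crashes or artifacts (in practice you do this by hand in something like MSI Afterburner):

```python
# Sketch of a modern undervolting loop: step the voltage down from the
# stock value until the (hypothetical) stability check fails, then keep
# the last voltage that passed. Millivolt values are illustrative only.

def lowest_stable_mv(lo_mv, hi_mv, is_stable, step_mv=5):
    """Return the lowest voltage in [lo_mv, hi_mv] that passes is_stable,
    stepping down from hi_mv (assumed stable at stock)."""
    best = hi_mv
    mv = hi_mv - step_mv
    while mv >= lo_mv:
        if not is_stable(mv):
            break  # first failure: stop, keep the last good voltage
        best = mv
        mv -= step_mv
    return best

# Toy example: pretend this particular bin is stable down to 875 mV.
print(lowest_stable_mv(800, 1050, lambda mv: mv >= 875))  # -> 875
```

How far down you get is exactly the bin lottery the post describes: a good chip holds stock clocks at a much lower voltage than a mediocre one.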