1,000W PSUs are recommended for overclocked RTX 3090 Ti models

Yeah, as soon as I saw the Kingpin using two of those dumb Intel connectors, I figured stuff was going to get stupid.

Ahh, technology. Dragging us 9 years backward.
 
Actually, no. With every single GPU generation you could hear "Nvidia locked the power of these cards." The 3080 and lower cards exist, they are plenty for most people, and they have reasonable power consumption.

Flagships should always be balls to the wall, unlocked, unconstrained. It is good that we are getting the full power of the latest flagship silicon.
 
To be fair, these cards are not meant for 99.99% of users. They are meant for people looking to break world records on LN2 who need unrestricted power. For the majority, stock and supply issues aside, there are the 3060/70/80, which are all very efficient, especially when you compare the performance against total wattage versus last gen. Quite nice gains.

The Intel connector is also pretty smart. Linus did a video about the implementation around a year or so ago, and the power savings it can give, if the PSU and motherboard support the standard too, are actually fairly impressive.
 
No, you are right. These cards are not meant for us mere mortals. However, Ampere is on a crap node. If you think I am wrong, I really don't know what to tell you. Power use should go down and clocks should go up; that is good technology. Fermi, Kepler, Maxwell, Pascal: all did exactly that. Pascal hit 2GHz easily. The fact that these are on a smaller process node yet guzzle power and don't clock any higher is testament to that.
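
To put a rough formula behind "power down, clocks up": switching power goes roughly as P ≈ C·V²·f, and a proper shrink cuts both capacitance and voltage, so clocks can rise while power still falls. A quick sketch, with node numbers made up purely for illustration:

```python
# Back-of-envelope dynamic-power model: P ~ C * V^2 * f.
# All numbers are illustrative assumptions, not measurements of any real GPU.

def dynamic_power(c_rel, volts, freq_ghz):
    """Relative switching power for a given capacitance, voltage and clock."""
    return c_rel * volts ** 2 * freq_ghz

old_node = dynamic_power(c_rel=1.00, volts=1.05, freq_ghz=1.6)     # "old" process
good_shrink = dynamic_power(c_rel=0.80, volts=0.95, freq_ghz=2.0)  # a healthy shrink

# ~0.82x the power despite 25% higher clocks - what a shrink *should* do
print(f"relative power: {good_shrink / old_node:.2f}x")
```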

AMD's shrink was how a shrink should be. If Ampere wasn't so terrible, they would have buried AMD. Imagine Ampere, but at 2.5GHz and beyond. That is what should have happened. Instead we have storage heaters.

Again, ask yourself this: where is the Titan? I answered that at launch. There isn't one, and there won't be one, because Ampere is so bad that it would be an insult to release a card that chews down over 500W.

If you think these are good? Wait until they go back to TSMC. That is my point. You'll see.

Nothing has improved since they "unlocked the power". Nothing. At all. You could still easily overclock a 1080 Ti to 2GHz and more on water. Power limits were not preventing anything at all, apart from RMAs. That is why they did it: not to stop you overclocking or having fun. In fact, the moment they locked the power on cards was when the automatic overclocks started, AKA "boost frequency", because that is basically what it is: automatic overclocking. You could push further, and you would quickly reach a limit where your card would not go any faster and would crash, because of power and heat. Nothing more. It was literally done to stop people cooking their cards and returning them.
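
For anyone who doubts that boost is basically a built-in overclocker, here is a toy sketch of a power/thermal-limited boost loop. This is my own illustration with assumed limits; Nvidia's actual algorithm is proprietary and far more elaborate:

```python
# Toy model of a power/thermal-limited boost loop. Purely illustrative -
# the real boost algorithm is proprietary and far more elaborate.

POWER_LIMIT_W = 250       # assumed board power cap
TEMP_LIMIT_C = 83         # assumed thermal throttle point
BASE_MHZ, MAX_MHZ = 1400, 2000
STEP_MHZ = 15             # assumed boost bin size

def boost_step(clock_mhz, power_w, temp_c):
    """Climb while under both limits; back off as soon as either is hit."""
    if power_w < POWER_LIMIT_W and temp_c < TEMP_LIMIT_C:
        return min(clock_mhz + STEP_MHZ, MAX_MHZ)   # headroom left: boost up
    return max(clock_mhz - STEP_MHZ, BASE_MHZ)      # limit hit: throttle down

clock = BASE_MHZ
for power_w, temp_c in [(180, 60), (230, 72), (248, 80), (255, 84), (246, 82)]:
    clock = boost_step(clock, power_w, temp_c)
    print(f"{power_w}W {temp_c}C -> {clock}MHz")
```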

I can unlock the voltage on my 2080 Ti very easily. The problem is, firstly, that it will void my warranty, but secondly there is absolutely no point AT ALL. My card tops out around 2,180MHz on water, and that is all it has to give. Unlocking the voltage would not do anything unless I wanted to put it under LN2, which is totally impractical. I want to game, not set benchmark records.

The reason they have "unlocked the power" on Ampere is that they had no choice. Like I said to Dice, as impressive as you may feel it is, the power it uses is an absolute insult. However, that is also the reason they unlocked the power: otherwise, in order to meet sensible power use levels, it would be slow and crap. Because it's a crap node.

I tell you what: debate with me all you like. Fair play. However, you wait until they go back to TSMC. All of a sudden they will be bragging about how efficient their cards are.
 
The fact AMD didn't just catch up in raster but went past them says it all. The Nvidia lineup, while in demand and popular, is not a good result; as an upgrade they are about as good as Fermi was, i.e. poor. They will go back to TSMC, because if they don't, AMD is going to stomp all over them.

I'll say it here now: Nvidia might be eating dust by the end of the year. They gave AMD an inch, and while AMD didn't take full advantage, they sure gained a lot of ground. I've had 2,600MHz on my card and some are able to push higher. Everyone is so worried about DLSS and DXR that AMD haven't been given a fair crack of the whip by most, but the 6000 series is a beast for AMD.

But in the end, the power usage of the cards, bad as it is, will go higher next gen on both sides, so I'd actually expect it to get much worse. Nvidia's cards were by no means a good node shrink; very far from it.
 
If they get worse then there will be no cooling them.

This card uses 450W. That's insane.

TBH when they go back to TSMC you will see a huge power draw drop. Even with mental clocks.

They were being cheap asses.

When Intel enter the arena, the gloves will be off. Their marketing machine will steamroller AMD, and Nvidia ought to be very scared.
 
Intel will be Intel; they are going to spend big, a bit like MS with Xbox.

They will get worse power-wise, I feel. The newer GPUs coming are going to dwarf my 6800 XT; it'll drop to mid-range levels of power draw. Nvidia will be better the generation after next, as at the moment they plan to stay on one big die, and until they go chiplet they will suck power.

I can very much see 500W+, maybe 600W at the top end, next gen, though at least they will be powerful cards. As for the prices, get ready for a heart attack, as I feel they will keep producing the current gen at the same time: either it will fill the mid-range and lower-end products, or they'll keep it as two series. It wouldn't be the first rebrand in our history.
 
I agree that Samsung's node is a bit of a dud for this generation, but that was not my point. It doesn't matter if it is only a 5% performance gain for 50% extra power; flagship-class cards should let you do that. The whole die, all the memory, and all the power. Like with ROG Extreme boards: barely anyone buys them, and they give you no extra performance for a lot of extra money, but they exist, and they have everything and more for someone who wants them. That is what a flagship should be.
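
Just to put that trade into perf-per-watt terms (using the 5%/50% figures above):

```python
# The 5%-for-50% trade in perf-per-watt terms (figures from the post above).
perf, power = 1.05, 1.50
print(f"perf/W vs stock: {perf / power:.2f}x")  # 0.70x, i.e. ~30% less efficient
```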
 
The issue that I find with this is that it sets a bad precedent regarding efficiency. Once they establish a foothold at the top end with a terribly inefficient product, it will trickle down, and then all of a sudden we're back to using multiple power connectors on midrange hardware again, when in this day and age they should top out at one external power connector and no more.

To me, a flagship should be the product that showcases everything that line has to offer, efficiency included. It's no good having the flagship scream performance while guzzling a ridiculous amount of power to do so, because all that does is showcase how badly that product line scales.

To me, this situation is a lot like cars. If you want to mod your engine for more performance while throwing fuel efficiency down the toilet, go ahead; you're free to make that choice. But the manufacturer shouldn't ship that as the default. It should be properly tuned to the best performance/efficiency balance possible, and then users can decide beyond that. Having it readily available just creates the mess I've explained in my first paragraph.
 
You've covered it yourself there: a 5% performance gain for 50% more power. As you go higher it gets even worse. By the time you get a 2080 Ti to 2,400MHz it's consuming about 600W. There are only about three 2080 Ti PCBs, IIRC, that can provide that without catching fire.
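
The reason it runs away like that: switching power scales roughly with f·V², and those last few hundred MHz need a disproportionate voltage bump. A back-of-envelope sketch, where every figure is an assumption for illustration rather than a measurement of any real card:

```python
# Why the last few hundred MHz get so ugly: switching power scales roughly
# with f * V^2, and high clocks demand disproportionate voltage. Every figure
# below is an assumption for illustration, not a measurement of any real card.

def scaled_power(base_w, f0_mhz, f1_mhz, v0, v1):
    """First-order estimate of board power after a clock/voltage bump."""
    return base_w * (f1_mhz / f0_mhz) * (v1 / v0) ** 2

stock_w = 280.0  # assumed stock board power
print(round(scaled_power(stock_w, 1900, 2100, 1.04, 1.09)))  # ~340W
print(round(scaled_power(stock_w, 1900, 2400, 1.04, 1.25)))  # ~511W
# ...and leakage rises steeply with voltage and temperature on top of this,
# which is how you end up in 600W territory.
```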

Which can't be lived with, no matter how enthusiastic you are. For benchmarks and LN2 and the like? Sure. But 99% of buyers, actually more, do not buy their cards for that.

Everything overclocks itself now, for competition's sake. If AMD release another dud like Bulldozer? Then yes, I am sure Intel will make sure their CPUs beat it and then leave the rest in the tank for us to play with, like they did with the Core 2s. Sadly no one has that sort of lead any more, so everything now comes on the ragged edge out of the box. In fact, some board makers cheat to make CPUs boost even higher, just to make their boards look better. *glares at MSI and Gigabyte*

Hopefully Ampere will just go away before the availability problem solves itself. Hilariously, though, I reckon it will be Intel who solves that problem, not Nvidia or AMD.

We don't need world-beating GPUs right now. No one does. I stuck at 1440p when I got my 2080 Ti (the first one) and I am glad I did. I've noticed about a 20 FPS drop between slightly older titles and all of the latest ones. That still leaves me well above 120 FPS with max detail. FC6 ran at around 70, but I played through it without DLSS, and besides, it was single player, so it was fine.

4K (which seems to be pretty much universally agreed on here) is not worth the bother. So how much power do you need? About a 3070. Or a 2080 Ti, which sits between that and the 3080: its extra VRAM makes it the better card to have in certain scenarios (only a couple, mind).

So what are Intel releasing? Exactly that: the cards people really want, not these £2,000 jokes. If they can do that, and get their own fabs knocking them out like crazy, that is why AMD will get steamrolled: AMD have to rely on TSMC, who cater to countless other companies in the queue. Even Nvidia can't make their own chips, and right now they are relying on Samsung (lol, that was a bad mistake). Even when not using Samsung they use TSMC. Why did they go to Samsung? Because it was cheaper, but also because they thought they would get far more silicon. Failure rates are very high though, I hear, and the node is pretty crap.

Intel? Lol. They have 99 problems, but making silicon ain't one.

All it will take right now, literally the WHOLE thing for them, is to have GPUs on shelves that people can actually buy. Even if they're a bit crap, people will still buy them, because they can. That, coupled with Intel's marketing machine? Yeah, bye AMD in the GPU space, it was nice knowing you.
 
This. Why should all computing continue to increase its power consumption to this extent? Why not all drive 6.0L V8s everywhere, all the time, and never walk again?
We don't all drive Ferraris everywhere because we don't need to, but right now there are not enough lower and mid-range items available, and those that you can get are being sold at the price of Ferraris.
 
Exactly. Totally, 100%. No one seems to be complaining. In fact, I have seen guys buy a card (even a 3070) and then rush out and buy a bigger PSU as well :o

Like they're not bloody expensive enough as it is, then it's another £100+ for a PSU, then much higher electricity use on top. But that's OK, 'cos Ampere is super duper fast. Not.

For years things improved really well, to the point where you could run an entire mid-range rig on a 450-500W PSU. Now you need 1,000W as a minimum for this card. Fork, I wish I had kept all of the honking massive PSUs I had to practically give away because no one wanted them :D
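
For anyone wondering how a 450W card becomes a 1,000W PSU recommendation, the rough sum looks like this. The component draws and margins are my assumptions, just the usual rules of thumb:

```python
# Rough sizing sum from a 450W GPU to a ~1,000W PSU recommendation.
# Component draws are assumptions for a typical high-end build; the margins
# are common rules of thumb, not a formal spec.

gpu_w, cpu_w, rest_w = 450, 150, 75      # GPU, CPU, fans/drives/motherboard
sustained_w = gpu_w + cpu_w + rest_w     # ~675W steady-state
transient_w = sustained_w + 0.5 * gpu_w  # big Ampere cards spike well past TDP
recommended_w = transient_w / 0.9        # keep the PSU off its absolute limit

print(sustained_w, transient_w, round(recommended_w))  # 675 900.0 1000
```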

The fact is these cards are literally uncouth, to the point of being blatantly obnoxious. That, coupled with the huge energy price hikes in the UK atm? Yeah, no thanks, like.
 
Well, I wouldn't mind driving a 6.0L V8 everywhere personally... good ol' school muscle ^_^
 