Wattage for 770 SLI or 780 SLI from Gigabyte or MSI

Dang3r0us

New member
What minimum wattage do I need to run Gigabyte or MSI 780 SLI or 770 SLI? I've looked at a few different websites and they all list different minimums.

If I can run 2 x Gigabyte 780s with my AX860(i), I can run 2 x 770s as well, since they use less wattage. I did see that the Gigabyte WindForce 3X uses more than the TF, though. Let me know, thanks.
 
I don't see what you're worried about. You say you have an 860-watt PSU? That could run two overclocked Titans along with an overclocked CPU as well.
 
A review on the website where I want to buy says the minimum for one card is 650 W; that's why I was worried. But now I know. I might get 2 x 770 instead; not sure what I'll do yet, but thanks. ;)
 
The GTX 780 has a maximum TDP of 250 watts, which means that even at 100% utilisation (think Furmark) it will not go above 250 watts. This is before overclocking, but don't expect it to ever go above 300 for each card.

If you go SLI, you're looking at a maximum draw of 500 watts for both cards combined. And even then, while gaming it is much more likely to be around 400 or less, as the cards will rarely be completely maxed out.

In short, your PSU is completely safe to use. The reason some websites give a 650-watt estimate is that they are taking the entire system into consideration, not just the GPUs' draw. They expect that if you're going with a card that pricey, with that kind of power draw, you'll likely be filling the computer with other power-guzzling components, so 650 is a very safe bet.
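To put rough numbers on that, here's a quick back-of-the-envelope sketch. The CPU and rest-of-system figures are assumptions for illustration, not measurements:

Code:
# Rough SLI power-budget check -- illustrative numbers, not measurements.
GPU_TDP_W = 250          # GTX 780 board power limit
NUM_GPUS = 2             # SLI pair
CPU_W = 150              # overclocked CPU, ballpark assumption
REST_OF_SYSTEM_W = 75    # motherboard, RAM, drives, fans (assumption)
PSU_CAPACITY_W = 860     # Corsair AX860(i)

total_draw = GPU_TDP_W * NUM_GPUS + CPU_W + REST_OF_SYSTEM_W
print(f"Worst-case draw: {total_draw} W")            # 725 W
print(f"Headroom: {PSU_CAPACITY_W - total_draw} W")  # 135 W

Even with every component pegged at worst case, the 860 W unit still has headroom.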
 
OK, thanks. Now I'll pick a card. The 780 is a little expensive for one card, and at that point I could just buy a Titan, so maybe a 770, or I'll wait for the new AMD series. Thanks for the info!
 
Quote:

The GTX 780 has a maximum TDP of 250 watts [...] In short, your PSU is completely safe to use.

Not true. GPU Boost 2.0 will overclock the cards as far as possible until they reach the thermal limit. The advice you're giving really only applies to earlier cards. Cards will almost always go above their rated TDP, because even the slightest manufacturer OC uses more power. TDPs are theoretical maximums derived from the combined wattage of the PCIe slot and the 8/6-pin connectors; that's how they got the number, but it is possible to use more than 250 watts.
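For reference, the per-connector delivery limits that combined figure comes from (nominal PCIe spec values):

Code:
# Nominal PCIe power-delivery limits (per spec)
SLOT_W      = 75    # PCIe x16 slot
SIX_PIN_W   = 75    # 6-pin connector
EIGHT_PIN_W = 150   # 8-pin connector

print(SLOT_W + SIX_PIN_W + EIGHT_PIN_W)  # 300 W available to a 6+8-pin card

So a 6+8-pin card like the 780 physically has room above its 250 W TDP, which is the point here.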
 
Quote:

Not true. GPU Boost 2.0 will overclock the cards as far as possible until they reach the thermal limit. [...] it is possible to use more than 250 watts.

Quote Me:

This is before overclocking, but don't expect it to ever go above 300 for each card.

And really, with GPU Boost 2.0 the cards stick to the 250-watt TDP unless you choose to increase that limit. My advice is sound and accurate.
 
And that changes nothing; I never said it would not go above 300.

GPU Boost will go until it reaches the thermal limit it's set to. It will push for the highest stable overclock possible until it reaches 80 °C (the default). Your advice is sound and accurate for previous cards. Some of what you said is true, however, namely the last paragraph in my quote.
 
Quote:

And that changes nothing; I never said it would not go above 300. [...] Some of what you said is true, however, namely the last paragraph in my quote.

Actually, with GPU Boost 2.0 the power consumption is taken into account too. I actually own two GTX 780s and know what I'm talking about. :rolleyes:

[Screenshot of EVGA Precision showing the power and temperature targets]


EDIT: I don't want to leave this on a hostile-sounding post, so I will elaborate on exactly how GPU Boost 2.0 and the TDP work.

The Titan and GTX 780 share a TDP of 250 watts. That is the combined power of the PCIe slot, the 6-pin, and the 8-pin PCIe power connectors. You know this already.

But what I think you don't know is that the cards do not pull anywhere near 250 watts when gaming; it's more like 180-190 watts. That leaves a lot of room for the clocks to be pushed higher by GPU Boost 2.0 while the temperature is below 80 °C. Only apps like Furmark push the card to 100% power utilisation.

Now, if your card is at only 60 °C or so but the power usage is already at 100% (250 watts), it will not overclock the GPU any higher. That is why in the EVGA Precision program (which I screenshotted above) you can adjust the power target; NVIDIA allows you to raise it to 106%. If you raise it and keep the targets linked, the temperature target rises from 79 °C to 95 °C, but if you're water cooling there is little point in raising the temperature target, only the power one.

I hope this clears up the confusion around GPU Boost 2.0.
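To make the two limits concrete, here's a simplified model of a single boost decision. This is only an illustrative sketch, not NVIDIA's actual algorithm, and the boost bin size is an assumption:

Code:
# Simplified model of GPU Boost 2.0's two targets -- illustrative only.
TDP_W = 250            # GTX 780 rated board power
BOOST_BIN_MHZ = 13     # typical boost step size (assumption)

def boost_step(clock_mhz, temp_c, power_w,
               power_target=1.00, temp_target_c=80):
    """Boost only while BOTH the temperature and power targets have headroom."""
    power_limit_w = TDP_W * power_target   # 100% = 250 W, 106% = 265 W
    if temp_c < temp_target_c and power_w < power_limit_w:
        return clock_mhz + BOOST_BIN_MHZ   # headroom on both: clock up
    return clock_mhz - BOOST_BIN_MHZ       # a limit was hit: back off

# A card at only 60 C but already drawing 250 W will not boost further:
print(boost_step(clock_mhz=1006, temp_c=60, power_w=250))
# Raising the power target to 106% gives it room to keep boosting:
print(boost_step(clock_mhz=1006, temp_c=60, power_w=250, power_target=1.06))

The second call boosts because raising the power target moves the limit to 265 W, exactly the 60 °C / 100% power scenario described above.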
 
I knew all that, dude. The power target is set at 100% (250 watts), so the card will automatically try to reach full GPU usage while still maintaining the thermal limit set in the EVGA software. I think you don't understand that part.
 
I just needed to know if I could run two cards on the 860-watt unit. :p Nice overclocking program, Vicey; which one is it, if I may ask? I've got MSI Afterburner, but I have an ASUS card, if I remember right :p, and it's not working well, haha. Thinking of going with Gigabyte now, though.
 