The next generation of Nvidia GPUs, codenamed Maxwell, is upon us. Are they worth the upgrade? We find out.
Read more on the Nvidia GTX980 Maxwell Review
Just skimming through the numbers on this awesome review it looks like 780 Ti owners are going to have to wait for something truly epic.
Apart from the energy efficiency, the difference between the 780 Ti and 980 is minuscule.
Read why... Expecting last-gen card owners to upgrade every time is a bit narrow-minded, IMHO.
Really, this is nothing spectacular at all. For something that draws so little power I was expecting the temps to be much better than they currently are, and performance-wise this gets edged out by the R9 290X and 780 Ti a lot.
Tom: To complete the Titan-class cooler GPU collection, you still need the Titan Black and Titan Z!
Time spent gaming per week = 20 hrs max (usually) = 1,040 hrs annually
Difference in power from the wall = 504 W − 348 W = 156 W = 0.156 kW
Electricity cost = €0.12/kWh
Cost difference per year = 0.156 × 0.12 × 1,040 = €19.47/yr
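The back-of-the-envelope sums above can be sketched as a small Python helper (the function name is just illustrative; the figures are the poster's own assumptions):

```python
# Annual electricity-cost difference between two GPUs, using the
# figures from the post above (power draw at the wall, gaming hours,
# and electricity price are the poster's assumptions).

def annual_cost_difference(power_delta_w, hours_per_week, price_per_kwh):
    """Yearly cost difference, in the same currency as price_per_kwh."""
    hours_per_year = hours_per_week * 52
    return (power_delta_w / 1000) * hours_per_year * price_per_kwh

# 504 W vs 348 W at the wall, 20 h/week of gaming, EUR 0.12/kWh
saving = annual_cost_difference(504 - 348, 20, 0.12)
print(f"EUR {saving:.2f}/yr")  # → EUR 19.47/yr
```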
That takes care of the warm-up act. Nvidia, bring on the full-fat Maxwell GPUs now, please.
EDIT: To put numbers on my "minuscule" claim:
For me, the cost saving in using a 980 over a 290X is just €20/yr; I wasn't expecting it to be that low.
You getting any of these, Kaapstad?
No
My next GPUs will probably be four GM200 cards in the new year.
I wouldn't get too excited about the power consumption numbers, since electricity is generally pretty cheap; here it's about €0.12/kWh. The actual saving you'll make in a year with the number of hours I spend gaming per week is minuscule, even if you compare the 290X to the 980.
Having said that, the perf/W numbers are darn impressive. Here's waiting to see what AMD have up their sleeves. Hopefully we'll have an exciting back and forth between Nvidia and AMD again, just like last year. ^_^
It's not about the money you save on electricity; that's such a closed-minded viewpoint. It's about the fact that power usage has been going up and up over the years. Back in the day a 400 W power supply could run a high-end PC; now you need 600 W to reliably run a GTX 780 or R9 290 based system, and 650-700 W if you use an AMD FX-8320 or higher CPU.
Don't you see that's a problem? The higher power usage gets, the bigger the power supply we need. That means more components, which means more expense, which means slower development as manufacturers wait for technology to improve.
Lower power also means less heat in the PC to deal with, and less likelihood of ending up in the exact situation R9 290 cards are ALREADY in: reaching a point where, without water cooling or a heatsink several slots thick, they can't even sustain reference clock speeds. I have an MSI R9 290, and even though it's the aftermarket-cooled version and I've undervolted it, it still goes into thermal protection and drops to 900 MHz.
Also, the power savings actually aren't even that minuscule. A £20 saving now, sure, but what about in five years, if that saving were £100 a year? Surely you aren't going to tell me three whole AAA games (if bought via G2A, for example) is a "minuscule" gain?