NVIDIA to launch its GeForce GTX 880 next month, at under $500

Actually, power consumption increases when temperatures are higher. This happens even at the same voltage and frequency (no power management software involved).

It's because higher temperatures free up more charge carriers in semiconductors, i.e. higher conductivity.

Example from TPU: http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html
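To put rough numbers on the idea, here is a minimal sketch of the usual way this gets modelled: switching power is fixed by voltage and frequency, and only the leakage term grows with die temperature. All constants below are invented for illustration and are not taken from the TPU review.

```python
# Minimal sketch (illustrative constants, not measured data):
# switching power is set by voltage and frequency; only leakage grows with temperature.

def gpu_power_w(temp_c, p_dynamic_w=180.0, p_leak_60c_w=40.0, leak_doubling_c=25.0):
    """Estimate board power at fixed voltage/frequency for a given die temperature.

    p_dynamic_w     -- switching power, independent of temperature
    p_leak_60c_w    -- assumed leakage at a 60 C reference point
    leak_doubling_c -- assumed temperature rise that roughly doubles leakage
    """
    leakage_w = p_leak_60c_w * 2 ** ((temp_c - 60.0) / leak_doubling_c)
    return p_dynamic_w + leakage_w

for t in (60, 75, 90):
    print(f"{t} C -> {gpu_power_w(t):.0f} W")   # same clocks, hotter die, higher draw
```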

That was a very inefficient architecture. The GTX 480 was a monster in terms of power consumption.

I can't believe this. I'm buying a power meter and testing this out for myself. I have a GTX 570, which is a somewhat more efficient version of the GTX 480; both use the Fermi architecture.
 
That was a very inefficient architecture. The GTX 480 was a monster in terms of power consumption.

I can't believe this. I'm buying a power meter and testing this out for myself. I have a GTX 570, which is a bit more efficient than the GTX 480; both use the Fermi architecture.

This is not related to architecture. Just physics.

It's a property of semiconductors and it's quite a bad thing
 
Errr... Conductivity actually decreases at higher temperatures. ;)

http://en.wikipedia.org/wiki/Semiconductor

http://en.wikipedia.org/wiki/Electrical_conductivity

Conductivity increases at lower temperatures in metals, not in semiconductors.

More conductivity = more unwanted electrons leaking through transistors; at least that's the most logical explanation.

I'm pretty sure I'm right
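For anyone who wants the physics in one place, here's a quick sketch (textbook constants only) of why the sign of the temperature dependence differs between a metal and an intrinsic semiconductor. In a real GPU it's the transistors' leakage current that matters rather than bulk conductivity, but the direction is the same: hotter silicon conducts, and leaks, more.

```python
# Contrast of the sign of the temperature dependence (textbook constants).
# Copper: resistivity rises ~0.39% per degree C, so conductivity falls as it heats up.
# Intrinsic silicon: carrier concentration grows as T^1.5 * exp(-Eg / (2*k*T)),
# which outweighs the drop in mobility, so conductivity rises as it heats up.

import math

K_B = 8.617e-5   # Boltzmann constant, eV/K
E_G = 1.12       # silicon band gap, eV

def copper_resistivity(temp_c, rho_20=1.68e-8, alpha=0.0039):
    """Resistivity of copper in ohm*m, linear approximation around 20 C."""
    return rho_20 * (1 + alpha * (temp_c - 20))

def si_intrinsic_carriers(temp_c):
    """Intrinsic carrier concentration of silicon, in relative units."""
    t_k = temp_c + 273.15
    return t_k ** 1.5 * math.exp(-E_G / (2 * K_B * t_k))

for t in (20, 60, 100):
    cu = copper_resistivity(t)
    si = si_intrinsic_carriers(t) / si_intrinsic_carriers(20)
    print(f"{t:3d} C  copper resistivity {cu:.2e} ohm*m   silicon carriers x{si:.0f}")
```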

I don't think the difference in power consumption would be that big.

It's actually quite big. In fact, cryptocurrency miners optimize their rigs to run at the lowest temperatures possible, as long as the power spent cooling the hardware is less than the power saved thanks to the lower temps.

Thread on anandtech forum: http://forums.anandtech.com/showthread.php?t=2200205
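A toy version of that trade-off, with made-up numbers: keep cooling harder only while the extra cooling wattage saves more leakage than it costs.

```python
# Toy version of the cooling trade-off (all numbers invented for illustration):
# spend extra watts on cooling only while that saves more watts of leakage than it costs.

def leakage_w(temp_c):
    # assumed leakage curve: about 40 W at 60 C, roughly doubling every 25 C
    return 40.0 * 2 ** ((temp_c - 60) / 25)

def extra_cooling_w(temp_c):
    # assumed cost of holding the die this far below its "free" 80 C equilibrium
    return max(0.0, (80 - temp_c) * 1.5)

def total_w(temp_c):
    return leakage_w(temp_c) + extra_cooling_w(temp_c)

best = min(range(40, 91), key=total_w)
print(f"lowest combined draw at about {best} C ({total_w(best):.1f} W)")
```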

p.s.: we're pretty OT here, if someone wants to delve deeper into this topic, just open a new thread ;)
 
It's actually quite big. In fact, cryptocurrency miners optimize their rigs to run at the lowest temperatures possible, as long as the power spent cooling the hardware is less than the power saved thanks to the lower temps.

I've been into mining, but I have not heard of miners running their GPUs cooler to save on power costs (except for power supplies). They do it to keep the GPU components (mostly the VRMs) cool so the GPU lasts longer.
 
I'm tempted to get an 880; it all depends on how good it is. Performance would have to be better than a 780 Ti.
AMD must have something up their sleeve though, and I don't like buying a new GPU until both Nvidia and AMD have new cards out, so I can make sure I'm buying the best one for my money.
 
I'm tempted to get an 880; it all depends on how good it is. Performance would have to be better than a 780 Ti.
AMD must have something up their sleeve though, and I don't like buying a new GPU until both Nvidia and AMD have new cards out, so I can make sure I'm buying the best one for my money.

That's what everyone should do
 
I'm tempted to get an 880; it all depends on how good it is. Performance would have to be better than a 780 Ti.
AMD must have something up their sleeve though, and I don't like buying a new GPU until both Nvidia and AMD have new cards out, so I can make sure I'm buying the best one for my money.

I don't think an upgrade from a 290 to an 880 would make a whole lot of sense?

That's what everyone should do

Everyone has different standards; best bang for the buck isn't a top priority for everyone.
 
Everyone has different standards; best bang for the buck isn't a top priority for everyone.

I wasn't referring to that. I was suggesting waiting until both camps release their products before buying, instead of buying the first new thing to be released from one camp.
 
I don't think an upgrade from a 290 to an 880 would make a whole lot of sense?

Everyone has different standards; best bang for the buck isn't a top priority for everyone.

Probably not :p I would only buy one if it was actually faster than a 780 Ti though; no point otherwise.
 
I wasn't referring to that. I was suggesting waiting until both camps release their products before buying, instead of buying the first new thing to be released from one camp.

Oh yeah, you are right about that. Well, as long as AMD don't take more than half a year again to catch up.
 
When they do release this card I hope they don't do a stupid $/£ conversion rate like Apple, and that it will be £300 in the UK.
 
I'd like to upgrade around January to a single card solution that is at least 15% faster than the 780 Ti with increased VRAM.

So I hope something comes out around then :)

Yeah, but before they focus on FreeSync or SSDs they should make sure they are bringing competition to the market with GPUs.


AMD are spread far too thin in my opinion. They have taken on way too much: CPUs, GPUs, APUs, SSDs, memory, FreeSync.

It's too much. That's why Intel and Nvidia will always be ahead: they can concentrate their efforts on one product type at a time, whereas AMD has a gazillion things going on at once.
 
When they do release this card I hope they don't do a stupid $/£ conversion rate like Apple, and that it will be £300 in the UK.

There is more to it than the conversion rate. People in different countries make different amounts of money, have to pay different amounts for their daily needs, have different economies, etc.
The US will always be cheaper than the UK, at least for the foreseeable future.

They have a lot of resources devoted elsewhere. So who knows? :huh:

Maybe that is a bad idea. FreeSync shouldn't have the same priority as the next GPU family.
I won't buy on release date, but if they take more than 3 months I won't wait for them any more. Well, I'm going to skip the 8xx series anyway.

AMD are spread far too thin in my opinion. They have taken on way too much: CPUs, GPUs, APUs, SSDs, memory, FreeSync.

It's too much. That's why Intel and Nvidia will always be ahead: they can concentrate their efforts on one product type at a time, whereas AMD has a gazillion things going on at once.

This.
AMD wants to impress everywhere and the result is half-arsed crap like the 290X. On release the 290X was just disappointing: not far ahead of the competition, beaten within 3 weeks, a terrible cooler, and far too late to the party. It's fixed now, but it definitely isn't what it was supposed to be, or could have been if AMD had been focused on a single thing rather than the entire industry.
 
This.
AMD wants to impress everywhere and the result is half-arsed crap like the 290X. On release the 290X was just disappointing: not far ahead of the competition, beaten within 3 weeks, a terrible cooler, and far too late to the party. It's fixed now, but it definitely isn't what it was supposed to be, or could have been if AMD had been focused on a single thing rather than the entire industry.

Indeed. During the ATI days they were a true powerhouse because all they did was GPUs. Now it just seems like there's no soul left in ATI/AMD and all they want is to have their fingers in as many pies as possible :(
 