Nvidia 700 series price drop

Cause I'm in the same boat. But I guess "The best time to buy your GPU is a week after you actually bought it" works. I probably should have waited, but I was tired of the half-built rig beside me (the old 5870 was showing her age).
 
Wow, for only £428 I could get a 3rd GTX 780 SC... it's almost tempting :o ... but that would be stupid :huh:

JR
 
Actually, if anything the AMD GPU is more efficient than the corresponding nVidia GPU in the 780 & Titan: they're both on the same process, and its die is around 20% smaller, yet it has similar performance.
Sadly though,
High Performance = High Heat Dissipation

If it produces a relatively large amount of heat, it's inefficient. If you had a device with 100% efficiency, no heat would be produced; it's as simple as that. They shouldn't have made a chip that produces so much heat if they can't cool it sufficiently.

High performance doesn't just mean high heat; you could have the best-performing device in the world produce very little heat if it's efficient, or a poorly performing device produce a lot of heat if it's inefficient.

If you have a device with a given efficiency and you make it do more, then of course it produces proportionately more heat, but that still doesn't mean any high-performance device produces more heat than any lower-performing one.
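
A quick made-up-numbers sketch of what I mean (neither device below is a real card, and the performance-per-watt figures are invented purely for illustration):

Code:
# Two hypothetical devices; all figures invented for illustration.
# "perf" is in arbitrary benchmark units; "perf_per_watt" is the efficiency.
device_a = {"perf": 100.0, "perf_per_watt": 50.0}  # fast AND efficient
device_b = {"perf": 50.0, "perf_per_watt": 10.0}   # slower but inefficient

for name, d in (("A", device_a), ("B", device_b)):
    heat = d["perf"] / d["perf_per_watt"]  # watts dissipated to reach that perf
    print(f"Device {name}: {d['perf']:.0f} units at {heat:.0f} W")

# Device A: 100 units at 2 W
# Device B: 50 units at 5 W
# The higher-performing device runs cooler, because it's more efficient.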
 
If it produces a relatively large amount of heat, it's inefficient. If you had a device with 100% efficiency, no heat would be produced; it's as simple as that. They shouldn't have made a chip that produces so much heat if they can't cool it sufficiently.

High performance doesn't just mean high heat; you could have the best-performing device in the world produce very little heat if it's efficient, or a poorly performing device produce a lot of heat if it's inefficient.

If you have a device with a given efficiency and you make it do more, then of course it produces proportionately more heat, but that still doesn't mean any high-performance device produces more heat than any lower-performing one.

Not true... 100% efficiency would mean absolutely every single watt goes into performance. Now in this case, since AMD is trying to use every single watt of a 250 W TDP, that means A LOT of power consumption and heat. The heat is only an issue for the parts around the GPU, but on the reference card it's all pushed out the back anyway.
 
If it produces a relatively large amount of heat, it's inefficient. If you had a device with 100% efficiency, no heat would be produced; it's as simple as that. They shouldn't have made a chip that produces so much heat if they can't cool it sufficiently.

High performance doesn't just mean high heat; you could have the best-performing device in the world produce very little heat if it's efficient, or a poorly performing device produce a lot of heat if it's inefficient.

If you have a device with a given efficiency and you make it do more, then of course it produces proportionately more heat, but that still doesn't mean any high-performance device produces more heat than any lower-performing one.
In the real world (i.e. where 100% efficiency is impossible), a high-performing device will always be power hungry; you're not getting a high-performance device like the AMD GPU down to a 2 W TDP, it's just not possible.

Just to break it down, the power dissipation (heat) for a digital circuit is normally calculated via:

P = N * f * C * V^2

N is the number of circuits, f is the switching frequency, C is the load capacitance, and V is the voltage.

For a high-performance device, there have to be a lot of individual gates (read: transistors); the GPUs we're talking about here have on the order of 7.3 billion transistors. You're also going to need a high clock; the Titan isn't going to perform well clocked at 1 MHz, now, is it? And of course, high clocks require higher voltages.

Put it all together in the above equation and the number you get isn't going to be small. This is the reason your GPU probably has a 250 W or so TDP.
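
Rough numbers, if you want to see it play out (every figure below is an order-of-magnitude guess on my part, not real GK110 data):

Code:
# Ballpark dynamic power from P = N * f * C * V^2.
# All inputs are illustrative guesses, not real GK110 figures.
N = 7.3e9 * 0.03  # ~7.3 billion transistors, assume ~3% switch each cycle
f = 900e6         # ~900 MHz clock
C = 1e-15         # ~1 fF effective load capacitance per switching node
V = 1.1           # ~1.1 V core voltage

P = N * f * C * V ** 2
print(f"P ~ {P:.0f} W")  # ~238 W, right in 250 W TDP territory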

So yeah, High Performance = Heat
 
Not true... 100% efficiency would mean absolutely every single watt goes into performance. Now in this case, since AMD is trying to use every single watt of a 250 W TDP, that means A LOT of power consumption and heat. The heat is only an issue for the parts around the GPU, but on the reference card it's all pushed out the back anyway.
Sorry, but it sounds like you're trying to defend the nvidia 480
 

I just think you're still missing my point. I wasn't saying you can have 100% efficiency; it's just an idealised model to show how heat production behaves in the high-efficiency limit.

All I'm saying is, if you have two devices, the higher-performing one does not HAVE to be hotter, as long as it's more efficient than the other.
 
One more thing and I'll shut up, since we'll only end up pulling the thread off-topic.

Not true... 100% efficiency would mean absolutely every single watt goes into performance.

I just think you're still missing my point. I wasn't saying you can have 100% efficiency; it's just an idealised model to show how heat production behaves in the high-efficiency limit.
You both seem to be applying the concept of power efficiency to a GPU. That doesn't really work, as a circuit like a CPU/GPU doesn't have a power input and output in that sense. Sure, it produces heat, and in an ideal world it wouldn't produce any, but it doesn't really have a power output, so the "efficiency" concept in terms of power doesn't hold up.

Contrast that with your PSU: it pulls something (an input) from the wall and delivers an output to the internal PC components. The difference is what it burns internally, which leads to the efficiency measure, i.e. 80% efficient means 80% of the input gets delivered to the output, with 20% lost within. Here the efficiency figure actually makes sense, whereas conceptually a CPU/GPU is very different. Performance per watt dissipated as heat, or performance vs die area, are the better measures, and are what's normally used.
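
To put numbers on the distinction (everything here is hypothetical, just to show where the efficiency figure does and doesn't make sense):

Code:
# PSU efficiency is a genuine input/output ratio.
wall_draw = 500.0                    # watts pulled from the wall (made up)
delivered = wall_draw * 0.80         # 80% efficient -> 400 W reaches the PC
lost_in_psu = wall_draw - delivered  # 100 W burned inside the PSU as heat

# A GPU has no power "output", so the useful metric is performance per watt.
fps, tdp = 60.0, 250.0               # hypothetical benchmark result and TDP
perf_per_watt = fps / tdp            # 0.24 fps per watt dissipated

print(f"PSU loss: {lost_in_psu:.0f} W; GPU: {perf_per_watt:.2f} fps/W")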

See, this is where you get into electronics. Most users, like those on this forum and elsewhere, are really PC users/enthusiasts with above-average knowledge of PC hardware but not necessarily of electronics, when in reality an intimate knowledge of electronics is needed to understand this kind of stuff. The occasional PSU review, or when people get talking about process technology, are great examples of this.

/pet peeve of mine.

All I'm saying is, if you have two devices, the higher-performing one does not HAVE to be hotter, as long as it's more efficient than the other.
It really depends on the performance difference. There's a big gap between a 2 W smartphone CPU/GPU and a high-end desktop one, both in power draw/heat dissipation and in performance.

More On-Topic:
There are some GTX 780s at quite attractive prices right now, from a place I buy components from (no 290Xs available yet):
http://www.dabs.ie/products/best-va...eam-8Z78.html?refs=4294945034-492850000&src=3

Bring on the price cuts, I might pick up a 780 over Christmas. :)
 
In the real world (i.e. where 100% efficiency is impossible), a high-performing device will always be power hungry; you're not getting a high-performance device like the AMD GPU down to a 2 W TDP, it's just not possible.

Just to break it down, the power dissipation (heat) for a digital circuit is normally calculated via:

P = N * f * C * V^2

N is the number of circuits, f is the switching frequency, C is the load capacitance, and V is the voltage.

For a high-performance device, there have to be a lot of individual gates (read: transistors); the GPUs we're talking about here have on the order of 7.3 billion transistors. You're also going to need a high clock; the Titan isn't going to perform well clocked at 1 MHz, now, is it? And of course, high clocks require higher voltages.

Put it all together in the above equation and the number you get isn't going to be small. This is the reason your GPU probably has a 250 W or so TDP.

So yeah, High Performance = Heat

I could ask my father about all these theories and such, since he's an electrical engineer. I'll ask him about it, I guess. There's not much I can argue against in your post, since you obviously seem to know more than me on that subject :p

Sorry, but it sounds like you're trying to defend the nvidia 480

Um. Where did that even come from? I didn't even mention nvidia?
 
780

A 780 is about £400 currently. I don't think I'll be getting a 780 Ti, as £400 is about the max I will pay for a card.

Do you think the 780 will drop more in price over Xmas in the UK?

Regards, Tom.
 
A 780 is about £400 currently. I don't think I'll be getting a 780 Ti, as £400 is about the max I will pay for a card.

Do you think the 780 will drop more in price over Xmas in the UK?

Regards, Tom.

I pulled your post in here because it didn't need its own topic; there are enough about it already, dude.
 