NVIDIA to launch its GeForce GTX 880 next month, at under $500

Man, some of you are way too optimistic. If it's faster than the 780 Ti, it may well cost $800. This is NVIDIA. My guess is it replaces the vanilla 780 and beats it just enough to "justify" a $100 premium over the R9 290.
 
Yep, 28nm. We won't be seeing 20nm until the full-fat Maxwell cards, probably. :(


There won't be any 20nm cards. At all. NVIDIA will likely be skipping it altogether after TSMC said they couldn't guarantee high enough yields at the specifications that NVIDIA were demanding.

What is likely to happen is that the Maxwell refresh will step down from the current 28nm to 16nm FINFET, a move that TSMC claims is likely to lead to the lower production cost that NVIDIA want, as well as much higher yields.
 
There won't be any 20nm cards. At all. NVIDIA will likely be skipping it altogether after TSMC said they couldn't guarantee high enough yields at the specifications that NVIDIA were demanding.

What is likely to happen is that the Maxwell refresh will step down from the current 28nm to 16nm FINFET, a move that TSMC claims is likely to lead to the lower production cost that NVIDIA want, as well as much higher yields.


Much lower costs? If that is officially made public by TSMC, then NVIDIA are going to have to start lowering prices to more "sane" levels, otherwise people are going to start voting with their wallets. I hope so, anyway.
 
Much lower costs? If that is officially made public by TSMC, then NVIDIA are going to have to start lowering prices to more "sane" levels, otherwise people are going to start voting with their wallets. I hope so, anyway.

Much lower compared to 20nm, yes. That is almost entirely due to the fact that yields for 20nm have so far apparently been pretty awful. With 16nm FINFET the yields will be higher, meaning less wastage, which in turn means lower cost.

NVIDIA were quite clear that they want both high compute ability from the new chips and lower power requirements. Both TSMC and Global Foundries said that they could do one or the other, but not both, at least not in yield numbers that would make it even remotely profitable.

So yes, 16nm will be significantly cheaper than moving to 20nm, and eventually it should prove significantly cheaper than the current 28nm once the manufacturing process matures.
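
To put a rough number on the yield point: what you actually pay for is cost per good die, which is just the wafer cost divided by the number of dies that come out working. A minimal sketch in Python, with completely made-up wafer cost, die count and yields (nothing TSMC or NVIDIA has published), just to show why bad yields hurt so much:

# Purely illustrative numbers, not real TSMC/NVIDIA figures.
wafer_cost = 5000.0      # assumed cost of one processed wafer, in dollars
dies_per_wafer = 200     # assumed gross (candidate) dies per wafer

def cost_per_good_die(yield_fraction):
    good_dies = dies_per_wafer * yield_fraction
    return wafer_cost / good_dies

print(cost_per_good_die(0.30))   # poor yield   -> ~$83 per working die
print(cost_per_good_die(0.70))   # decent yield -> ~$36 per working die

Same wafer, same chip; the only thing that changed is how many dies survive, and the per-chip cost more than halves.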
 
Much lower compared to 20nm, yes. That is almost entirely due to the fact that yields for 20nm have so far apparently been pretty awful. With 16nm FINFET the yields will be higher, meaning less wastage, which in turn means lower cost.

NVIDIA were quite clear that they want both high compute ability from the new chips and lower power requirements. Both TSMC and Global Foundries said that they could do one or the other, but not both, at least not in yield numbers that would make it even remotely profitable.

So yes, 16nm will be significantly cheaper than moving to 20nm, and eventually it should prove significantly cheaper than the current 28nm once the manufacturing process matures.

Good news for all parties then. I do hope that this translates over to the end user. It would be good to see sub-£500 top-end GPUs again, e.g. the 8800 Ultra was £485. If those kinds of prices returned for their highest-end GeForce GPU, sales would undoubtedly go up.
 
Wonder what AMD is doing now... all I hear about these days is the 8xx series. Nothing on AMD. If they want to get ahead they need to start releasing closer to NVIDIA, before NVIDIA takes even more market share.
 
Man, some of you are way too optimistic. If it's faster than the 780 Ti, it may well cost $800. This is NVIDIA. My guess is it replaces the vanilla 780 and beats it just enough to "justify" a $100 premium over the R9 290.

The R9 290X may be cheaper, but realistically the bastard needs to be put under water so it doesn't cook the surrounding components in your rig, lol, so there is extra cost involved there that you don't necessarily have to spend with an NVIDIA card.

Even the best aftermarket coolers are still only getting the 290X down to around 70°C.

Still too hot in my opinion. My rig would melt through the floor if I were running one on air during summer down here.
 
Even the best aftermarket coolers are still only getting the 290X down to around 70°C.

Still too hot in my opinion. My rig would melt through the floor if I were running one on air during summer down here.

Try Brisbane. If I even saw 80°C in summer I would be ecstatic. My 580 was topping 90°C at one stage, so anything less than that is awesome in my book.
 
The R9 290X may be cheaper, but realistically the bastard needs to be put under water so it doesn't cook the surrounding components in your rig, lol, so there is extra cost involved there that you don't necessarily have to spend with an NVIDIA card.

Even the best aftermarket coolers are still only getting the 290X down to around 70°C.

Still too hot in my opinion. My rig would melt through the floor if I were running one on air during summer down here.

Then NVIDIA cards must be super hot to you. Their target temperature is 80°C out of the box without any tuning, and they do fine. So really the AMD cards aren't bad at all in comparison. Besides, 70°C is more than likely a normal temperature for cards to be at on average on air; they are well within their thermal limits. You can't bring up the heat issue the way you are presenting your argument; if you mentioned stock coolers then yes, you could. Aftermarket vs aftermarket, however, not so much.
 
They weren't shouting about the 780 either.
I seem to remember them (open to correction) talking about the 780 and the 700 series in general at Computex last year.

Even the best aftermarket coolers are still only getting the 290X down to around 70°C.

Still too hot in my opinion. My rig would melt through the floor if I were running one on air during summer down here.
My 780 runs in the mid 70s while I'm gaming; it's normally around 74-75°C that I'll see.

To be honest, the max temperature for most CPUs/GPUs is normally greater than 100°C, and even then there's always thermal shutdown/throttling built in, so even in the event they actually do get too hot, they're able to handle that scenario too.

Honestly, people get way too excited about the temperatures of hardware. The manufacturers have taken care of most of the problems already.
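
For what it's worth, the "able to handle that scenario" bit is conceptually just a feedback loop: when the reported temperature crosses a limit, the clock gets stepped down, and it climbs back up once things cool off. A toy sketch in Python with invented limits and step sizes (the real firmware is proprietary and far more sophisticated):

# Toy illustration of thermal throttling; all names and numbers are assumptions.
THROTTLE_TEMP_C = 95     # assumed thermal limit
BASE_CLOCK_MHZ = 900     # assumed normal clock
MIN_CLOCK_MHZ = 300      # assumed floor the clock can drop to

def next_clock(current_clock_mhz, gpu_temp_c):
    if gpu_temp_c >= THROTTLE_TEMP_C:
        # Too hot: step the clock down so the chip sheds heat.
        return max(MIN_CLOCK_MHZ, current_clock_mhz - 50)
    # Cool enough: step back up toward the normal clock.
    return min(BASE_CLOCK_MHZ, current_clock_mhz + 50)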
 
I seem to remember them (open to correction) talking about the 780 and the 700 series in general at Computex last year.


My 780 runs in the mid 70s while I'm gaming; it's normally around 74-75°C that I'll see.

To be honest, the max temperature for most CPUs/GPUs is normally greater than 100°C, and even then there's always thermal shutdown/throttling built in, so even in the event they actually do get too hot, they're able to handle that scenario too.

Honestly, people get way too excited about the temperatures of hardware. The manufacturers have taken care of most of the problems already.

Temperature affects efficiency though. A cooler GPU/CPU will (usually, not always) be able to run at a higher overclock with less voltage. The hotter it gets, the more voltage it needs, so cooler is better. Also, electromigration happens more easily at higher temps, so you can increase the life of your components by keeping them cooler: not only by running them at lower voltages, but also by slowing down electromigration.
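
To put a rough number on the electromigration part: the usual rule of thumb is Black's equation, where expected lifetime scales with exp(Ea / (k*T)), so if you compare the same chip at two temperatures the other terms cancel out. A back-of-the-envelope sketch in Python, with an assumed activation energy (illustrative ballpark only; it varies by metal and process):

import math

k = 8.617e-5    # Boltzmann constant, eV/K
Ea = 0.7        # assumed activation energy in eV (illustrative ballpark)

def mttf_ratio(cooler_c, hotter_c):
    # Ratio of expected electromigration lifetimes at two junction temperatures,
    # per Black's equation with the current-density term cancelled out.
    cooler_k = cooler_c + 273.15
    hotter_k = hotter_c + 273.15
    return math.exp((Ea / k) * (1.0 / cooler_k - 1.0 / hotter_k))

print(mttf_ratio(70, 90))   # roughly 3-4x longer expected lifetime at 70°C vs 90°C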
 
Temperature affects efficiency though. A cooler GPU/CPU will (usually, not always) be able to run at a higher overclock with less voltage. The hotter it gets, the more voltage it needs, so cooler is better. Also, electromigration happens more easily at higher temps, so you can increase the life of your components by keeping them cooler: not only by running them at lower voltages, but also by slowing down electromigration.

The temp issues have really been settled with some of the better aftermarket coolers. Not as great as aftermarket-cooled NVIDIA cards, but a tad better than NVIDIA reference-cooled.
This is a great example of why AMD marketing people should be fired and forced to live under a bridge. The reference-only release (I can't imagine any reason for it other than marketing) ruined the reputation of the card forever.
 
That looks like it might be legit; if that's the case you'll probably hear more from better sources soon enough.

I'm still skeptical though. I think it's unlikely whatever NVIDIA is going to release will trump the 780 Ti.

Temperature affects efficiency though. A cooler GPU/CPU will (usually, not always) be able to run at a higher overclock with less voltage. The hotter it gets, the more voltage it needs, so cooler is better. Also, electromigration happens more easily at higher temps, so you can increase the life of your components by keeping them cooler: not only by running them at lower voltages, but also by slowing down electromigration.
Sorry, but that's all wrong.

The power delivery circuitry provides a constant voltage irrespective of the temperature it's operating at and irrespective of what temperature the GPU is sitting at. It's designed to do just that; if the voltage were to increase with temperature then there would be a major problem.

Electromigration is definitely a problem in integrated circuits like GPUs, and a bigger problem at higher temperatures, but normally the manufacturer will design them so they can operate at the max temperature and not have a problem with electromigration.

"Cooler" might be better, or at least one might think it's better, but "cooler" isn't as important as some would like to think. ;)
 
That looks like it might be legit; if that's the case you'll probably hear more from better sources soon enough.

I'm still skeptical though. I think it's unlikely whatever NVIDIA is going to release will trump the 780 Ti.


Sorry, but that's all wrong.

The power delivery circuitry provides a constant voltage irrespective of the temperature it's operating at and irrespective of what temperature the GPU is sitting at. It's designed to do just that; if the voltage were to increase with temperature then there would be a major problem.

Electromigration is definitely a problem in integrated circuits like GPUs, and a bigger problem at higher temperatures, but normally the manufacturer will design them so they can operate at the max temperature and not have a problem with electromigration.

"Cooler" might be better, or at least one might think it's better, but "cooler" isn't as important as some would like to think. ;)

Maybe I worded it wrong (pretty sure I didn't).
The efficiency of a CPU can be correlated to the temperature it's running at in a number of situations.
As an example (just some random numbers here, not to be taken as actual occurrences, just an example):
you have an i7-3770K and with stock cooling you can run an OC of 4.5GHz at 1.35V;
you make a custom loop that drops the temps by 20°C;
you find that you can now achieve the same 4.5GHz OC at 1.25V.

So that's the efficiency thing I was on about.

I also stated that this is not always true in my above post, but it is true a lot of the time.
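
And just to show what that hypothetical 1.35V -> 1.25V drop would mean for power: dynamic power scales roughly with frequency times voltage squared, so plugging in the made-up numbers from above (same 4.5GHz clock, just a lower voltage):

# Back-of-the-envelope only, using the made-up numbers from the example above.
# Dynamic power scales roughly as frequency * voltage^2.
f_ghz = 4.5

def relative_dynamic_power(voltage):
    return f_ghz * voltage ** 2

print(relative_dynamic_power(1.25) / relative_dynamic_power(1.35))
# ~0.86, i.e. roughly 14% less dynamic power at the same clock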
 
Temperature affects efficiency though. A cooler GPU/CPU will (usually, not always) be able to run at a higher overclock with less voltage. The hotter it gets, the more voltage it needs, so cooler is better. Also, electromigration happens more easily at higher temps, so you can increase the life of your components by keeping them cooler: not only by running them at lower voltages, but also by slowing down electromigration.


The effect of GPU heat on power consumption is very minimal.
 
Maybe I worded it wrong (pretty sure I didn't).
The efficiency of a CPU can be correlated to the temperature it's running at in a number of situations.
As an example (just some random numbers here, not to be taken as actual occurrences, just an example):
you have an i7-3770K and with stock cooling you can run an OC of 4.5GHz at 1.35V;
you make a custom loop that drops the temps by 20°C;
you find that you can now achieve the same 4.5GHz OC at 1.25V.

So that's the efficiency thing I was on about.

I also stated that this is not always true in my above post, but it is true a lot of the time.

Um, have you ever overclocked before? Dropping the temps 20°C has no effect on voltage; it only affects... temps. Don't know what you're going on about here.
 
Maybe I worded it wrong (pretty sure I didn't).
The efficiency of a CPU can be correlated to the temperature it's running at in a number of situations.
As an example (just some random numbers here, not to be taken as actual occurrences, just an example):
you have an i7-3770K and with stock cooling you can run an OC of 4.5GHz at 1.35V;
you make a custom loop that drops the temps by 20°C;
you find that you can now achieve the same 4.5GHz OC at 1.25V.

So that's the efficiency thing I was on about.

I also stated that this is not always true in my above post, but it is true a lot of the time.
Umm, that does not sound right; there should be no difference in the volts needed.
The efficiency is mostly determined by the architecture and the manufacturing process.
 
Maybe I worded it wrong (pretty sure I didn't).
The efficiency of a CPU can be correlated to the temperature it's running at in a number of situations.
As an example (just some random numbers here, not to be taken as actual occurrences, just an example):
you have an i7-3770K and with stock cooling you can run an OC of 4.5GHz at 1.35V;
you make a custom loop that drops the temps by 20°C;
you find that you can now achieve the same 4.5GHz OC at 1.25V.

So that's the efficiency thing I was on about.

I also stated that this is not always true in my above post, but it is true a lot of the time.
Not really; temperature isn't going to affect stuff that way.

And if we're talking about temperatures for Intel's stuff, they can run up to 105°C without issue, or at least the Haswell ones can. Temperature really is a non-issue; if they go beyond that they throttle in order to cool down.

This is off-topic anyway; start a new thread if you want to discuss it more.
 