Yep, 28nm. We won't be seeing 20nm until the full-fat Maxwell cards, probably!
There won't be any 20nm cards. At all. NVIDIA will likely be skipping it altogether after TSMC said they couldn't guarantee high enough yields at the specifications that NVIDIA were demanding.
What is likely to happen is that the Maxwell refresh will step down from the current 28nm to 16nm FinFET, a move that TSMC claims is likely to lead to the lower production cost that NVIDIA want, as well as much higher yields.
Much lower costs? If that is officially made public by TSMC, then Nvidia are going to have to start lowering prices to more "sane" levels; otherwise people are going to start voting with their wallets. I hope so, anyway.
Much lower compared to 20nm, yes. That is almost entirely because yields for 20nm have so far apparently been pretty awful. With 16nm FinFET the yields will be higher, meaning less wastage, which in turn means less cost.
NVIDIA were quite clear that they want both high compute ability from the new chips and lower power requirements. Both TSMC and Global Foundries said that they could do one or the other, but not both, at least not in yield numbers that would make it even remotely profitable.
So yes, the 16nm will be significantly cheaper than moving to 20nm, and eventually it should prove to be significantly cheaper than the current 28nm once the manufacturing process matures.
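To put some rough numbers on the yield point (every figure below is invented, just to show the trend; real wafer prices and yields aren't public):

```python
# Rough sketch of why yield drives per-die cost. Every number here is
# made up for illustration; actual wafer costs and yields are not public.

def cost_per_good_die(wafer_cost, dies_per_wafer, yield_rate):
    """Spread the wafer cost over only the dies that actually work."""
    return wafer_cost / (dies_per_wafer * yield_rate)

# Same hypothetical $5,000 wafer with 200 die candidates on it:
poor_yield = cost_per_good_die(5000, 200, 0.30)   # awful 20nm-style yield
good_yield = cost_per_good_die(5000, 200, 0.70)   # healthier mature-node yield

print(f"30% yield: ${poor_yield:.2f} per good die")   # ~$83.33
print(f"70% yield: ${good_yield:.2f} per good die")   # ~$35.71
```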
Man, some of you are way too optimistic. If it's faster than the 780Ti, it may well cost $800. This is Nvidia. My guess is it replaces the vanilla 780 and beats it just enough to "justify" a $100 premium over the R9 290.
Even the best aftermarket coolers are still only getting the 290X down to around 70°C.
Still too hot in my opinion. My rig would melt through the floor if I were running one on air during summer down here.
The R9 290X may be cheaper, but realistically the bastard needs to be put under water so it doesn't cook the surrounding components in your rig, lol. So there is extra cost involved there that you don't necessarily have to spend with an Nvidia.
I seem to remember them (open to correction) talking about the 780 and the 700 series in general at Computex last year. They weren't shouting about the 780 either.
My 780 runs in the mid-70s while I'm gaming; it's normally around 74/75°C that I'll see.
To be honest, the max temperature for most CPUs/GPUs is normally greater than 100°C, and even then there's always thermal shutdown/throttling built into them, so in the event they actually do get too hot, they can handle that scenario too.
Honestly, people get way too excited about the temperatures of hardware. The manufacturers have taken care of most of the problems already.
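The throttling behaviour is conceptually something like this toy sketch (not any vendor's actual firmware; the thresholds and step sizes are invented):

```python
# Toy model of thermal throttling -- not a real driver, just the idea:
# back the clock off as temperature approaches the design limits.

THROTTLE_TEMP_C = 95    # start reducing clocks here (invented threshold)
SHUTDOWN_TEMP_C = 105   # hard cutoff (invented threshold)

def next_clock_mhz(current_clock, temp_c, floor_clock=500, step=13):
    """Pick the clock for the next interval from the current temperature."""
    if temp_c >= SHUTDOWN_TEMP_C:
        return 0                                        # thermal shutdown
    if temp_c >= THROTTLE_TEMP_C:
        return max(floor_clock, current_clock - step)   # throttle down
    return current_clock                                # within limits

print(next_clock_mhz(1100, 80))    # 1100 -- fine, no throttling
print(next_clock_mhz(1100, 97))    # 1087 -- easing the clock down
print(next_clock_mhz(1100, 106))   # 0    -- emergency shutdown
```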
Temperature affects efficiency, though. A cooler GPU/CPU will (usually, not always) be able to run at a higher overclock with less voltage; the hotter it gets, the more voltage it uses, so cooler is better. Electromigration also happens more easily at higher temps, so you can increase the life of your components by keeping them cooler: not only by running them at lower voltages, but also by slowing down electromigration.
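To put a rough figure on the electromigration point, lifetime is commonly modelled with Black's equation; a quick sketch (the activation energy below is an assumed ballpark, and the real constants are process-specific) shows how strongly temperature matters:

```python
import math

# Electromigration lifetime trend via Black's equation:
#   MTTF = A * J**(-n) * exp(Ea / (k * T))
# A, n and Ea vary by process; Ea below is an assumed ballpark value.

K_BOLTZMANN_EV = 8.617e-5   # Boltzmann constant in eV/K
EA_EV = 0.7                 # assumed activation energy in eV

def relative_mttf(temp_cool_c, temp_hot_c):
    """How many times longer the cooler chip lasts (same current density,
    so the A and J terms cancel out of the ratio)."""
    t_cool = temp_cool_c + 273.15
    t_hot = temp_hot_c + 273.15
    return math.exp(EA_EV / K_BOLTZMANN_EV * (1 / t_cool - 1 / t_hot))

# Running a GPU at 70°C instead of 90°C:
print(f"~{relative_mttf(70, 90):.1f}x longer expected lifetime")  # ~3.7x
```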
This is a great example of why AMD marketing people should be fired and forced to live under a bridge.
That looks like it might be legit; if that's the case, you'll probably hear more from better sources soon enough.
I'm still skeptical, though; I think it's unlikely whatever Nvidia is going to release will trump the 780Ti.
Sorry, but that's all wrong.
The power delivery circuitry provides a constant voltage irrespective of the temperature it's operating at, and irrespective of what temperature the GPU is sitting at. It's designed to do just that; if the voltage were to increase with temperature, there'd be a major problem.
Electromigration is definitely a problem in integrated circuits like GPUs, and a bigger problem at higher temperatures, but normally the manufacturer will design them so they can operate at the max temperature without electromigration becoming an issue.
"Cooler" might be better, or at least one might think it's better, but "cooler" isn't as important as some would like to think.![]()
Maybe I worded it wrong (pretty sure I didn't).
The efficiency of a CPU can be correlated with the temperature it's running at in a number of situations.
As an example (just some random numbers here, not to be taken as actual occurrences):
You have an i7-3770K, and with stock cooling you can run an OC of 4.5GHz at 1.35V.
You make a custom loop that drops the temps by 20°C.
You find that you can now achieve the same 4.5GHz OC at 1.25V.
So that's the efficiency thing I was on about.
I also stated that this is not always true in my above post, but it is true a lot of the time.
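To show what that buys you in power terms: dynamic power scales roughly with C·V²·f, so holding the clock constant and dropping the voltage cuts power by the square of the voltage ratio (using the hypothetical figures from the example above):

```python
# Dynamic power scales roughly as C * V**2 * f, so at the same 4.5GHz
# clock, the voltage drop alone sets the power ratio. The 1.35V and
# 1.25V figures are the hypothetical numbers from the example above.

def dynamic_power_ratio(v_new, v_old):
    """Relative dynamic power at the same frequency and capacitance."""
    return (v_new / v_old) ** 2

ratio = dynamic_power_ratio(1.25, 1.35)
print(f"4.5GHz at 1.25V vs 1.35V: {ratio:.1%} of the power")        # ~85.7%
print(f"That's roughly a {1 - ratio:.0%} drop in dynamic power")    # ~14%
```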
Umm, that does not sound right. There should be no difference in the volts needed.
Not really, temperature isn't going to affect stuff that way.