Nvidia Are Confident About GT300

And a 1.7% yield means what exactly?

I just hope that they've learnt from the spanking ATI gave them with the 4xx0 series that price is a huge factor. No one really cares about CUDA and all that guff. Just give us insane performance, including great AA performance, and not at the typical Nvidia price point.

After all, if it comes in at 400 or even 350, it better be a massive improvement over a GTX295 or 4870X2 to justify the price.

Hmmm.
 
It's nice to see them not rebranding the lesser-performing parts, but I still think I'ma wait for a shiny new 5890 :yumyum:
 
I'm yet to see any spanking, to be fair. The rebranding has no real negative effect outside of irrational scaremongering (often from the TheInquirer camp).

It still remains that if u want the quickest, including the quality, u'll get an nVidia.

If ur not bothered with the quality, and crave fps only, u can get an AMD.

Budgets permitting ofc.

The yield is how successful a return u get from a batch. If u like, if u burnt 100 CDs, only 1.7% of them would work.

I find this pretty hard to believe - except that I'm told that 1.7% isn't that far from the norm. As implied, if u make a batch of X4 CPUs, anything up to 5% of the batch will pass the test. What u can do is take the ones that fall outside that 5%, retest them under lesser conditions, and get another small % u can use as a lesser CPU, like an X3 or X2.

The criteria for passing these tests are silly-strict tho.
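If a sketch helps, here's roughly what that binning idea looks like in code. The bin names, the thresholds, and the random "quality" score are all made up purely for illustration - real test criteria are nothing like this simple:

```python
import random

# Made-up bins and pass thresholds, purely to illustrate the idea.
BINS = [
    ("X4 full spec", 0.95),
    ("X3 salvage",   0.80),
    ("X2 salvage",   0.60),
]

def bin_chip(quality):
    """Return the best bin this chip qualifies for, or None if it's scrap."""
    for name, threshold in BINS:
        if quality >= threshold:
            return name
    return None

# Pretend each chip comes off the line with a random "quality" score.
random.seed(0)
batch = [random.random() for _ in range(416)]

counts = {}
for q in batch:
    label = bin_chip(q) or "scrap"
    counts[label] = counts.get(label, 0) + 1

print(counts)  # most chips miss full spec, but plenty get rescued as lesser parts
```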
 
Edit: Damn, Rasta beat me to it! I've gotta stop writing magazine articles...

name='VonBlade' said:
And a 1.7% yield means what exactly?

I just hope that they've learnt from the spanking ATI gave them with the 4xx0 series that price is a huge factor. No one really cares about CUDA and all that guff. Just give us insane performance, including great AA performance, and not at the typical Nvidia price point.

After all, if it comes in at 400 or even 350, it better be a massive improvement over a GTX295 or 4870X2 to justify the price.

Hmmm.

VB, I'm not sure if that question was sarcastic or not, but I'll answer it anyway just in case anyone else doesn't understand chip yields. The yield is basically the percentage of chips on a wafer that meet all of their specifications.

The more complicated a design is, the less margin there is for error or imperfections in the manufacturing process. Last time I checked (it's been a long time, so I'm sorry if my figures are a bit off), having TSMC manufacture a 30cm-wide wafer (a round disc of silicon) of 40nm GPUs for nVidia or ATI costs around $300. The size of the GPU dictates how many can fit onto one wafer; if only 1.7% of those chips actually work to their full specifications, then nVidia will only get a few high-end GPUs per wafer. This really drives GPU prices up!
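To put some very rough numbers on that, here's a back-of-the-envelope sketch. The die size is my own guess and the wafer cost is just the figure quoted above, so treat the output as illustrative only:

```python
import math

# All inputs are illustrative guesses, not real nVidia/TSMC figures.
wafer_cost = 300.0         # USD per wafer (the figure quoted above, which may be off)
wafer_diameter_mm = 300.0  # a "30cm wide" wafer
die_area_mm2 = 500.0       # assumed size of a big high-end GPU die
yield_rate = 0.017         # the rumoured 1.7% yield

# Crude gross-dies-per-wafer estimate (ignores edge losses and scribe lines).
wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = int(wafer_area_mm2 / die_area_mm2)

good_dies = gross_dies * yield_rate
cost_per_good_die = wafer_cost / good_dies

print(f"Gross dies per wafer:  {gross_dies}")            # roughly 141
print(f"Fully working dies:    {good_dies:.1f}")          # roughly 2.4
print(f"Cost per working die:  ${cost_per_good_die:.2f}") # roughly $125
```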

Thankfully, a lot of the chips that don't fully meet the specifications can be reconfigured as lower-specced parts by disabling various parts of the core. This is how we got the GTX260, and looking back, all of the chipmakers have been doing this as standard practice to recoup high-end losses in the more competitive mid-range sector. Look at AMD's Phenom X3 and the Radeon 4850/4830 as perfect examples of this.

The manufacturing process is constantly being tweaked and improved while the parts are in production. This is why, later in a chip's life, the yields improve, quality (overclocking headroom) goes up, and prices come down. It's also how nVidia managed to rename the 8800 core 3 or 4 times, although that was taking it a little bit too far!

I hope this answer was comprehensive enough. These cards are going to cost far too much; new technology always does, but hopefully now you have an explanation as to some of the reasons why.
 
As Rasta said, the yield is how many out of a certain "batch" work as intended.

7 out of a batch of 416 worked to the standard needed for them to be put into retail units.
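(For what it's worth, 7 out of 416 works out to roughly 1.7%, which is where that figure comes from.)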
 
Many thanks, guys. It wasn't a sarcastic question; I couldn't believe that only 1.7% actually worked as intended, so whilst I guessed that might be the answer, it just made no sense.

Hell's teeth, that's an abysmal success rate. Just insane.

The cynic in me thinks that Nvidia are only announcing that information to excuse the obscene price it's gonna come in at.
 
To be fair, I doubt it. nVidia hardly talk to any1. It would be something if OC3D could get on, or are already on, their press release thingy (if they even have one).

Or, probably to be more realistic about it, they don't talk to many of the usual suspects - who frankly feed off one source who has/had his own personal chip on his shoulder, apparently cos they never invited him to some *event*.

U can see how poorly nVidia are represented on the 'newsy' dedicated sites, as every single move they make is viewed as an epic failure and a mockery of the community. Follow the sources and links and u generally end up at TheInquirer - the joke of a site that it is.

"nVidia now cut the grass around their dev building 6 times a month! (according to our sources on the inside that no1 else has) - every other manufacturer does it at the most 4 times - maybe this is a reason why their graphic cards are more expensive!"

.. it's about getting to that stage.

On the subject of cuts from batches passing - the criteria are generally so strict that u'd consider them ridiculous. It's typically a lab-coats job, and all chip manufacturers go through it. Whether the nVidia 1.7% is high - who knows. If the 'newsy' people had any credentials, they'd post TI's, AMD's, Motorola's, even Intel's figures - it's a bit of a non-comparison as they don't all cut the same chip, but it'd give some perspective. Each will have a different level for different purposes. It also depends on what ur expectations of the process are. If u expect 1.5%, then ur doing well obviously. If u expect 2%, maybe not.

U do realize ofc that if nVidia were using a few % beyond the 1.7% for something, there would be a MASSIVE outcry, almost to the point of foaming at the mouth, saying they're issuing sub-standard products, blah blah etc.

We should be aware that an X3 AMD is going to be a failed X4 - where are the people crying into their keyboards about that?

Absolute joke.
 
name='PeterStoba' said:
7 out of a batch of 416 worked to the standard needed for them to be put into retail units.

Not quite - this is just the first spin of silicon.

In this instance, meeting the criteria of "working" just meant having no process-induced errors and the silicon behaving as the engineers expected. These chips are still far from being bug-free.

It's still going to be a while before we see any retail-quality parts.
 
name='Pyr0' said:
Not quite - this is just the first spin of silicon.

In this instance, meeting the criteria of "working" just meant having no process-induced errors and the silicon behaving as the engineers expected. These chips are still far from being bug-free.

It's still going to be a while before we see any retail-quality parts.

Really? So the initial numbers could be even lower? :o
 
From Guru3d:

Recently a rumor on the web popped up that NVIDIA's yields on the new GT300 chip are lower than 2%

GT300 is the new high-end DX11 class GPU that will empower their upcoming graphics cards.

To put a leash on the rumor, a senior manager from NVIDIA is now saying that "Our (NVIDIA's) 40nm yields are fine. The rumors you are hearing are baseless."

The product manager unfortunately did not reveal what the GT300 yields really are.

NVIDIA also questioned DX11, which only reinforces rumors about their delayed chipset, because trust us when we say: if NVIDIA had its DX11 parts out, it would be marketing DX11 heavily.

Apparently the GT300 taped out in early September; add six to eight weeks to that and a launch could come as soon as late November or early December.

Ah well, we'll wait and see, hopefully NVIDIA can surprise us all real soon.
 