Nvidia prepares to shrink G80 to make a G90

Mr. Popo

Seeing the shrink

NVIDIA still has to launch a few G8x cards and is mulling an upgrade to its high-end offerings.

The firm would like to double up the G80 into a GX2 dual-chip version, if it can conquer the heat and power-management problems that poses. After that it will be time for the G9x flavour, codenamed G90, our senior industry sources tell us.

Those same sources say Nvidia needs a more cost-efficient way of producing a G9x high-end chip based on the G80 marchitecture. Chances are that Nvidia is working with TSMC to produce the chips on a 65 nanometer process. The resulting shrunken chips would run at much higher frequencies and be paired with GDDR4, but we still have to confirm these details.

There are two challenges in going down to 80 or 65 nanometers from the current 90. An optical shrink from 90 to 80 nanometers doesn't allow much room for clock speed increases: you cannot go much higher than 700+ MHz, depending on the chip marchitecture.

Going down to 65 nanometers is not an optical shrink and presents a major engineering challenge. That's why it took AMD ages to go from 90 nanometer CPUs to 65. And graphics chips are even more challenging to shrink. Nvidia will go 65 nanometer with the G78, and if this goes well it will probably try to move all its chips to 65nm in order to make more money per wafer.
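As a back-of-the-envelope illustration of where "more money per wafer" comes from: in an ideal shrink, die area scales with the square of the feature-size ratio, so smaller dies mean more chips per wafer. A minimal sketch (the ~480 mm² G80 figure is an approximation, and real shrinks never scale this perfectly):

```python
def scaled_area(area_mm2, old_nm, new_nm):
    """Die area after an idealized shrink from old_nm to new_nm.

    Assumes perfect linear scaling; ignores yield, wafer edge loss
    and any design changes, so treat the numbers as rough bounds.
    """
    return area_mm2 * (new_nm / old_nm) ** 2

# G80 is roughly a 480 mm^2 die on the 90nm process.
g80 = 480.0
print(round(scaled_area(g80, 90, 80)))  # ~379 mm^2 at 80nm
print(round(scaled_area(g80, 90, 65)))  # ~250 mm^2 at 65nm
```

At 65nm the ideal die is roughly half the 90nm area, which is why a successful 65nm move would nearly double the candidate dies per wafer.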

If Nvidia makes the G80 smaller there may also be just enough room for a 512-bit memory controller. If all goes as planned, the chip is scheduled for the second half of 2007. We will keep our ears open; Nvidia never comments on forward-looking stories.

Source
 
OK it's early and I'm tired so bear with me...

Currently the 8800s are on a 90nm process (G80).

The rumour on the 8900s is that they use the G80 on an 80nm process.

Now there is a new G90 in the pipeline on an 80/65nm process...

Does this mean there will be no 80nm G80? That the new 8900s will be held back until this G90 is sorted?
 
I think what he's saying is that they are currently testing the G78s at 65nm to see if it works. If it does and they can accomplish it, they want to change it all over to GDDR4 and 65nm. If so, that would be freakin pwnage and R600 doesn't stand a snowball's chance! *thinks to self* mmmm, 65nm 8600 Ultra with GDDR4 at $200.
 
name='PP Mguire' said:
If so, that would be freakin pwnage and R600 doesn't stand a snowball's chance!

Every time someone says something like this it makes me lol.

No, really... AMD and ATI will produce some nice parts; you just have to wait a bit. Last gen nVidia had the crown till the X1800... then the X1900 came and stole the shine.
 
That's because ATI were pansies and waited to see what Nvidia had to offer, lol. Still though, the 7950GX2, can't really compare to it. I see AMD has stolen that idea as well.
 
name='PP Mguire' said:
That's because ATI were pansies and waited to see what Nvidia had to offer, lol. Still though, the 7950GX2, can't really compare to it. I see AMD has stolen that idea as well.

The 7950GX2 was a poor attempt at regaining the top end from ATI. It's a hashed-together card imo, with worse image quality than the ATI equivalent.

ATI/AMD are looking to do real dual-gpu cards...which will be nice :)
 
I dunno about you, but I've seen lots of dual-GPU cards already that weren't worth crap. *pardon me, no sleep, I'm cranky. Will go to bed soon, promise* For instance, the Gigabyte dual 6800GT. Wtf was that all about? The Voodoo5 5500 (ahead of its time) kinda pwned, but still not really much of anything. You really think they will make a "real" dual-GPU card this time?

K goin to bed. :sleep:
 
name='Kempez' said:
..

ATI/AMD are looking to do real dual-gpu cards...which will be nice :)

I personally think these guys are going to come up with something extra special in the not-too-distant future. Something that may incorporate a mobo advantage.

Just a matter of time imo.

Aren't the likes of the 7950, 8950, X950 and such just sorta stop-gap, expensive, 'chuck in more memory, clock it, push the generation as far as we can go at all costs' cards before the next gen is a reality?
 