Nvidia's Fermi Core Revealed

I'm personally setting my expectations on a price tag similar to what the 8800 GTX/Ultra 768MB cost when they first came out. What were they? £400/£500?

It sounded extortionate at the time, but reflecting on the number of people who bought them back then and have sensibly seen no reason to upgrade since (for the sake of tiny percentage performance increases you don't notice in-game), they will have spent less cash than people who buy a new card every time one is released that does 1% more in Crysis.

Expecting a GTX380 to be ~£500 (an opinion, not a fact) means I don't think there will be a nasty surprise when it comes out. The only reason I think the price is going to be that high is that these are graphics cards we've not had even a similar concept of before: the first CPU-like GPU. It reminds me of the jump from the 7 series to the 8 series.
 
Is there going to be a dual card?

I don't remember an 8950GT card. What I'm thinking is that, with the straight 280/285 being too much of an issue to make a 295 out of, if the 'single' GTX380 is that good they may not bother.
 
name='Rastalovich' said:
Is there going to be a dual card?

I don't remember an 8950GT card. What I'm thinking is that, with the straight 280/285 being too much of an issue to make a 295 out of, if the 'single' GTX380 is that good they may not bother.

The only thing comparable was the 9800GX2, Rast!
 
name='Rastalovich' said:
Is there going to be a dual card?

I don't remember an 8950GT card. What I'm thinking is that, with the straight 280/285 being too much of an issue to make a 295 out of, if the 'single' GTX380 is that good they may not bother.

There will be a GX2 variant of the GT300 cards, yeah.
 
Yup yup, for shizzle. It's already been confirmed that there will be a 395, or GX2, or whatever you want to call it, variant of this generation.

Likely to be a 380 with a couple of clusters disabled to get within a semi-acceptable TDP.

name='Rastalovich' said:
Is there going to be a dual card?

I don't remember an 8950GT card. What I'm thinking is that, with the straight 280/285 being too much of an issue to make a 295 out of, if the 'single' GTX380 is that good they may not bother.
 
name='Bungral' said:
Yup yup, for shizzle. It's already been confirmed that there will be a 395, or GX2, or whatever you want to call it, variant of this generation.

Likely to be a 380 with a couple of clusters disabled to get within a semi-acceptable TDP.

Might have been a fake though?
 
When are they due for release? I mean the whole range. I think I'd be getting the version below the top one, so like the 5870 but Nvidia.
 
Official word still just says, and always has said, Q1 '10, which could be anything up to the end of March. The only driver signs I've personally seen are for the GTX380 and 360, plus some laptop shenanigans, so I wouldn't expect more than that.
 
Yes, it looks like a late March launch with a near top-end class product (a 448 SP model, if Charlie has any speck of credibility left in him).
 
name='Sihastru' said:
Yes, it looks like a late March launch with a near top-end class product (a 448 SP model, if Charlie has any speck of credibility left in him).

Recycled info that was already on the net. I'd neither credit it nor dismiss it, but I'd certainly not cite that particular name as a source.
 
A dual GPU will not be possible if we take into account the published TDP for the Fermi-based Teslas. TSMC needs a little time to work a bit of its magic into the 40nm process before a dual-GPU Fermi becomes possible.
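
Quick back-of-the-envelope Python sketch of why (assuming the ~225W TDP figure published for the Fermi-based Teslas and the 300W PCIe board power ceiling; both are commonly quoted figures, not official GeForce specs):

tesla_tdp_w = 225      # assumed per-GPU TDP, per the published Tesla figure
pcie_limit_w = 300     # PCIe board ceiling: 75W slot + 75W 6-pin + 150W 8-pin

naive_dual_w = 2 * tesla_tdp_w                  # 450W, well over the ceiling
cut_needed = 1 - pcie_limit_w / naive_dual_w    # ~33% power cut per GPU
print(naive_dual_w, round(cut_needed * 100))    # prints: 450 33

A cut of roughly a third per GPU, via disabled clusters and lower clocks, is what it would take just to reach the connector limit, which is why a straight doubling won't fly on 40nm as it stands.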

Also, they might not need it. If the single-GPU top card performs close to the 5970 (I know the 5970 has a small reserve of clock speed), a dual-GPU card is going to be a hard sell. Especially considering all the developers fleeing towards consoles, meaning no games will require that much GPU processing anymore, of course.
 
name='Rastalovich' said:
Recycled info that was already on the net. I'd neither credit it nor dismiss it, but I'd certainly not cite that particular name as a source.

Yes, I agree, since 64 shaders (448 enabled out of the full 512, i.e. a 12.5% cut) will not make that much of a difference power-consumption-wise when you consider the size of the chip. The rumored reason might be incorrect, but maybe they have another reason for it. Maybe the card is so good they don't need all the shaders enabled to beat AMD's top dog.
 
name='Sihastru' said:
A dual GPU will not be possible if we take into account the published TDP for the Fermi-based Teslas. TSMC needs a little time to work a bit of its magic into the 40nm process before a dual-GPU Fermi becomes possible.

The usual trend since '08 has been to base the dual-GPU card on the GPU one step below the very top one, laced into an interface and PCB that's good enough for the top card. However, Fermi is a new architecture, so none of this may apply; we simply don't know.

As for TSMC, all the news published within the tech community is based on nothing, and has been fed back by TSMC representatives as nonsense (mostly sourced from TI & SA or SI). In the financial sector they have presented nothing but good news: news about the 28nm process and the massive cash they're making.

name='Sihastru' said:
Also, they might not need it. If the single-GPU top card performs close to the 5970 (I know the 5970 has a small reserve of clock speed), a dual-GPU card is going to be a hard sell. Especially considering all the developers fleeing towards consoles, meaning no games will require that much GPU processing anymore, of course.

It's arguable that the majority of gaming needs nothing more than a 4g GTX285. But development moves on, and the card manufacturers, or one of them at least, have been working with these devs, which will bring us a number of DX11 releases. Knowing how poorly games are both written and optimised, I can see why they want to go faster and faster.

The dual GPUs are meant to offer that much more than gaming alone, too. Details of these kinds of things are also still rumours and promises.

Fermi is based on an architecture that, to this point, hasn't existed. It can't be compared to any existing GPU situation other than by end result. GPU power, in gaming terms, probably needs to keep getting faster to compensate for poorly written games and game engines.

I would say this though: the emphasis in 2010 is probably going to be more around quality, as opposed to playing CoD at 150fps rather than 130fps, which has been a nonsense for a few years now.
 