Fermi underpowered?

VonBlade

New member
SemiAccurate are reporting what we've all feared: not only is Fermi delayed, but it won't get anywhere near the anticipated performance levels.

So if anyone questioned the wisdom of our doubts about today's Tesla demonstration, half-eaten humble pies will be accepted at the usual address.

Read more here
 
5800 or FERMI

Hi Guys

Like Tom Logan, I am a fan of Red, so I opted to get the 5850 and, guys, am I impressed. Not sure what the green team are going to do about the ground lost to ATI, especially if Fermi's performance doesn't match.
 
I'm just going to wait and see, then consider my options, but hey, this always happens.

Intel brought out the monstrous 45nm CPUs... and look now... they've completely gone mental, bringing out i7, then i5, and looking at i3 iirc?

Whereas AMD have opted for a more subtle approach, doing more work on the AM3 socket (is that right or am I wrong?)

It's going to be interesting to see how this develops; everything fails at some point, and people generally learn from their mistakes...

Anyways, I'm sitting on the fence and saving my cash until I see some competition; no point betting on a one-sided fight now, is there? It would be interesting...
 
Working Fermi...

first_working_gf100.jpg


Link
 
I reckon there's a HD5970 on the mobo on the shelf above, and the card in the forefront is a GTX280 with a custom cooler sellotaped over it.

Joking aside, if that's an engineer, it bewilders me how some engineers 'work' in such a mess. Some engineers around here do too, which is why I put 'work' in quotes. I believe in a mess whilst ur doing stuff, but it looks obvious to me that at the last minute one mobo was thrown onto the top shelf and the newest one thrown into the space it left - and everything plugged in quick!
 
You really do crack me up Rast; everywhere is stating they are having problems, yet you still try and argue it. If it works it will be great, but even NVIDIA HAVE stated it's a massive project and they really needed longer to get it right.

I just hope this isn't going to be another massive recall/RMA issue, as it would cripple them.

They HAVE, and you know it, been trying to stay in the press with BS, as they know the red team are running away from them ATM.

Meh, I'll look forward to the GT380 landing on my doorstep anyway.
 
name='tinytomlogan' said:
You really do crack me up Rast; everywhere is stating they are having problems, yet you still try and argue it.

If u want to rely on made-up news, then u r ofc well entitled to go along with it.

BUT, rather than quote the likes of SemiAccurate (hence the name) as a source of information, I'd sooner ask my m8's dog what she thinks.

ALL the related articles - including, it saddens me to say, OC3D's - link back to those very articles. To take ALL these news references at face value, with nothing behind them other than "our multiple inside sources" - which inform u of 'facts' such as the "2% yield", which was a complete joke - makes u look foolish.

There was a point in time when OC3D would not allow quotes from certain sources within the news section. If that has changed, and OC3D does consider itself 'just another rumor monger', then fair enough. It doesn't sound like good news for the site from my pov, however.

Now, if u have a rumor section and not news, fair comment also.

The subject in itself doesn't mean much to me in this context. What irks me is that the News section isn't what it should be. Today I've seen the first official release date ever published, and yet we carry on with the delays theory with no substance.

All this ATI fanboyism, or anti-nVidiaism, doesn't look very professional.
 
I imagine the Green card will be worth the wait. While I wait I'll get a Red card!

Edit: When I did the news we never used Fud, Inq or anything similar. I think you should have a blacklist. Obviously I just posted Fud, but there was a picture of what appears to be a working card from NVIDIA's Facebook. Some of these rumours are legit.
 
Nearly every single part of the news is sourced from official companies. However, when a topic is hot on everyone's lips, it would be foolish not to report it.

At no point are we pretending this is coming from us, or that it's 100% true. It clearly states within the article where the source is, who is saying what, and leaves you to make your own mind up.

You are, however, about the only person on the planet who didn't expect to see the GT300 this year. It's not ATI fanboyism, nor anti-NVIDIA. We have gone to great lengths to mention that we would be freaking delighted if NVIDIA pull it out of the bag.

However, the facts are that the Fermi core has significantly missed its core-speed target. Those figures come from NVIDIA themselves, as linked to in the article.

The facts are that Fermi has been demonstrated twice in a manner that isn't above board.

The facts are that ATI have, available to buy, a DirectX 11 graphics card that is the fastest thing on the planet.

Quite how you can misread all this is beyond me.
 
name='VonBlade' said:
However, the facts are that the Fermi core has significantly missed its core-speed target. Those figures come from NVIDIA themselves, as linked to in the article.

Missed the official link for that. Read it again and missed it. Saw the SemiAccurate guestimate. It's also based on cross-architecture references that anyone who actually knew what they were talking about would recognise as being like comparing a 1GHz AMD CPU with a 1GHz Intel or IBM one.

name='VonBlade' said:
The facts are that ATI have, available to buy, a DirectX 11 graphics card that is the fastest thing on the planet.

Which is fine, and has what to do with Fermi? U can go on to say that they're hard to find, prices are jacked accordingly, and there's nothing DX11 to play till possibly Jan next year.

name='VonBlade' said:
The facts are that Fermi has been demonstrated twice in a manner that isn't above board.

To even go along with that is foolish within itself.
 
name='Rastalovich' said:
Missed the official link for that. Read it again and missed it. Saw the SemiAccurate guestimate.

The clue will be the part of the article that says "NVIDIA themselves have confirmed that the expected release date is in the first quarter of 2010. Read their press release here". And shock horror, if you read it you see where NVIDIA say it delivers "performance in the range of 520 GFlops - 630 GFlops".

It's amazing this reading lark. You read, you learn stuff, you don't come across looking like a nubbin.

name='Rastalovich' said:
Which is fine, and has what to do with Fermi ? U can go on to say that they're hard to find and the prices are jacked accordingly and there's nothing Dx11 to play till possibly Jan next year.

Are you seriously suggesting that the main competition for this exact graphics card has absolutely no bearing upon it? Really?
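For what it's worth, that quoted GFlops range can be turned into an implied core clock with a bit of arithmetic. A rough sketch only: it assumes the widely reported 512-core Fermi configuration, an FMA counted as 2 FLOPs per cycle, and double precision at half the single-precision rate - none of which the press release quote itself spells out.

```python
# Hedged sketch: back out the core clock implied by NVIDIA's quoted
# "520-630 GFlops" double-precision range for Fermi.
# Assumptions (mine, not from the press release): 512 CUDA cores,
# 2 FLOPs/cycle/core via FMA, DP throughput = half of SP.

CORES = 512
SP_FLOPS_PER_CYCLE = 2   # one fused multiply-add per core per cycle
DP_RATIO = 0.5           # double precision at half the SP rate

def implied_clock_ghz(dp_gflops: float) -> float:
    """Core clock (GHz) needed to hit a given DP GFlops figure."""
    return dp_gflops / (CORES * SP_FLOPS_PER_CYCLE * DP_RATIO)

for gflops in (520, 630):
    print(f"{gflops} DP GFlops -> ~{implied_clock_ghz(gflops):.2f} GHz core clock")
```

Under those assumptions the quoted range works out to a core clock somewhere around 1.0-1.2 GHz, which is the kind of figure both sides of this argument are reading different things into.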
 
This is never going to be resolved; Nvidia are having problems getting this out, nuff said.

But green is the right colour for you I think, Rast, in more ways than one. Let's leave it till the release of the 300s and let costs and benches settle the arguments.
 
name='VonBlade' said:
The clue will be the part of the article that says "NVIDIA themselves have confirmed that the expected release date is in the first quarter of 2010. Read their press release here". And shock horror, if you read it you see where NVIDIA say it delivers "performance in the range of 520 GFlops - 630 GFlops".

Which is, 1stly, the 1st date ever published anywhere from a bona fide official source. And 2ndly, the 520 to 630 - yeah, great - and they compare them to what?

Stating the facts as they are is one thing. Glorifying them by comparison to figures some1 is latching onto as official - extrapolated from architecture specs to end-result PCB - is something else.

name='VonBlade' said:
It's amazing this reading lark. You read, you learn stuff, you don't come across looking like a nubbin.

What's amazing is when people read things, from whatever source, and consider them fact without asking questions. That would definitely be both foolish and nubbin-esque.

name='tinytomlogan' said:
This is never going to be resolved; Nvidia are having problems getting this out, nuff said.

But green is the right colour for you I think, Rast, in more ways than one. Let's leave it till the release of the 300s and let costs and benches settle the arguments.

Which is where ur missing the point completely.

U feel much better saying "Yeah, Rast is Green, we can see that" - but my choice of cards, both now and in the future, has no relevance here or anywhere else. If I were to choose a GTX380 as my next card, u certainly won't see me shouting from the hills that it wipes the floor with everything else and blah blah blah, ur all silly not to have one.

What ur missing is - NEWS - FACTS. They aren't based on what some1 posted somewhere, having heard it from some1 else who linked to something that 'must be right because I've seen it linked to somewhere else'.

Green/red camps mean nothing to me, but if u feel better saying I'm green, then fine.
 
Ok, well the cat's out of the bag now, Rafa!!

Now I know why Liverpool have gone down the pan. You've been spending all your time on OC3D! :D:D

name='VonBlade' said:
However, the facts are that the Fermi core has significantly missed its core-speed target. Those figures come from NVIDIA themselves, as linked to in the article.

The facts are that Fermi has been demonstrated twice in a manner that isn't above board.

The facts are that ATI have, available to buy, a DirectX 11 graphics card that is the fastest thing on the planet.
 
Rasta is right: there are NO FACTS about Fermi, it's all just rumours from Charlie and his gang, who are notorious for twisting what few facts or rumours there are to make them appear true.

I don't see how the flops figures prove the core speeds aren't what was stated; the original figure they're referring to was only an estimate anyway.
 
I have to apologise to anybody who may think I'm running off into an anti-ATI, pro-nVidia tirade. I can see how and why it may look like that on this and other occasions, but the only FACT I have at hand is that people who make up news (about anything), even making a living off it, rub me up the wrong way.

To then see that 'news' linked to on a repetitive basis - so much so that it suddenly becomes fact in so many people's eyes, who both digest it and re-refer to it - only makes it worse.

nVidia's CEO noted at the release of the Q3'09 results that the 3 Fermi cards would have no bearing on those figures, as they would be released sometime in 2010. Which, afa I can see, got translated into "they're being delayed till Q1'10 cos they're having problems". Linked to ad nauseam, and ofc it's now fact.

Delays, and the creative news thereof, are repeatedly being put down not to nVidia having a fabrication problem (which they don't), but to the people actually doing the fabricating, TSMC, having problems making 40nm wafers. Which is a 'terrible thing for nVidia' and just a 'slight issue for AMD'. I don't buy this whatsoever. Mainly cos they're pumping out 28nm options for TI, already pushing 40nm for AMD cos... well, there are some releases already... and they've punched out 40nm stuff for nVidia's newest GT240 that came out last week (but hasn't been reported on in many places, nor on OC3D - and this is a physical release). Something else is going on, and I don't know what.

Game devs will actually have the GF100 cards, albeit in whatever state they're supplied in - package, drivers, whatever. It is my opinion (and not a FACT) that, as nVidia send teams out to work with these devs, the Jan/Feb release of many of these DX11 titles may well coincide with the release of the cards. Possibly even as a packaged offering, as so many of the previous nVidia flagships have done. They have a dedicated department, operating at a huge loss, that goes out into the field for more than just squeezing in the "Way it's meant to be played" logo.

It's also my partial belief (and not fact) that in today's society, where cash rules everything, even if a supposed GTX380 could spank all living cards by massive %ages, they would curtail it to just beat the present champion by 10-20% at most. Possibly leaving the die to then be pushed a further few %ages as the competition's next champion comes out. It's a sad thing to come to terms with, but cash > tech these days. Ofc the professional productivity cards wouldn't reflect it, as they are 1000s of pounds more expensive and carry different driver sets - never to see gaming at all - and what competition do they have anyway? It's a niche market.

AMD>nVidia or nVidia>AMD - I couldn't care less. I use both for whatever, and I happen to have previously had a GTX280, b4 that a cheap 8800GT, and b4 that a 7600GT iirc. Without both of the next-generation cards on the table, I consider the argument every1 seems to have of 1 vs the other irrelevant. I'd certainly not choose myself, as I see it as a big investment - I prefer these days to buy the top-end card rather than upd8 on a regular basis. Fanboyism drives u nuts: if u like PhysX apps, then there's something wrong with u, and if u want a rig that will run CUDA apps as well as game, then ur an idiot for some reason. And god help u if u don't swallow made-up stories - ur suddenly anti whatever camp, or a fanboi of whatever the made-up story is about.

Please, when u see a piece of 'news', ask urself and the article questions as u read it. "A source" or "our inside sources" don't mean a damn thing. SemiAccurate, by name alone, should tell u something. Just as the 'newspapers' link a football player with EVERY football team that's in the market to buy, with no real idea of who is actually talking to whom. They mention every team possible, and ofc they'll get 1 in 100 correct - but don't take that as them knowing what the eff they're talking about from that point onwards.

And on that footballing note: Bottom of Rafa's prepared speech, it says point.10 - Aquilani :p (dunno if that actually meant 10 points until he's fit)

EDIT: Long post - bite me !
 
Tbh, it quite clearly states that the 520-630 value is for the single-GPU version, and the dual-GPU version will crank out around four times that.

I'll just wait and see what happens; it's not as if people are bringing out games more hardware-intensive than Crysis. If I can run that at a decent FPS, WTF does it matter? Just wait and find out like everyone else.
 