OC3D Review: XFX 280GTX XXX Edition

Do two 4870X2s outperform a pair of 280s in SLI? I heard they scaled poorly in CrossFire. Is the cooling on this XXX the same as the standard card? I had very little difficulty getting higher clocks, I was just worried about the heat (two cards = viable alternative to a heater).

I wonder when ForceWare 180 comes out, and whether the cards will be better optimised then?
 
Get your quotes on!

name='webbo' said:
No offence intended m8 but did you actually read the review or just skim over it? The review was aimed more towards PhysX, something I don't feel has had enough exposure. /snip

No offence taken. I appreciate you put a lot of work into the review, and it was an interesting spin to look at PhysX, but I remember the big fuss two years ago about the PhysX card on this site, then for it to be revisited a year later, then for Nvidia to take it on and add it to the 9800 series, and now to see it again... It just isn't innovative if it has been done before on Nvidia GPUs, and there isn't anything innovative about this particular 280, which was my point. I don't want any animosity with a fellow :oc3d: member, so in future I'll explain my points fully... :)

name='Rastalovich' said:
I think what Mr. Smith may be getting at with the PhysX aspect is... well... the 9800GTX will do PhysX too, and so on.

The percentage of "good" games that come out which actually utilise PhysX isn't that big. /snip

Bingo.

name='webbo' said:
If there is one thing NVidia are good at, it's marketing. NVidia are undeniably bigger than ATI and so will no doubt push PhysX onto game developers. I honestly don't expect PhysX to fade like it did with AGEIA.

I'd like that to be true, but my gut feeling is: if PhysX hasn't got off the ground after two years, will it ever?

name='Kempez' said:
I'm with Mr. Smith on this one.

THIS I agree with; the review was excellent and it's great to see something different. Nicely done.

BUT the actual product is no more innovative than anything else currently on the market - where's the XFX 280GTX XXX Edition's unique innovative feature above all the rest of the GPUs out there?

QFT and I did say it was a decent review in the first place!

name='PV5150' said:
Nice review mate and I do appreciate the PhysX angle, however I have to agree with what has been said previously about the 'Innovation Award'. If you give it to this card, then you really ought to award it to all GTX 280s in order to be fair across the board.

QFT.

It was an interesting read and it has sparked debate and interest, so I'd say the review was a hit.
 
I thought all cards had the same cooling solution, just wanted to make sure before pushing the clocks sky high and suddenly smelling burning silicon.
 
Ok, fair one - maybe this card alone does not deserve the innovation award. But somewhere along the line NVidia should have been commended for incorporating PhysX into their cards, no? Surely it would be unfair not to give any of the cards the award, much the same as it wouldn't be right to give them all the award, as by then the 'innovation' would no longer be innovative. As you all rightly say, the 280 is at the end of a long line of capable cards, so really the award should have gone to the very first NV card we reviewed that was capable of PhysX.

Sadly, this is the first time PhysX has been explored on a GPU at OC3D, it having been missed in previous reviews, and as such it is the first time we have really become aware of the innovation, so it only seems right that this is where the innovation award is given. It is no good burying our heads in the sand and denying that it is innovative just because it has been done before. It may well have been done before, but when did it get recognised? When was the award handed out previously?

Sorry guys, but the innovation award, albeit delayed in being handed out, is here to stay.

They don't still stone people for handing out awards, do they?
 
I think it's a good card, but it doesn't add anything new, because even a vanilla card can be OCed. It is a great review for exploring PhysX though, something I hope we see more of in the future.

With ForceWare 180, will we be able to use any 8, 9 or 200 series card for physics processing? Or does it have to be SLIed? Because an Albatron 8600GT PCI card looks good for that.
 
name='Mr. Smith' said:

LOL

name='Diablo' said:
I think it's a good card, but it doesn't add anything new, because even a vanilla card can be OCed. It is a great review for exploring PhysX though, something I hope we see more of in the future.

With ForceWare 180, will we be able to use any 8, 9 or 200 series card for physics processing? Or does it have to be SLIed? Because an Albatron 8600GT PCI card looks good for that.

The 'Big Bang 2' drivers are said to include the following:

* Multimonitor support for SLI

* DisplayPort support

* OpenGL 3.0

* Hardware video transcoding

* GPU PhysX support

* Performance optimizations

Whether the PhysX support is aimed at using, say, a low-end card for PhysX only and a high-end card for the other GPU calculations I couldn't say, but it would certainly be a neat way of utilising your old NV GPUs.

For those who want to give PhysX a try, here is the latest driver:

http://www.nvidia.com/object/physx_8.09.04_whql.html
 
I'm in the rather annoying position of having a really nice sound card and not wanting to get rid of that in order to stick a third x16 card in. Hence the pure power of a PCI bus card (try saying that while eating)
 
name='webbo' said:
But somewhere along the line NVidia should have been commended for incorporating PhysX into their cards, no?

Is the addition of PhysX not just a driver/software update? They did not own Ageia when the 8 series was being made, yet it is now PhysX capable.

ED
 
My mentality: can something else be put in? If not, why not? If so... new parts, here we come. This is mainly why the case is a rat's nest of wires, has a 220mm fan on the outside and has one drive bay left...
 
name='k4p84' said:
Is the addition of PhysX not just a driver/software update? They did not own Ageia when the 8 series was being made, yet it is now PhysX capable.

ED

Afaik, the initial release of the nVidia variant of the Ageia drivers was merely a recompile of the existing source.

They happened to use the GPU, but they could equally have used the CPU, and what it looks like they're pointing towards is a selector for the user to pick whichever they choose.

The code must be either that good, or that well structured, that they were able to adapt it in whichever fashion they wanted. This could be cos it was perhaps dev'd as cross-platform, or simply x86 for giggles with a final compile for the PCI PhysX cards.

Either way, it was obviously easy enough for nVidia to work with and manipulate to an extent. I feel they automatically discounted the CPU path in preference to the GPU/PhysX PCI card, as the gains are that much greater.
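
To make that selector idea concrete, here's a minimal sketch, assuming a hypothetical physics_backend enum and physics_step dispatcher that I've made up purely for illustration - it isn't real PhysX SDK code, just one library stepping the same data on whichever back-end you pick:

// Hypothetical sketch only, not real PhysX SDK code: one physics library
// compiled against several back-ends, with a runtime switch deciding where
// the simulation step actually runs.

typedef enum { BACKEND_CPU, BACKEND_GPU, BACKEND_PPU } physics_backend;

/* Every back-end implements the same step over the same particle data. */
static void step_cpu(float *pos, const float *vel, int n, float dt) {
    for (int i = 0; i < n; ++i)
        pos[i] += vel[i] * dt;                 /* plain x86 path */
}

static void step_gpu(float *pos, const float *vel, int n, float dt) {
    /* would launch a CUDA kernel here (see the kernel sketch further down) */
    (void)pos; (void)vel; (void)n; (void)dt;
}

static void step_ppu(float *pos, const float *vel, int n, float dt) {
    /* would hand the batch to the PCI PhysX card */
    (void)pos; (void)vel; (void)n; (void)dt;
}

/* The 'selector' being described: same call, different hardware. */
void physics_step(physics_backend b, float *pos, const float *vel, int n, float dt) {
    switch (b) {
        case BACKEND_CPU: step_cpu(pos, vel, n, dt); break;
        case BACKEND_GPU: step_gpu(pos, vel, n, dt); break;
        case BACKEND_PPU: step_ppu(pos, vel, n, dt); break;
    }
}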

Get some gaming companies rolled in tbh !!
 
I reckon that as PhysX was coded in C, the PCI card could handle that, and as the 8 series onwards have got CUDA (essentially C with a bit of adaptation), it wouldn't be too difficult to use the graphics chips in the way described above.
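
For anyone curious what 'C with a bit of adaptation' actually looks like, here's a toy CUDA sketch of the sort of per-particle update a physics engine does - my own example, nothing taken from the PhysX source. The adaptation is basically the __global__ qualifier, the per-thread index, and the <<<blocks, threads>>> launch; the arithmetic itself is plain C:

// Toy sketch only (not from the PhysX source): a per-particle position
// update written as a CUDA kernel, to show how close CUDA is to plain C.

#include <cstdio>
#include <cuda_runtime.h>

__global__ void integrate(float *pos, const float *vel, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per particle
    if (i < n)
        pos[i] += vel[i] * dt;                       // same arithmetic as a C loop
}

int main()
{
    const int n = 65536;
    size_t bytes = n * sizeof(float);

    // host data: particles start at 0 with velocity 1
    float *h_pos = new float[n];
    float *h_vel = new float[n];
    for (int i = 0; i < n; ++i) { h_pos[i] = 0.0f; h_vel[i] = 1.0f; }

    // device copies
    float *d_pos, *d_vel;
    cudaMalloc((void**)&d_pos, bytes);
    cudaMalloc((void**)&d_vel, bytes);
    cudaMemcpy(d_pos, h_pos, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(d_vel, h_vel, bytes, cudaMemcpyHostToDevice);

    // one 16ms physics step, 256 threads per block
    integrate<<<(n + 255) / 256, 256>>>(d_pos, d_vel, n, 0.016f);
    cudaMemcpy(h_pos, d_pos, bytes, cudaMemcpyDeviceToHost);

    printf("pos[0] after one step: %f\n", h_pos[0]);  // expect 0.016

    cudaFree(d_pos); cudaFree(d_vel);
    delete[] h_pos; delete[] h_vel;
    return 0;
}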
 
Sorry for the double post, but:

What were the temperatures for the 280GTX XXX? I have Asus cards (air cooled) OCed to about 645MHz; one (the primary) sits at 72°C under load, the second sits at 80°C (after a quick two-hour stint on Crysis at max settings and 8xQ AA).
 
The 4870X2 took such a hit in FPS with PhysX enabled because the calculations had to be done on the CPU, which obviously means you'll lose FPS, whereas with the Nvidia card the calculations are done on the GPU, meaning greater FPS.

People need to realise what exactly PhysX is and does before judging it and weighing performance vs eye candy.

PhysX is nothing but added-on eye candy which is going to put extra stress on whatever it's run on. The GTX had the upper hand because it can calculate PhysX faster than the CPU can in the case of the 4870X2. So if you enable that PhysX button, expect a drop in FPS, not an increase. Vantage, on the other hand, gets an increase because the calculations are done on the GPU rather than the CPU, so obviously you'll get more FPS. Games, though, are the opposite.

Also, saying there's a lack of PhysX games is actually quite untruthful. There are a lot more PhysX-enabled games than you might think. Devs just don't make a big stink about it like they did with UT3... which imo was a flop.
 
afaik PhysX was implementable because of the unified shader units.

Because the 8 series were the first to use these, they were the first to be able to use PhysX. I think it's the same with the 280s: they have SO MANY unified shaders that there are more spare for doing physics calculations. That's my view on it anyway; I don't think there is any EXTRA hardware per se, it's just the way the code was adapted, and the shaders adapted, to be able to calculate complex physics.
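
As a rough illustration of that 'spare shaders' point (again my own snippet, not from the review): CUDA can report how many multiprocessors, i.e. groups of those unified shaders, a card has, and a GTX 280 reports far more of them than something like an 8600GT, which is exactly the headroom being talked about:

// Rough illustration only: query how many multiprocessors (clusters of
// unified shaders) the installed card exposes to CUDA.

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) != cudaSuccess) {
        printf("no CUDA device found\n");
        return 1;
    }
    printf("%s: %d multiprocessors, %d MHz core\n",
           prop.name, prop.multiProcessorCount, prop.clockRate / 1000);
    return 0;
}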

I got a PhysX card when I got my 8800GTS, before the ownership change, so I can compare the performance between the two, and honestly (oddly) the 8800 kicks its ass :| which is shocking.

But then those 'unlock a 100% performance gain' drivers they were on about at release never appeared for the PhysX cards.
 