XFX 8800 GTS XXX problem

name='macgamesrule' said:
No matter what card you use, you're going to be limited by your monitor's refresh rate. Most LCDs only do 60Hz (a few do 75), so while your card may be generating 300+ fps, the monitor is only going to show you 60 of them. Just turn on Vsync, otherwise you will get some jerkiness due to the monitor trying to change the image before it's finished drawing the previous one. I much prefer to play my games at a constant 60fps with Vsync than 250fps without.

I disagree.. it doesn't jerk, it tears, so you feel no lag. But in some games recently I've had Vsync on and it feels laggy.. turn it off and I get loads of frames and speed, with no mouse delay or anything.

But I can put up with screen tear.. some games I find are worse than others. :p
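
For anyone wondering why Vsync can feel laggy: with Vsync on, a finished frame has to wait for the monitor's next refresh before it's shown, which adds input latency; without it the frame is swapped immediately, at the cost of tearing. A toy back-of-envelope sketch of that trade-off, assuming a 60Hz panel and made-up render times:

```python
import math

# Toy latency model (made-up render times, 60Hz panel assumed):
# with vsync, a finished frame waits for the next refresh before
# it is displayed; without vsync it is swapped immediately.

REFRESH_MS = 1000 / 60  # ~16.7 ms between refreshes on a 60Hz panel

def display_latency_ms(render_ms: float, vsync: bool) -> float:
    """Time from the start of rendering until the frame is on screen."""
    if not vsync:
        return render_ms                      # swap as soon as it's done
    # with vsync, round the finish time up to the next refresh boundary
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (3.3, 10.0):                 # ~300 fps and ~100 fps
    print(f"render {render_ms:4.1f} ms -> "
          f"no vsync: {display_latency_ms(render_ms, False):4.1f} ms, "
          f"vsync: {display_latency_ms(render_ms, True):4.1f} ms")
```

So both posters are right in a sense: Vsync trades a frame or so of latency for tear-free output, which is why it can feel laggier even though the picture is cleaner.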
 
when i reviewed a 320MB foxconn 8800gts card, i also noticed that COD2 sucked.

i tested a 7950GT with 512MB as well and was totally shocked when it wiped the floor with the 8800GTS - but only in COD2
 
name='Pyr0' said:
when i reviewed a 320MB foxconn 8800gts card, i also noticed that COD2 sucked.

i tested a 7950GT with 512MB as well and was totally shocked when it wiped the floor with the 8800GTS - but only in COD2

The 8800 has a very different architecture.. Bear in mind the memory difference here too... :)
 
name='Toxcity' said:
The 8800 has a very different architecture.. Bear in mind the memory difference here too... :)

i realise there is a difference in memory etc, but surely the huge difference in FPS cannot be attributed to the difference in memory alone ;)

i'm guessing nvidia drivers have a part to play in this too :p

i'm not quite sure what you mean by "8800 has very different architecture.."

we know the internals of the GPU, shaders etc. work differently, but the unified shaders should give the G80 roughly the shader power of two 7900 cards
 
name='Pyr0' said:
i realise there is a difference in memory etc, but surely the huge difference in FPS cannot be attributed to the difference in memory alone ;)

i'm guessing nvidia drivers have a part to play in this too :p

i'm not quite sure what you mean by "8800 has very different architecture.."

we know the internals of the GPU, shaders etc. work differently, but the unified shaders should give the G80 roughly the shader power of two 7900 cards

If you are correct about the shaders then maybe there is something afoot.. :)

Good point about the drivers.. Nvidia have had a year to sort the drivers out for the G70s.. so hopefully they will do the same for the G80s!

Oh, and memory has lots to do with FPS.. but not enough to explain the 7950GT wiping the floor with it.

I'm stumped! :p
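
As a rough way to picture the "two 7900s of shader power" point: on a G71-style part the vertex and pixel shaders are separate fixed pools, so whichever pool a frame doesn't stress sits idle, while a G80-style unified design can point all of its stream processors at whatever work there is. A toy utilisation sketch (real unit counts, but the model ignores the G80's much faster shader clock, scheduling and everything else that matters in practice):

```python
# Toy throughput model (illustrative only; ignores clock speeds,
# scheduling overhead, memory bandwidth etc.): with a fixed split,
# frame time is set by the bottleneck pool while the other pool
# idles; a unified design shares all units across the workload.

G71_VERTEX, G71_PIXEL = 8, 24   # dedicated units on a 7900-class G71
G80_UNIFIED = 96                # stream processors on an 8800 GTS

def frame_times(vertex_work: float, pixel_work: float):
    """Relative frame times for the same shader workload."""
    fixed = max(vertex_work / G71_VERTEX, pixel_work / G71_PIXEL)
    unified = (vertex_work + pixel_work) / G80_UNIFIED
    return fixed, unified

for v, p in ((10, 90), (50, 50)):   # share of shader work per frame (%)
    fixed, unified = frame_times(v, p)
    print(f"{v}% vertex / {p}% pixel -> "
          f"fixed: {fixed:.2f}, unified: {unified:.2f} (arbitrary units)")
```

On a model like that the G80 should never lose a shader-bound fight, which makes the COD2 result all the stranger.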
 
hehe, me too m8 :p

the 7950GT was clocked to 7900GTX speeds (650/1600, core/mem)

and the Foxconn 8800GTS is an OC version (575/1800, core/mem)

1280x1024 w/4xAA and AF

7950GT: min=38 FPS, max=85 FPS, avg=57.63 FPS

8800GTS 320MB: min=23 FPS, max=47 FPS, avg=34.34 FPS

in the other games i tested (F.E.A.R., Oblivion, Need For Speed: Carbon, Tomb Raider Legend, Rainbow 6 Vegas) the 320MB 8800GTS was ahead at both 1280x1024 and 1680x1050

only COD2 showed this strange behaviour

i also tried playing with the AA settings (setting 4xAA and enhance app in the nv control panel) since there was supposed to be a bug with the in-game settings, but it made no difference whatsoever
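
For reference, min/max/avg figures like the ones above usually come straight from a per-frame timing log, and the average should be total frames over total time rather than the mean of the per-frame FPS values. A quick sketch with made-up frame times:

```python
# How min/max/avg FPS figures are typically derived from a frame-time
# log (the frame times below are made up for illustration).

frame_times_ms = [17.4, 26.1, 43.5, 29.0, 21.7]   # hypothetical samples

per_frame_fps = [1000 / t for t in frame_times_ms]
# average = total frames / total seconds, NOT the mean of per-frame FPS
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000)

print(f"min={min(per_frame_fps):.0f} FPS  "
      f"max={max(per_frame_fps):.0f} FPS  "
      f"avg={avg_fps:.2f} FPS")
```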
 
Woah! Now those are some strange results..

What version of the game were you running?

I remember there being a bug in the graphics settings of an earlier version. :eek:
 
yeah i only got that problem in COD2. and as i said, a high fps affects the game engine, allowing you to shoot faster and make certain jumps. in fact some parts of the maps are only accessible if you have a constant 333 fps - for example in carentan, to get on the roof of a small shed you need 333 to make the jump, or else you can forget it
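
Those magic FPS numbers (125, 250, 333) are a known quirk of Quake-3-derived engines like COD2's: player movement is integrated once per rendered frame, with frame times counted in whole milliseconds, so the integration error, and therefore jump height, varies with frame rate. A toy simulation of that effect (NOT the actual COD2 movement code; the jump velocity and gravity values are made up):

```python
# Toy model of frame-rate-dependent physics (not the real COD2 code;
# jump velocity and gravity values are hypothetical). Movement is
# integrated once per frame, with the frame time snapped to whole
# milliseconds the way Quake-3-style engines count time.

JUMP_VEL = 270.0   # initial upward velocity, units/s (hypothetical)
GRAVITY = 800.0    # downward acceleration, units/s^2 (hypothetical)

def jump_apex(fps: int) -> float:
    """Simulate one jump frame-by-frame and return the peak height."""
    dt = round(1000 / fps) / 1000.0   # frame time snapped to whole ms
    z, vz, apex = 0.0, JUMP_VEL, 0.0
    while vz > 0 or z > 0:
        vz -= GRAVITY * dt            # apply gravity for this frame...
        z += vz * dt                  # ...then move (semi-implicit Euler)
        apex = max(apex, z)
    return apex

for fps in (60, 125, 250, 333):
    print(f"{fps:3d} fps -> jump apex {jump_apex(fps):.2f} units")
```

In this toy model, higher fps simply means smaller integration error and a slightly higher jump. The real engine's millisecond rounding is more subtle (hence the specific sweet spots at 125/250/333), but the principle, physics tied to frame rate, is the same.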
 