30 fps vs 60 fps

Hey guys, my friend just will not accept that in games there is a big difference between 30 fps and 60 fps. He thinks the human eye can't see past 30, so there is no difference. I was wondering if you could all just post saying that there is a difference, as he won't believe me? Cheers
 
Well I guess it depends what games you play. FIFA 12, for example, is unplayable under 40, although I guess with other games it's not so bad, but under 30 is.
 
I'd say there is a difference between 30 and 60 in gaming...

Maybe not 40 and 60 though.

He is kind of right; eyes can't properly deal with much beyond around 30 fps. BUT 30 is an average, and during the more graphically intense moments it will probably drop to about 20-25 temporarily, so you will notice that. If you had a card that could keep frames per second above 35, never dropping below, then I don't think there's any difference between that and one doing 60.
 
As M&P stated, if the minimum frame rate stays above 30, gameplay should be fine. That is the case for me as well, but it will be different for others.
 
Lol, still with this "human eye" stuff. I vindicated my purchase of a 120 Hz monitor when my mum easily saw the difference between 60 Hz and 120 Hz on the same monitor without her glasses on. When you don't have motion blur, each frame is very much visible, and I don't think it's a stretch to say that some people will be able to tell the difference between 120 and 240 (when such a monitor actually exists). I think you have to have quite poor eyesight not to tell the difference, to be honest! To say there isn't a difference between 30 and 60 is nonsense.
 

I have to agree. I have always found that anything below 60 fps seemed blurry at times and the game did not seem smooth. It is especially bad in FPS games, because the moving target gets blurry and that makes it that little bit more difficult to hit.

Since I got my 120 Hz monitor I have noticed that when frame rates drop to 60 fps the game is not as smooth looking as it would be at 100+. That stands to reason, as there is less of a gap between frames, if you get me.
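To put rough numbers on that gap (just 1000 divided by the frame rate, nothing measured):

```python
# Time between frames at a few common frame rates. The gap roughly halves
# going from 60 to 120 fps, which is the difference described above.
for fps in (30, 45, 60, 100, 120):
    print(f"{fps:3d} fps -> {1000 / fps:.1f} ms between frames")
```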

Now, the difference isn't major, as in I can play games comfortably at around 45-60 fps and not notice it, but when the game drops from a higher fps to around 60 I notice the drop.

They also say you won't notice the increase that much when you get a 120 Hz monitor, but when you go back to a 60 Hz monitor you can instantly tell something is different.

Hope that makes sense.
 
I posted about this just earlier today in another thread (here).

Basically, this is what I posted, to save you going to look:

This made me want to find out more, and it turned out to be quite hard to find good information about FPS and how the human eye perceives them. Wikipedia wasn't much help either, as the section they have on it is marked as needing expert input and more information. What I did learn is that most people would definitely notice flicker at 30 fps or less without motion blurring, but this does not mean you can't discern frames at over 30 fps, just that as it gets above 30 you'll be increasingly less likely to notice them until it reaches the flicker fusion point. Talking about frames alternating between all black and all white, it says

the flicker fusion point, where the eyes see gray instead of flickering, tends to be around 60 FPS

meaning that you would need around 60 fps to not be aware of the individual frames. It then goes on to say that things get much more complex if there are moving objects in the scene, and that the faster they move, the higher this flicker fusion point will be. More complex still, motion blurring will lower the fps needed for a smooth display.

Basically, it's definitely higher than 30 fps and probably more like 60 fps and above, but it really depends on what you are looking at and how (well) it was made.
 
Has everyone forgotten that the majority of TV content only runs at 24 to 30 fps? If a game is unplayable at 30 fps then it's due to other reasons, not the frame rate.

Edit:

Maybe other reasons like frame lag make games unplayable at lower FPS. It seems like all the gamers want massive frame rates for next to no reason. Video looks fluid at 24 fps, 30 fps adds some fluidity to the image, 40 fps may make fast-moving objects appear clearer, and 60 fps is unnecessary. Remember, we're talking about true frames per second that are fed out in a linear fashion, i.e. no lagging frames.
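That "no lagging frames" bit is the important part. A quick made-up comparison in Python (the frame times are invented purely for illustration):

```python
# Both runs average roughly 30 fps over a second, but the second one delivers
# a few very late frames, which is what actually feels like stutter.
steady = [33.3] * 30                  # frame times in ms, nice and even
laggy = [20.0] * 27 + [150.0] * 3     # same number of frames, three big spikes

for name, times in (("steady", steady), ("laggy", laggy)):
    avg_fps = 1000.0 * len(times) / sum(times)
    print(f"{name}: average {avg_fps:.0f} fps, worst frame {max(times):.0f} ms")
```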

In fact, I just thought of a good experiment: grab a strobe light, set it to 25 Hz, see if you can see it flashing, and keep increasing the frequency until it looks like a solid light.

Lol, wish I had a strobe light.
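You don't strictly need one: a few lines of Python with pygame will flash the screen at whatever rate you like. A rough software stand-in, assuming pygame is installed, and bear in mind your monitor's own refresh rate caps how cleanly it can actually flash:

```python
# Flash the window black/white at a chosen frequency and see at what point
# it stops looking like flicker and starts looking like a steady grey.
import pygame

FLASH_HZ = 25                              # full on/off cycles per second
pygame.init()
screen = pygame.display.set_mode((400, 400))
clock = pygame.time.Clock()
white = True
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((255, 255, 255) if white else (0, 0, 0))
    pygame.display.flip()
    white = not white
    clock.tick(FLASH_HZ * 2)               # two phases (on + off) per cycle
pygame.quit()
```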
 
Frame rate isn't the issue on its own. Whether you play at a fixed sync and your computer can keep up with it is. This is where being able to stay over 25 (30 ideally) is the eye question. 40-60 is regarded as a level where it doesn't matter, since that's almost twice what the eye needs.

If your computer takes too much time to draw a frame and runs past the vertical blanking interval (roughly, the time it takes to scan out the full height of the screen), you'll get a flash or a missed frame.

With sync off, the start of drawing isn't fixed to the top of the screen and you get a fine line/ripple (tearing).

The monitor's frequency has no bearing on this. Whether it's 120/100/75/60/50/45, that is only the refresh rate of the output from the video source. The signal is always going to be there whether or not your computer can handle the game it's playing.

For example, you can run Crysis on a Pentium 4 with 1 GB of RAM and a 9400 GT, max all the settings and get 1 fps - your monitor will still be refreshing at 120/100/75/60/50/45. But sure, you'll have a great picture of your computer struggling.
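To put the missed-frame point into numbers, here's a toy calculation of my own (assuming a 60 Hz refresh with vsync on, nothing measured):

```python
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ            # ~16.7 ms between screen refreshes

def effective_fps(render_ms):
    """Frame rate you actually see with vsync on, for a given draw time."""
    # A finished frame can only be shown at the next refresh, so each frame
    # ends up occupying a whole number of refresh intervals.
    intervals = max(1, math.ceil((render_ms / 1000.0) / VBLANK))
    return REFRESH_HZ / intervals

for ms in (10, 16, 17, 25, 34):
    print(f"{ms:2d} ms to draw -> about {effective_fps(ms):.0f} fps on screen")
```

So missing the blanking window by even 1 ms drops you from 60 to 30, which is why it matters whether the computer can keep up with the fixed sync.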
 
Has everyone forgotten that the majority of TV content only runs at 24 to 30 fps? If a game is unplayable at 30 fps then it's due to other reasons, not the frame rate.

Film and TV are not comparable to games. Motion blur is the biggest reason. The filmmakers trick the eye into thinking you are seeing a smooth and consistent framerate when you are not.
 

This is really where you need to separate your thinking between the video signal and the production of the picture.

The video signal is a fixed refresh rate, governed by the source (computer, DVD player, Sky box, whatever) and the capability of the viewing platform (TV, monitor, projector). Your monitor will be capable of certain rates, and the computer (graphics card) will be capable of others (usually hundreds, at different resolutions); they compare notes when they meet over the DVI cable or whatever, and where they agree you have a list to choose from: 120/100/75... blah blah. Replace the monitor with a TV and the computer with a Sky box and the basis is the same. If they compare notes and nothing matches, you get no picture.

... but your game will still be playing @ 30 or 60 fps.
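If it helps, that "compare notes" step is basically an intersection of two lists. A toy sketch (the mode names are made up just for illustration):

```python
# The monitor advertises the modes it supports, the graphics card has its own
# list, and the modes you can actually pick are the ones both sides agree on.
monitor_modes = {"1920x1080@60", "1920x1080@120", "1280x720@60"}
gpu_modes = {"1920x1080@60", "1920x1080@120", "2560x1440@60"}

usable = monitor_modes & gpu_modes
if usable:
    print("choose from:", sorted(usable))
else:
    print("no common mode -> no picture")
```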
 