Frame latency testing methods, thoughts?

AaronCooper

New member
Hi all,

I've been reading an article on a different website (won't link, not sure if it's against ToS). It's extremely interesting, and I wanted to get your thoughts on this testing method.

Basically, a hardware review site decided to change its comparison method from average FPS to per-frame latency. What this means is that every individual frame is timed for how long it takes to render, and those frame times are then compiled into a graph that gives a visual representation of the results.
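To check I've understood it, here's a rough sketch of the idea in Python (my own illustration, not the site's actual tooling; render_frame is just a hypothetical stand-in for whatever draws one frame): you time every single frame, then look at the worst frame times as well as the average.

import time

def benchmark(render_frame, num_frames=1000):
    # Time each individual frame instead of only counting frames per second.
    frame_times_ms = []
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # stand-in for the game/engine producing one frame
        frame_times_ms.append((time.perf_counter() - start) * 1000.0)

    total_seconds = sum(frame_times_ms) / 1000.0
    avg_fps = num_frames / total_seconds

    # The interesting part: how long the slow frames take, not just the average.
    sorted_times = sorted(frame_times_ms)
    p99 = sorted_times[int(0.99 * (num_frames - 1))]  # 99th percentile frame time

    print(f"Average FPS:         {avg_fps:.1f}")
    print(f"Average frame time:  {sum(frame_times_ms) / num_frames:.2f} ms")
    print(f"99th pct frame time: {p99:.2f} ms")
    return frame_times_ms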

For example, they tested a Radeon HD 7950 against a GTX 660 Ti.
The 7950 averages 69 FPS and the 660 Ti averages 74 FPS.

However, when you compare that to the frame-time graph, you can see that although the 7950 isn't far off its competitor's average FPS, its latency spikes in producing those frames are far higher.

[Attached image: skyrim.gif — frame latency graph, 7950 vs 660 Ti]


So, what does this all mean in terms of performance? Well, the more quickly and consistently the card can produce each frame, the smoother the animation; a handful of long frames causes visible stutter even when the average FPS looks healthy.
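Here's a quick made-up example of why the average alone hides this (numbers invented by me, not from the article): two cards with the exact same average FPS can feel completely different if one of them spikes.

# Made-up numbers: both "cards" spend 160 ms on 8 frames, so both average 50 FPS,
# but one delivers frames evenly while the other stalls on a few of them.
card_a = [20, 20, 20, 20, 20, 20, 20, 20]   # smooth: every frame takes 20 ms
card_b = [12, 12, 12, 60, 12, 12, 12, 28]   # same average, visible stutter

for name, frames in (("Card A", card_a), ("Card B", card_b)):
    avg_fps = 1000.0 * len(frames) / sum(frames)
    print(f"{name}: average {avg_fps:.0f} FPS, worst frame {max(frames)} ms")

Both report 50 FPS, but the second would look noticeably choppier, which is exactly what the graph above is trying to show.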

AMD have responded saying they are going to release driver updates to try to fix these issues. I think it's great that review sites like this are constantly pushing for more in-depth product reviews to find any flaws they can. At the end of the day, the consumer wins.

Do you think this method of testing is flawed, or is it about to become the standard for benchmarking?
 