The card offers very good performance for its cost. Comparing it with a GT200 still feels like comparing the GT200 to a 3870 when it came out. No amount of statistics is going to settle judgement on the 58x0 series until the GT300 is about.
w3bbo said:
Maybe I misunderstood you, m8, but we already do the highest possible settings (2560x1600 8xAA), medium settings (1920x1200 4xAA) and low settings (1600x1200 0xAA)? This is done with all cards on test. The [H] review (imo) presents the results in a really confusing way, making direct comparisons difficult.
Or did you mean something different?
Thing I like about their arrangement, although I also concede that it's slightly confusing, is that they've realised something I was banging on about last year.
That a 5870 plays a game at 150fps while a 5850 does it at 130fps doesn't, in the real world, mean anything. If the 5850 can output the same quality of image as the 5870, the 5870 isn't worth getting. This does make an assumption about games to come, but we have to face facts: Crysis isn't new and it's still being used as a benchmark.
What I've pushed for a while is that benchmark numbers don't mean anything past a certain point. 200fps versus 150fps? So what? When are you going to notice?
What they seem to be agreeing with over at [H] is that you set a threshold for fps, nominally somewhere in the 30s, and see what quality you get from the card(s), typically showing the graphics settings that were achievable while holding those fps.
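To make that concrete, here's a toy sketch of the approach in Python. Everything in it is invented for illustration: the presets, the 30fps floor, and especially the fps numbers, which loosely echo the 150/130 example above and are not real benchmark results.

# Toy sketch of the [H]-style "highest playable settings" approach.
# All presets, thresholds and fps figures below are made up.

QUALITY_PRESETS = [            # ordered lowest to highest quality
    "1600x1200 0xAA",
    "1920x1200 4xAA",
    "2560x1600 8xAA",
]

FPS_THRESHOLD = 30             # the playability floor you decide on

# Invented stand-in for actual benchmark runs (card -> preset -> avg fps).
FAKE_RESULTS = {
    "5870": {"1600x1200 0xAA": 150, "1920x1200 4xAA": 90, "2560x1600 8xAA": 45},
    "5850": {"1600x1200 0xAA": 130, "1920x1200 4xAA": 75, "2560x1600 8xAA": 33},
}

def highest_playable(card):
    """Walk the presets from highest to lowest quality and return the first
    one where the card holds the fps threshold, or None if it never does."""
    for preset in reversed(QUALITY_PRESETS):
        if FAKE_RESULTS[card][preset] >= FPS_THRESHOLD:
            return preset
    return None

for card in ("5870", "5850"):
    print(card, "->", highest_playable(card))
# Both cards land on "2560x1600 8xAA" here, which is exactly the point:
# the raw fps gap between them never shows up in what you actually play at.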
It is almost like a reversal of traditional gaming thinking. For years previously, certainly with DX8/9, fps was almost the be-all-and-end-all; not much else mattered. But more recently (possibly the only thing to credit the DX10 generation with) the quality of play counts for that much more. Playable fps, certainly over 25fps, can be achieved with most mid-range-and-up cards from the last two years. Nvidia or AMD beats the other by 5-10% with each release; so what? What the market, imo, should increasingly be looking at is the quality of the end result.
Atypically, this will involve the likes of PhysX/Havok/CUDA/Stream, shader quality (not speed), and increasingly, imo, memory. I was amazed that games such as GTA4 and World in Conflict (I think it was) demonstrate in their settings menus that if you have a 1G card, you cannot use the highest settings. In these examples a GTX395 with 2G would get spanked quality-wise by a mere HD4870 with a proposed 4G. Laugh at ASUS's 4G cards now.
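That sort of VRAM gating is easy to picture: the settings menu just compares the card's memory against what each texture tier needs. A purely hypothetical sketch (tier names and MB figures are made up; this is nothing like either game's real code):

# Hypothetical illustration of VRAM-gated settings, the way GTA4 / World
# in Conflict grey out options; invented numbers, not real requirements.

TEXTURE_TIERS = {              # invented MB requirements per tier
    "low": 256,
    "medium": 512,
    "high": 1024,
    "very high": 2048,
}

def allowed_tiers(vram_mb):
    """Return the texture tiers a card with `vram_mb` of memory may select."""
    return [tier for tier, needed in TEXTURE_TIERS.items() if needed <= vram_mb]

print(allowed_tiers(1024))   # 1G card: "very high" stays locked out
print(allowed_tiers(4096))   # 4G card: everything available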
I have little doubt that a GTX380 (or whatever) will cane the arse off possibly even a 5895x2, but for me what will matter is the quality of the end result, not an extra 5% fps on top of the 150fps it already gets.
[H] are almost there, but I agree it's a little confusing. A better presentation will come, I'm sure.