OC3D Review: XFX HD 5850 1GB GDDR5 PCIe Graphics card

name='PeterStoba' said:
Who has one of them any more? I thought everybody upgraded to the 9 series! :rolleyes:

Nope, I've an 8800 GTX right here. I saw, and still see, no point in upgrading yet. It's going to take something pretty special for me to relinquish my GTX...
 
A pair of 4890s will still pip this baby to the post, but it's great value for money, no question about that - yet another red team win imho.

As far as the GT/GTX boys are concerned, if you used one of these there would be no going back ;)
 
This is a nice card for the price! £100 less than the 5870 and only 10-15% less performance.

I really like the reviews on [H]: first the highest possible settings for each card, then 'apples to apples' with all cards at the same settings. Have you ever thought of doing that?
 
Maybe I misunderstood you m8 but we already do the highest possible settings (2560x1600 8xAA), medium settings (1920x1200 4xAA) and low settings (1600x1200 0xAA)? This is done with all cards on test. The [H] review (imo) presents the results in a really confusing way making direct comparisons difficult.

Or did you mean something different?
 
name='PeterStoba' said:
Who has one of them any more? I thought everybody upgraded to the 9 series! :rolleyes:

I have two 8800 GTXs running in SLI!!!

Amazing cards...

Considering upgrading over Xmas to one or two of these though...
 
The card offers very good performance for its cost. Comparing it with a GT200 still feels like comparing the GT200 to a 3870 when it came out. No amount of statistics is going to judge the 58x0 series until the GT300 is about.

name='w3bbo' said:
Maybe I misunderstood you m8 but we already do the highest possible settings (2560x1600 8xAA), medium settings (1920x1200 4xAA) and low settings (1600x1200 0xAA)? This is done with all cards on test. The [H] review (imo) presents the results in a really confusing way making direct comparisons difficult.

Or did you mean something different?

The thing I like about their arrangement, although I concede it's slightly confusing, is that they've realised something I was banging on about last year.

The fact that a 5870 plays a game at 150fps and a 5850 does it at 130fps doesn't - in the real world - mean anything. If the 5850 can output the same quality screens as the 5870, the 5870 isn't worth getting. This does make assumptions about games to come, but we have to face facts: Crysis isn't new and it's still being used as a benchmark.

What I've pushed for a while is that benchmarks don't mean anything over a certain point. 200fps versus 150fps? So what? When are you going to notice?

What they seem to be agreeing with over at [H] is that you pick a threshold for fps, nominally 30-odd fps, and see what quality you get from the card(s) - typically showing the graphics settings that were achievable while staying above that fps.
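The approach above can be sketched in a few lines. This is a minimal illustration, not [H]'s actual methodology or tooling: the fps floor, preset names, and benchmark numbers below are all made-up placeholders.

```python
# Sketch of the "highest playable settings" idea: fix an fps floor,
# then report the best quality preset each card holds above it.
# All figures below are invented for illustration only.

FPS_FLOOR = 30  # hypothetical minimum acceptable average fps

# Presets ordered from highest to lowest quality.
PRESETS = ["2560x1600 8xAA", "1920x1200 4xAA", "1600x1200 0xAA"]

# Made-up average fps per card per preset.
RESULTS = {
    "HD 5870": {"2560x1600 8xAA": 34, "1920x1200 4xAA": 55, "1600x1200 0xAA": 78},
    "HD 5850": {"2560x1600 8xAA": 28, "1920x1200 4xAA": 47, "1600x1200 0xAA": 69},
}

def highest_playable(card):
    """Return the best preset the card sustains above the fps floor."""
    for preset in PRESETS:
        if RESULTS[card][preset] >= FPS_FLOOR:
            return preset
    return None  # card can't hold the floor at any preset

for card in RESULTS:
    print(card, "->", highest_playable(card))
```

The point of presenting results this way is that two cards can land on the same preset, in which case the faster card buys you nothing visible.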

It's almost like a reversal of traditional gaming thinking. For years previously, certainly with DX8/9, fps was almost the be-all-and-end-all; not much else mattered. But more recently - possibly the only thing to attribute to the DX10 generation - the quality of play counts for much more. Playable fps, certainly over 25fps, can be achieved with most mid-range-and-up cards from the last two years. Nvidia or AMD beats the other by 5-10% each release - so what. What the market, imo, should increasingly be looking at is the quality of the end result.

Atypically, this will involve the likes of PhysX/Havok/CUDA/Stream, shader quality (not speed), and increasingly, imo, memory. I was amazed that games such as GTA4, and World in Conflict (I think it was), state in their settings that if you have a 1GB card, you cannot use the highest settings. In those cases a GTX395 with 2GB would get spanked quality-wise by a mere HD4870 with a proposed 4GB. Laugh at the ASUS 4GB cards now.

I have little doubt that a GTX380 (or whatever) will cane the arse off possibly even a 5895x2, but for me it will be the quality of the end result that counts, not the extra 5% fps over the 150fps it already gets.

[H] are almost there, but I agree it's a little confusing. A better depiction will come I'm sure.
 
Thing is, if you get the same quality output from the cards that produce mental fps, then they are worth it, as they can be kept for that much longer - just like Matt has done with his 8800GTX.

If at the time he'd gone with a cheaper option just because it could easily handle anything back then, he wouldn't now be able to play current games anywhere near top level on a decent-sized monitor.
 
Nah, you're thinking at a tangent.

See the bit where I mention that prior to DX10 it was mostly fps or nothing.

The 8800GTX was one of those DX10 cards. If you wanted DX10 in November 2006, there wasn't any option that could compete with it.

Arguably, buying an 8800 GTX/Ultra way back then, at the supposedly outrageous price of the time, was one of the best investments in a card a person could make. In hindsight.

Others have spent more on minor benchmark fps improvements that make no real in-game difference, with little or no quality bonus - and they buy a new card each time a generation comes out every few months.

The G80 was one revolution; the GT200 a stepping stone. I personally wouldn't bat an eyelid if the GT300 is sold at what seems like similarly outrageous money, considering the tech that's supposedly involved.

You can then peer over at others who've had a 4870, a 275, a 4870x2, a 295, a 5870 and then possibly a 5890, and giggle. 8800GTX/Ultra owners, imo, have been able to do just that.
 