HD 7950 vs R9 290 vs GTX 780 - Some Benchmarks

Zoot

Active member
I've been meaning to post these up for some time, since I've gone through a number of GPUs of late. It's only a handful of games (they're the only ones I have installed with built-in benchmarks), but they might be useful to somebody, particularly if they have a similar system and are mulling a GPU upgrade.

Here are the cards.

Sapphire HD 7950 Boost OC Vapor-X
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1157&pid=1547&lid=1

Sapphire R9 290 Tri-X
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2091&leg=0

Asus GTX 780 OC DirectCU II
http://www.asus.com/Graphics_Cards/GTX780DC23GD5/

This is my system:

CPU: AMD FX-8350 @ Stock
Motherboard: Gigabyte GA-990FXA-UD5
Memory: 16GB Corsair ValueSelect DDR3 @ 1333MHz
SSD: Samsung 840 250GB (for both OSes)
HDD: Western Digital Caviar Green 2TB (x2) (Games & Storage)
PSU: Corsair AX760
OS: Windows 7 x64 Service Pack 1

Nvidia Drivers: Forceware 332.21
AMD Drivers: Catalyst 13.12

The resolution in all cases is 1600x1200 (I need a new monitor), and all the settings are maxed.

Here are the plots:

[Five benchmark plot images]
 
Gee, can you tell Arkham Origins is a game that came bundled with Nvidia GPUs? ;D
Something is really weird with your Sleeping Dogs scores; I'm pretty sure my old 7870 wasn't far off that (~5-10fps) maxed out @ 1080p. 99% GPU usage though.

-edit- Am I still right in thinking that Nvidia GPUs actually work better with AMD CPUs than AMD GPUs do, or is that some old wives' tale I heard?
 
Hitman is AMD optimised and part of AMD's Gaming Evolved program, like Batman is Nvidia optimised and part of their... not sure what their program is called :)

You will always see it perform better on AMD GPUs than you will on Nvidia's.
Hitman is also pretty CPU demanding, so you will get slightly better performance with a better CPU.

Interesting review on Hitman Absolution here

-edit- Am I still right in thinking that Nvidia GPUs actually work better with AMD CPUs than AMD GPUs do, or is that some old wives' tale I heard?

Never heard this myself; I wouldn't have thought it was true, though.
 
Gee, can you tell Arkham Origins is a game that came bundled with Nvidia GPUs? ;D
Something is really weird with your Sleeping Dogs scores; I'm pretty sure my old 7870 wasn't far off that (~5-10fps) maxed out @ 1080p. 99% GPU usage though.

-edit- Am I still right in thinking that Nvidia GPUs actually work better with AMD CPUs than AMD GPUs do, or is that some old wives' tale I heard?
Well with Sleeping Dogs I do have super-sampling turned on, which really kills the frame rate. The minute I turn it off, the frame rate goes through the roof in comparison.

Practically speaking, of the games up there it's the only one where I notice a real difference. It's pretty playable with super-sampling on the GTX 780, and it was on the R9 290 too, whereas I couldn't really play it like that on the 7950.
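To put rough numbers on why super-sampling hurts so much, here's a quick back-of-envelope sketch (assuming the game renders at 2x per axis and the frame rate scales inversely with pixels rendered; the 60fps figure is hypothetical, and real games are never purely fill-rate bound):

```python
# Back-of-envelope cost of 2x2 super-sampling at 1600x1200.
# Assumes frame rate is roughly proportional to 1 / pixels rendered,
# which is only a crude model of a fill-rate-bound game.

base_w, base_h = 1600, 1200   # native monitor resolution
ss_factor = 2                 # 2x per axis = 4x the pixels

base_pixels = base_w * base_h                             # 1,920,000
ss_pixels = (base_w * ss_factor) * (base_h * ss_factor)   # 7,680,000

fps_without_ss = 60.0         # hypothetical frame rate with SSAA off
fps_with_ss = fps_without_ss * base_pixels / ss_pixels

print(f"Pixels per frame: {base_pixels:,} -> {ss_pixels:,} ({ss_pixels / base_pixels:.0f}x)")
print(f"~{fps_without_ss:.0f} fps would drop to roughly ~{fps_with_ss:.0f} fps")
```

So even on that crude model, anything that wasn't comfortably over 60fps to begin with ends up unplayable once SSAA quadruples the pixel count.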

For the second part of your post, it's an old wives' tale. :p
Nvidia and AMD want to be able to sell you a GPU no matter what CPU you're running; both of them would be silly to cripple their GPUs when paired with either Intel or AMD CPUs.

Hitman Absolution must have terrible coding.

I've noticed that in recent reviews. For a 780 to be dropping to 15 FPS, especially at 1600x1200, is ridiculous.
I never really paid too much attention to it, although now that you mention it, the results are indeed pretty funny. I'm not too bothered about it though, given it's a crap game anyway. :p

BTW, as far as I'm aware the 15fps figure comes from a section near the start of the benchmark; it's pretty smooth throughout the remainder.
 
You should post the clocks of the 3 GPUs. Would help us out a little bit...
Click on the links. :p

Thanks OP! Now I don't feel too out of date having 7950s. :)
Your 7950s have plenty of life left in them yet.

TBH there are only a few games where I can actually notice a difference with the 780 over the 7950, since I was getting over 60fps in pretty much all my games with it anyway. Bit of a pointless upgrade when you look at it like that. ^_^

Although I do plan on getting a new monitor later this year. It'll either be a 1440p one or a 120Hz 1080p one; I should be set for either now with the 780.
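For anyone curious how much extra work either option asks of the GPU compared to my current screen, the pixel-throughput arithmetic is simple enough (illustrative only, assuming the GPU actually holds each refresh rate):

```python
# Pixels per second the GPU has to push for each monitor option,
# relative to the current 1600x1200 @ 60Hz. Purely illustrative.

options = {
    "1600x1200 @ 60 Hz (current)": (1600, 1200, 60),
    "2560x1440 @ 60 Hz":           (2560, 1440, 60),
    "1920x1080 @ 120 Hz":          (1920, 1080, 120),
}

baseline = 1600 * 1200 * 60
for name, (w, h, hz) in options.items():
    rate = w * h * hz  # pixels per second at full refresh rate
    print(f"{name}: {rate / 1e6:,.0f} MPix/s ({rate / baseline:.2f}x baseline)")
```

Either way works out to roughly double the pixel throughput of the current setup, which is why stepping up to a 780-class card first makes sense.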
 
Gee, can you tell Arkham Origins is a game that came bundled with Nvidia GPUs? ;D
Something is really weird with your Sleeping Dogs scores; I'm pretty sure my old 7870 wasn't far off that (~5-10fps) maxed out @ 1080p. 99% GPU usage though.

-edit- Am I still right in thinking that Nvidia GPUs actually work better with AMD CPUs than AMD GPUs do, or is that some old wives' tale I heard?

Nope, it wasn't an old wives' tale at all; you are absolutely right. It's not a case of anyone crippling cards on certain CPUs; it was just that the lower memory bandwidth the AMD CPUs could deliver hurt the GCN graphics cards more than the Kepler ones, due to the architecture design.
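For reference, peak theoretical DDR3 bandwidth is the same simple sum on any dual-channel platform; the point is that the FX chips' memory controller delivers noticeably less of it in practice than Intel's did. A quick sketch of the theoretical side (peak figures only, not measured throughput):

```python
# Peak theoretical bandwidth of a dual-channel DDR3 setup:
# transfers/sec * 8 bytes per 64-bit channel * number of channels.

def ddr3_bandwidth_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    """Peak theoretical bandwidth in GB/s (decimal)."""
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for speed in (1333, 1600, 1866):
    print(f"DDR3-{speed} dual channel: {ddr3_bandwidth_gbs(speed):.1f} GB/s peak")
# DDR3-1333 dual channel: 21.3 GB/s peak
# DDR3-1600 dual channel: 25.6 GB/s peak
# DDR3-1866 dual channel: 29.9 GB/s peak
```

The OP's DDR3-1333 sits at the bottom of that range, so any sensitivity a GPU has to host memory bandwidth would show up more on a setup like his.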
 
It's not a case of anyone crippling cards on certain CPUs; it was just that the lower memory bandwidth the AMD CPUs could deliver hurt the GCN graphics cards more than the Kepler ones, due to the architecture design.
Tom's Hardware did an article on that last year, although they finished it off in quite a trollish way.
http://www.tomshardware.com/reviews/crossfire-sli-scaling-bottleneck,3471.html

They make a big deal over the difference, when in reality it's so tiny that the claim is almost meaningless, particularly since you'd never notice the difference yourself anyway.

This is one of the reasons I've lost a lot of faith in quite a few review sites over the last year: they make a massive deal over differences that simply don't matter. That's a rant for another place and time, though. :p
 
This is the worst myth ever, and I have no clue why it was brought up...

The software is what makes the difference to how the hardware performs. Comparing two different GPUs, even on the same CPU, is not a real comparison, as the software is not created equal across different hardware; on top of that, the GPU architectures are different, so one GPU won't receive the work in exactly the same way the other does.

Now, for a pure-performance FPS comparison it's about as close as you can get to a comparable test (except where the software is outright biased), but in the case of this myth it isn't, because the software's biases still carry over from GPU to GPU. So really the article is pointless.
 