AMD R9 Fury X benchmarks leaked, significantly faster than the GTX 980 Ti

Kinda why I only mentioned the 290(X). Either way, apparently the bridge fingers are OK for 1080p, though not perfect.

Ya, I was elaborating on it :)
The fingers can transfer the data fast enough; it's just more efficient to do it through the bus, as there is much more headroom there with PCIe 3.0, and I'd guess lower latency plays a small part too. There's no data I know of that would explain the differences, though; I'd be interested in reading that.
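For what it's worth, here's a rough back-of-the-envelope on that headroom point (a sketch in Python; the ~0.9 GB/s bridge figure, the 4 bytes per pixel and the 60 FPS AFR target are my assumptions, not measured numbers):

```python
# Rough bandwidth check: XDMA (frames over the PCIe bus) vs the old CrossFire bridge.
# Assumed figures: ~0.9 GB/s for the external bridge, ~15.75 GB/s for PCIe 3.0 x16,
# 4 bytes per pixel, a 60 FPS alternate-frame-rendering target.

BRIDGE_BW_GBS = 0.9     # assumed old CrossFire bridge bandwidth, GB/s
PCIE3_X16_GBS = 15.75   # PCIe 3.0 x16 bandwidth, GB/s
BYTES_PER_PIXEL = 4
TARGET_FPS = 60

def frame_traffic_gbs(width, height, fps=TARGET_FPS):
    # With AFR, the second GPU renders every other frame and ships it to the
    # GPU driving the display, so roughly half the frames cross the link.
    return width * height * BYTES_PER_PIXEL * (fps / 2) / 1e9

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    need = frame_traffic_gbs(w, h)
    print(f"{name}: ~{need:.2f} GB/s needed | bridge {BRIDGE_BW_GBS} GB/s | PCIe 3.0 x16 {PCIE3_X16_GBS} GB/s")
```

Under those assumptions 1080p frame traffic fits comfortably over the bridge while 4K just about exceeds it, which would line up with the fingers being fine at 1080p but not ideal above it.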
 
Too bad AMD didn't get much praise for XDMA; funny how it works. I mean, it's not like they invented anything :v

XDMA is the way to go all in all.

Oh, and look at that, 1111 posts :D


I can honestly say I never had a single problem with AMD's drivers aside from the black screen on DisplayPort with 4K monitors under some circumstances. That was something both sides had, though.
 
Just did some more tests on the Titan X, which is roughly the same performance as the 980 Ti (around 5% faster), and I was getting on average 100 FPS in Sleeping Dogs at 4K using the same settings as the AMD review paper.

I ran it again with an older driver version as well and there was only a 5 FPS difference between drivers, so with the Fury X "apparently" getting 61 FPS these benchmarks don't look to be real.

I know there can be inconsistencies between benchmark runs, but after running the Sleeping Dogs benchmark over 10 times now and getting roughly the same result each time, plus or minus a few FPS, there really isn't any major difference. Something doesn't add up with the benchmark numbers AMD have released.

I expect Fury X performance to be better than what is stated in these benchmarks, or AMD are about to release a turkey.
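For reference, a quick sanity check using only the figures above (the 100 FPS and the ~5% Titan X lead are from my runs, the 61 FPS is from the leaked slide; this is just arithmetic, not a claim about what the Fury X actually does):

```python
# Cross-checking the leaked Sleeping Dogs 4K number against my own runs.
titan_x_fps = 100.0       # my measured average on the Titan X at these settings
titan_x_lead = 0.05       # Titan X is roughly 5% faster than a 980 Ti
leaked_fury_x_fps = 61.0  # figure from the leaked AMD slide

est_980ti_fps = titan_x_fps / (1 + titan_x_lead)
gap_pct = (est_980ti_fps - leaked_fury_x_fps) / est_980ti_fps * 100

print(f"Estimated 980 Ti at the same settings: ~{est_980ti_fps:.0f} FPS")
print(f"Leaked Fury X figure sits ~{gap_pct:.0f}% below that estimate")
# A card pitched as faster than the 980 Ti landing this far behind it suggests
# the leak's settings or scene don't match these runs.
```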
 
Why would you do something so Chrazey? :D

Quite interested in how big a role HBM will be able to play, though.
 
Just did some more tests on the Titan X, which is roughly the same performance as the 980 Ti (around 5% faster), and I was getting on average 100 FPS in Sleeping Dogs at 4K using the same settings as the AMD review paper.

I ran it again with an older driver version as well and there was only a 5 FPS difference between drivers, so with the Fury X "apparently" getting 61 FPS these benchmarks don't look to be real.

I know there can be inconsistencies between benchmark runs, but after running the Sleeping Dogs benchmark over 10 times now and getting roughly the same result each time, plus or minus a few FPS, there really isn't any major difference. Something doesn't add up with the benchmark numbers AMD have released.

I expect Fury X performance to be better than what is stated in these benchmarks, or AMD are about to release a turkey.

Dang, you're really OCDing over that crap ain't ya?

As I mentioned before, benchmarks are one of the things you will never, ever replicate. There are just too many variables.

Gibbo did say on OCUK that it's very good at 4K, though...
 
Dang, you're really OCDing over that crap ain't ya?

I just don't like fake numbers. It gives off the wrong impression, and benchmark numbers can be replicated within a few % if the environment is roughly the same, i.e. if the same OS and CPU are used. I've seen it multiple times.
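If anyone wants to check the "within a few %" claim themselves, something like this is enough (a sketch; the FPS readings are placeholders, not results from any particular run):

```python
# Quick run-to-run consistency check for a canned benchmark.
# The readings below are hypothetical placeholders; substitute your own runs.
from statistics import mean, stdev

runs_fps = [98.7, 100.2, 99.5, 101.1, 99.0, 100.8]

avg = mean(runs_fps)
spread_pct = (max(runs_fps) - min(runs_fps)) / avg * 100
stdev_pct = stdev(runs_fps) / avg * 100

print(f"mean {avg:.1f} FPS, spread {spread_pct:.1f}%, stdev {stdev_pct:.1f}%")
# If the spread stays in the low single digits, a huge gap between two setups
# at the same settings points at the setup, not at run-to-run noise.
```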
 
If you're going to go with bad drivers, I'll go with Nvidia lying to cover their backs :). Simplest answer as you say.

I own both Nvidia and AMD cards, and while CrossFire isn't as good as SLI, single cards from both are equal in terms of issues. "Bad drivers" is a tired cliché when it comes to talking about single cards. Excluding drivers built for reviewing, it seems.

Nvidia have been sneaky before (see the whole thing with 3DMark03 and the 5900 Ultra).

The simplest would be a marketing mistake. Also, AMD do have bad drivers - this is a flat-out fact. You can pick between nonexistent drivers (you have to run the beta if you need anything soon, e.g. more than once a year) and the limited draw calls.
I'm not familiar with either of those problems, and they seem to be over a decade old, so give credit where it is due; one mistake every 10 years or more is pretty good for any company in any industry.
 
I just don't like fake numbers. It gives off the wrong impression, and benchmark numbers can be replicated within a few % if the environment is roughly the same, i.e. if the same OS and CPU are used. I've seen it multiple times.

You've got as much chance of proving them fake as you have being handcuffed to a ghost.

I do believe it's actually illegal to do so, and AMD have enough bad press as it is without faking benchmarks. Whatever you want to believe, there's a very high chance they got those numbers under the circumstances they ran the benchmarks in.

I benchmarked a specific scene in Crysis 3 and on one run I got a min of 23 FPS, then on another a min of 43. Quite a bloody difference tbh.
 
You've got as much chance of proving them fake as you have being handcuffed to a ghost.

I do believe it's actually illegal to do so, and AMD have enough bad press as it is without faking benchmarks. Whatever you want to believe, there's a very high chance they got those numbers under the circumstances they ran the benchmarks in.

I benchmarked a specific scene in Crysis 3 and on one run I got a min of 23 FPS, then on another a min of 43. Quite a bloody difference tbh.

I agree in-game benching can be totally random, but canned benchmarks like the ones built into Alien Isolation, Sleeping Dogs, Tomb Raider, Batman, Mafia 2, Dirt, Grid, GTA, Hitman, Metro, etc. are very consistent when it comes to benching cards across systems.
 
I agree in-game benching can be totally random, but canned benchmarks like the ones built into Alien Isolation, Sleeping Dogs, Tomb Raider, Batman, Mafia 2, Dirt, Grid, GTA, Hitman, Metro, etc. are very consistent when it comes to benching cards across systems.

There are just too many variables though dude. Seriously, it's quite impossible to replicate results.

For example, when I first got my Titan Blacks I could get 150 MHz over EVGA SC clocks on the driver I was using. Fast-forward to the latest drivers and I can't even get 30 MHz; the rig just locks up.

And this makes an enormous difference.
 
There are just too many variables though dude. Seriously, it's quite impossible to replicate results.

For example, when I first got my Titan Blacks I could get 150 MHz over EVGA SC clocks on the driver I was using. Fast-forward to the latest drivers and I can't even get 30 MHz; the rig just locks up.

And this makes an enormous difference.

If we're talking clocks then yes, but if you can get within a few MHz of said benchmarks' clocks then it's easy to compare.
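One rough way to do that comparison is to normalise by the clock ratio (a sketch; it assumes FPS scales roughly linearly with core clock over small deltas, which breaks down if the run is memory- or CPU-limited, and all the numbers are placeholders):

```python
# Rough clock-normalised comparison of two benchmark results.
# Assumes FPS scales roughly linearly with core clock over small deltas.

def normalise_fps(fps, run_clock_mhz, reference_clock_mhz):
    """Scale a result to roughly what it would be at the reference clock."""
    return fps * reference_clock_mhz / run_clock_mhz

my_fps, my_clock = 97.0, 1190         # placeholder: my run
their_fps, their_clock = 100.0, 1215  # placeholder: the published run

adjusted = normalise_fps(their_fps, their_clock, my_clock)
print(f"Their result at my clocks: ~{adjusted:.1f} FPS vs my {my_fps:.1f} FPS")
```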
 
The simplest would be a marketing mistake. Also, AMD do have bad drivers - this is a flat-out fact. You can pick between nonexistent drivers (you have to run the beta if you need anything soon, e.g. more than once a year) and the limited draw calls.
I'm not familiar with either of those problems, and they seem to be over a decade old, so give credit where it is due; one mistake every 10 years or more is pretty good for any company in any industry.

Nah it wasn't the only time they tried to cheat/lie. There are other examples from around the same time.

A fact based on what? Availability does not mean bad drivers. Just because they don't release a driver every few days doesn't mean the drivers are bad. As I mentioned previously, I own both cards and I've had an equal amount of problems. So unless you want to admit they're both equally "bad", I'll say that AMD make drivers that are fine.

You're starting the same argument on every thread about AMD.
 