OC3D AMD Fiji Owners Club

Yeah, that's what I've been seeing in reviews as well. I was a tad concerned they were being sent golden samples of the Fury—if you call that kind of an overclock a golden sample, what with how spoilt we've become with Maxwell—so I'm glad to hear it's consistent for average consumers as well.


As for monitors, I had originally intended on buying an ROG Swift or the Acer Predator. G-Sync is a superior version of VRR in general

Not superior, they are basically identical, just two different ways of doing the same thing. The downside they both share is that they add a little input latency compared to having VRR off. All the reports and data suggest they are nigh on identical; one is just cheaper and open source :p


My Fury X, for example, can only hit 1120/500 without volt tweaking. It's a below-average chip I guess :(
I'll probably be RMA'ing it though and hopefully get a better one. The AIO fan is starting to make really loud coil whine and motor noises. The issue only just started, but it's louder than the minimal coil whine from the actual Fury X card itself. So probably time to RMA.
 
I HIGHLY doubt I'd be able to tell the difference between G-Sync and FreeSync if they're implemented in a very similar way. I'm really looking forward to seeing the impact, even more than 1440p to be honest. I probably shouldn't hype it too much. :p

I don't have a lot of faith in AIO coolers. I've seen too many of them with annoying pump noises, including my personal AIO from Cooler Master. I'd rather have the sound of air 'whooshing' than the grind of a tiny pump. I don't see the point in water cooling a card that can't overclock with voltage anyway. Water cooling is for quietness, yet the pump is noisy (for some). Water cooling is for overclocking, yet the card is locked, and even with voltage it scales poorly. Water cooling is for looks, yet you're tied to a very specific (although sexy) design that AMD decides. Unless they couldn't wrangle the power consumption and temperatures of the Fury X with an air cooler, I feel the Fury X is too much of a niche product. But maybe I see it that way because it doesn't fit in with my ideas, while it might for a lot of other people. Which means it's not a niche, it's just that I don't like it. :p
 

I doubt it too, because they already are implemented in a similar way :p

The voltage isn't locked. Just waiting for the third-party suites to hurry up and release versions that support these dang cards. Not one has been updated yet. Very annoying. Oh, and how can you say voltage scales poorly when you can't change it? :D:p
 
I thought voltage was locked within Afterburner and all that?

Do you mean that AIB vendors will be releasing newer versions of the Fury X with voltage scaling?


Did you see the article from TechPowerUp, who found a way to unlock the voltages?

https://www.techpowerup.com/reviews/AMD/R9_Fury_X_Overvoltage/2.html

An extra 150W for 3-6 FPS. I can't tell whether that's a 4K benchmark—if it is, 5 FPS is pretty decent—but it's not good scaling, IMO.
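To put those numbers in perspective, here's a quick back-of-the-envelope sketch in Python. The +150W and +3-6 FPS deltas come from the post above; the ~40 FPS baseline and ~275W stock board power are illustrative assumptions, not figures from the review.

```python
# Quick perf-per-watt check for the numbers quoted above.
# The +150 W and +3-6 FPS deltas come from the post; the baseline FPS
# and stock board power are illustrative assumptions only.

baseline_fps = 40.0      # assumed ~4K average FPS at stock
baseline_power = 275.0   # assumed stock board power in watts
extra_power = 150.0      # reported extra draw after overvolting

for extra_fps in (3.0, 6.0):
    fps_gain = 100.0 * extra_fps / baseline_fps
    power_gain = 100.0 * extra_power / baseline_power
    print(f"+{extra_fps:.0f} FPS: {fps_gain:.1f}% more performance "
          f"for {power_gain:.1f}% more power "
          f"({100.0 * extra_fps / extra_power:.1f} extra FPS per 100 W)")
```

On those assumed baselines, a single-digit FPS gain costs a disproportionately large share of extra power, which is the scaling complaint in a nutshell.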
 

It's locked because the software can't read whatever it needs to read. They need to update their software so it reads and controls everything correctly. So we're just waiting for Afterburner or Trixx etc. to update their software.

I also find that "review" bogus and invalid. Voltage never scales with FPS; it scales with power consumption and heat. Clocks scale with performance. Add to that the fact that he didn't mention how he did it. Those clocks could have been obtained with just a cherry-picked core and average HBM.
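A rough way to see why that distinction matters is the textbook first-order dynamic power relation P ≈ C·V²·f: frequency buys performance roughly linearly, while the voltage needed to reach a higher frequency pushes power (and heat) up quadratically. A minimal sketch, with assumed voltages and clocks rather than measured Fury X values:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# Performance is assumed to track core clock roughly linearly;
# the voltages and clocks below are illustrative, not Fury X measurements.

def relative_power(v, f, v0, f0):
    """Power relative to a (v0, f0) baseline under P ~ V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

v0, f0 = 1.20, 1050          # assumed stock voltage (V) and core clock (MHz)
v1, f1 = 1.30, 1150          # assumed overvolted operating point

perf_gain = f1 / f0 - 1.0                            # ~ +9.5% clock/performance
power_gain = relative_power(v1, f1, v0, f0) - 1.0    # ~ +28% power

print(f"Clock gain: {perf_gain:6.1%}")
print(f"Power gain: {power_gain:6.1%}")
```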
 

1120/500 is average for a Fury X on stock volts and cooler.

I have benched 6 Fury Xs so far.

A golden one will do 1150/500 but it is not stable.

If you change the cooling to custom waterblocks the cards will do between 1140/500 and 1150/500.

I think the cards are very sensitive to temps. Mine with custom water never go over 36°C.

As to the air-cooled Fury, from the reviews I have seen they overclock to about 1100/500, possibly due to the air cooler not being as good as the AIO on the stock Fury X.
 
I'm not sure I'm understanding you right. Clocks do scale with performance, yes, but doesn't clock speed scale with voltage?
 

In the article you linked they are comparing the scaling of volts to FPS. That's not how it works, which is why I think it's bogus and invalid. He also claims to have unlocked the voltage and doesn't say how. That in itself invalidates the whole thing.


I just don't see how temps are the issue. They don't even rise that high. At my max clocks it only reaches 60°C, and even when I set the fan to spin up a lot to keep it closer to 50°C it made no difference to stability. No matter the temp, anything beyond 1120 artifacts. That was tested with 3DMark. Unigine (Valley and Heaven) for some reason crashes and freezes my PC, so I can't run those. Tested it with TW: Attila and had artifacts there too.
 
They're scaling FPS with power consumption, which is my personal contention with the card. Not everyone cares about that, but it is still a potential reason why this card is not a great overclocker, and why the design may be fundamentally flawed in that area. A 980 is a great overclocker: you add 100W for a huge performance boost. With the Fury X, you add 150W for an average performance boost. Obviously the architectures are completely different, but it still shows where AMD is suffering, despite their claims about HBM being so efficient—which it is. The 980 Ti is a good overclocker, but it needs water cooling for high overclocks unless you want to crank three fans to 3000RPM. When overclocking the 980 Ti, heat is a concern, as is power consumption, but it does bring a lot of additional performance.

I wouldn't say the test is invalidated just because they didn't specify how they did it. Maybe they were asked not to disclose it until a specific date, or maybe they wanted to keep the methodology to themselves to benefit their site and business, which would be a fair thing to do.
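For anyone who wants the "100W for a huge boost vs 150W for an average boost" point in concrete terms, here's a small sketch comparing marginal overclocking efficiency (extra FPS per extra watt). Every figure in it is an assumed, made-up number purely for illustration, not a measured result for either card.

```python
# Marginal overclocking efficiency: extra FPS gained per extra watt drawn.
# ALL figures below are illustrative assumptions, not measured values.

cards = {
    # name: (stock_fps, oc_fps, extra_watts)
    "Card A (assumed)": (60.0, 70.0, 100.0),   # big gain for +100 W
    "Card B (assumed)": (62.0, 66.0, 150.0),   # small gain for +150 W
}

for name, (stock_fps, oc_fps, extra_watts) in cards.items():
    gain = oc_fps - stock_fps
    print(f"{name}: +{gain:.1f} FPS for +{extra_watts:.0f} W "
          f"-> {gain / extra_watts:.3f} FPS per extra watt")
```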
 
So I installed my new ASUS Fury Strix today. So far, I'm happy.

It's quieter than my G1 Gaming 970 (at load and idle)
It's only about 8°C hotter (settled at around 69°C in GTA V)
No coil whine
The card feels well built
It looks better than my G1 970

Performance isn't earth-shattering in GTA V at 1080p, but I knew my overclocked 970 was only going to be slightly inferior in that game. That was my only real gripe with the Fiji line-up: GTA V performed much better with Maxwell. I'm going to overclock the Fury shortly. Hopefully I'll gain around 5 FPS at 1440p. That'll help when I set up my new BenQ XL2730Z. I'm currently testing the card on my 1080p screen in case there is an issue and I need to go back to the 970.

I ran Fire Strike normal once just to check it was working. No overclocks or tweaks. I'm using the 15.8 Beta drivers. I've seen it offer better performance than the official 15.7.1. My score is higher than Guru3d's in their review of the card. Either the drivers are helping out or my silicon is scaling well.

http://www.3dmark.com/3dm/8580814?

Gonna try and hit 16000 graphics score. I don't know if it's possible, but we'll see. That will then be matching the Fury X at stock.
 
I apologise for the triple post. I just thought I'd update my progress with the new GPU, for anyone interested.

Fire Strike was not stable at 1080MHz. Heaven was not stable at 1070MHz. GTA V would not even load at 1070MHz. This card is the worst overclocker I've seen in a while. Like, it's appalling. It's not even worth it. I gained 3.5 FPS in Tomb Raider going from 1000/500MHz to 1060/550MHz at 1080p. That will be less at 1440p when I replace my 1080p monitor with my new BenQ. So much for that dream of hitting 16000 in Fire Strike at 1100/550MHz. This is my second really poor performer in a row. It's really annoying that I could not even hit the average. I'm nowhere near it. I don't care about having the best. I just want the average. My CPU hits 4.5GHz at 1.3V. This is average. I'm happy.

This is my highest graphics score, and it wasn't even stable in other tests.

http://www.3dmark.com/3dm/8590117?
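As a neutral way to read those numbers, here's a small sketch comparing the size of the clock bump with the size of the FPS bump. The clocks and the +3.5 FPS delta come from the post above; the 60 FPS baseline is an assumption, since the post doesn't give an absolute frame rate.

```python
# Compare the percentage clock bump with the percentage FPS bump.
# Clock figures and the +3.5 FPS delta come from the post above;
# the 60 FPS baseline is an assumed figure for illustration only.

stock_core, oc_core = 1000, 1060     # MHz
stock_mem, oc_mem = 500, 550         # MHz
fps_gain = 3.5
baseline_fps = 60.0                  # assumed Tomb Raider baseline

core_pct = 100.0 * (oc_core / stock_core - 1)   # +6.0%
mem_pct = 100.0 * (oc_mem / stock_mem - 1)      # +10.0%
fps_pct = 100.0 * fps_gain / baseline_fps       # ~ +5.8% on the assumed baseline

print(f"Core clock: +{core_pct:.1f}%, memory: +{mem_pct:.1f}%, "
      f"FPS: +{fps_pct:.1f}% (assuming a {baseline_fps:.0f} FPS baseline)")
```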
 
If you're on W10: I also have some trouble with stability in benches. Sometimes they run at my 1100MHz OC, sometimes they crash. Same with stock for me too. 3DMark hasn't been updated in ages, and for me personally, Heaven/Valley locks up my PC on either my 390X or my Fury X (3 different drivers). It's probably just software.

Although in my experience, games always crash at your highest bench clocks. For me I always had to drop them a little bit to get them stable. It depends on the game, but for the games I play that's my experience.
 
Windows 7. 15.8 Beta drivers.

Yeah, games usually test my graphics cards more. Whenever I find my max stable benching overclock, I reduce it by about 20-50MHz (with Nvidia). It's not worth risking ruining a game for 1 FPS.

I might overclock it to 1060/520Mhz or something like that, just to satiate my cravings. The performance gained will be so small, but it's something.
 



I need to change out one of the 8-pin PCI-e cables. It used to be a red 6-pin, but James from Pexon was kind enough to build me a spare 8-pin when I ordered the 8-pin and 6-pin cables for my system. It's just that I really feel red would tie the look together better, so I might contact him. I have to send back a dead cable anyway.
 
Can anyone offer an opinion on the legitimacy of this report:

http://wccftech.com/amd-r9-fury-x-performance-ahead-nvidia-980-ti-latest-drivers/

I have always relied on TechPowerUp for the most detailed performance graphs, but are there logical reasons for Windows 10 offering such a big performance gain on AMD? Maybe there are for the Fiji chip, but did AMD plan the 200 series 4-5 years ago with Windows 10 in mind? Surely Windows 10 was nothing more than a basic concept back then.

I'm not into the wars between the two competitors; I don't care who's more powerful. I care whether the card I want is powerful enough for me and for the games I play. If upgrading to W10 sooner than I had planned results in such a performance boost, that's exciting. But I'm not sure where TechPowerUp is getting their Performance Summary from.

For example, here is the 980ti Lightning review at 1440p with Windows 10:

[GTA V 2560x1440 benchmark chart]


Here is the ASUS Fury review (my card) at 1440p with Windows 7:

[GTA V 2560x1440 benchmark chart]


No performance gain.


And in The Witcher 3, unless TechPowerUp have changed the settings of their benchmark, performance drops massively with Windows 10 on all GPUs:

Windows 10:

https://tpucdn.com/reviews/MSI/GTX_980_Ti_Lightning/images/witcher3_2560_1440.png

Windows 7:

http://tpucdn.com/reviews/ASUS/R9_Fury_Strix/images/witcher3_2560_1440.gif


Far Cry 4 with Windows 10:

https://tpucdn.com/reviews/MSI/GTX_980_Ti_Lightning/images/farcry4_2560_1440.png

Far Cry 4 with Windows 7:

https://tpucdn.com/reviews/AMD/R9_Nano/images/farcry4_2560_1440.gif

Again, no performance gain.



Anyone know what's going on here? Is that article nonsense, or are TechPowerUp losing it with their mathematics?
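One possible explanation (an assumption on my part, not something either site confirms): a "performance summary" chart is usually built by averaging each card's performance relative to a reference card across the whole game suite, so the game list, the reference card and rounding can move the summary even when individual game charts look flat. A minimal sketch of that kind of calculation, with made-up FPS numbers:

```python
# Minimal sketch of a "relative performance summary" calculation:
# average each card's FPS relative to a reference card across a game suite.
# Whether TechPowerUp/WCCFTech compute theirs exactly this way is an
# assumption; the FPS numbers below are made up for illustration.

from statistics import mean

results = {               # game: {card: average FPS}
    "GTA V":     {"Fury X": 52.0, "980 Ti": 58.0},
    "Witcher 3": {"Fury X": 44.0, "980 Ti": 46.0},
    "Far Cry 4": {"Fury X": 57.0, "980 Ti": 60.0},
}

reference = "980 Ti"
cards = {card for game in results.values() for card in game}

for card in sorted(cards):
    rel = [results[g][card] / results[g][reference] for g in results]
    print(f"{card}: {100 * mean(rel):.1f}% of {reference} "
          f"(averaged over {len(rel)} games)")
```

Changing which games go into `results` (or which card is the reference) can shift the final percentages noticeably, which may be all that's behind the differing summaries.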
 
What's the coil whine like on your Strix card? The coil whine on my Sapphire Tri-X OC is starting to get to me and I've only been living with it for 2 days!

Tried lowering clock speeds and overclocking, same result. Thinking if I RMA it I'll probably get the same result!
 