PowerColor shows off their Radeon R9 390X DEVIL GPU, but be warned, this is not AMD F

Many modern cards are designed to increase their clock speed until they reach the set thermal limit. The 290/X does it, the 280 does it, the 960, 970 and 980/Ti do it; I guess you get my point.
Temperature isn't a big deal until your card gets so hot that it gets louder AND starts to throttle.
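
In rough terms the boost logic works something like this little Python sketch (every number here is made up for illustration; the real firmware also weighs power and voltage, not just temperature):

[CODE]
# Rough sketch of thermally-limited boost behaviour. All numbers are
# hypothetical; real GPU firmware also factors in power and voltage limits.
BASE_CLOCK_MHZ = 1000   # made-up base clock
MAX_BOOST_MHZ = 1200    # made-up maximum boost clock
TEMP_TARGET_C = 80      # made-up thermal limit
STEP_MHZ = 13           # made-up boost/throttle step

def next_clock(clock_mhz: int, temp_c: float) -> int:
    """Raise the clock while under the thermal target, back off once over it."""
    if temp_c < TEMP_TARGET_C:
        return min(clock_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(clock_mhz - STEP_MHZ, BASE_CLOCK_MHZ)   # throttle back down

print(next_clock(1000, 65))   # running cool: boosts a step -> 1013
print(next_clock(1150, 85))   # over the target: throttles a step -> 1137
[/CODE]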

As much as I would have liked to agree with you on that, I think you have mixed up AMD and nVidia.

"try so hard to start stuff"....What are you on about? I'm talking about GPU's, not quite sure what I'd be "starting"....

Look at the reviews I gave you where the coolers are like-for-like. If you wanted to OC the MSI R9 290, it normally runs at 85°C, so you are going to have a temperature battle on your hands, hence why it would be a leaf blower.

The MSI card runs at 85°C. How is that not hot? Clearly you're just a fan of AMD and for some reason are not seeing what the entire industry knows - Maxwell destroys Hawaii. I think saying that you shouldn't mention AMD's market competition in a thread about AMD shows your actual agenda...

P.S. I'm running an R9 280X, clearly I'm biased towards Nvidia :rolleyes:

Mate, get your facts straight. Hawaii was competing with Kepler. The Hawaii chips are not DESTROYED by Maxwell. The 290X is only clocked around 1000MHz whilst the 980 is clocked at 1200MHz and boosts, yet the 290X is able to compete with the 980. Why is that? Obviously this isn't the whole story, however it's quite obvious that the Hawaii chip isn't DESTROYED by Maxwell.
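
For a rough sense of why raw clock speed doesn't tell the whole story, here's a back-of-the-envelope Python comparison (assuming the reference shader counts, 2816 stream processors for the 290X and 2048 CUDA cores for the 980, with a typical ~1216MHz boost; raw TFLOPS obviously isn't the same thing as gaming performance):

[CODE]
# Back-of-the-envelope single-precision throughput: 2 FLOPs per shader per clock.
def tflops(shaders: int, clock_mhz: float) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(f"R9 290X: {tflops(2816, 1000):.1f} TFLOPS")   # ~5.6 TFLOPS at ~1000MHz
print(f"GTX 980: {tflops(2048, 1216):.1f} TFLOPS")   # ~5.0 TFLOPS at ~1216MHz boost
[/CODE]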

P.S. I have owned many cards from both AMD and nVidia, so clearly I am not biased towards AMD :rolleyes:
 
I'll just leave this here :)
[Image: 3DMark benchmark result]
 
I have got it to 1200/1400; I'll push it further when I have some free time: http://www.3dmark.com/fs/5024072

Well best of luck to you when the time comes :) Hopefully it is able to go further.

It does look like you are getting pretty close to its limit though. For some reason these cards either suck or are really good at overclocking. Odd. Most of mine were good though :)

Always did like that cooler though, just not a big fan of the colors.
 
Get your facts right - the 290X does not at all compete with the 980; the 980 is cooler and faster, and that's before it's OC'd. The 290X is more comparable with the 970:

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-8.html

:cool:

If you're not biased, then why are you so confused about the fact that AMD really doesn't offer competition against the 980 and beyond?
 
So basically you are telling me that the 290X never matches or beats the 980? I have already made it very clear that the 290X was never meant to compete with the 980 in the first place.

If you're not biased, then why are you picking and choosing the information that you want to accept?

Your use of the word 'DESTROY' was also great, btw. Because that's clearly how it is between the 980 and the 290X (hopefully you're not so delusional that you cannot even understand sarcasm).

With your mindset, I may as well say that since the 980 costs twice as much as the 290X, the 980 should perform twice as fast. Or was it two times better in every possible way? Oh wait, that's not how it works. How silly.
 
Okay, calm down and understand what my original point was - if the R9 390X is a rebranded 290X, then AMD are going to have issues as they won't have anything to compete against the 980 if the Fury is priced, as rumours suggest, against the Titan, not the 980.

So now you're back on track, you can put down the pitchfork. ;)
 
Oh, so that's what it is about. Sorry, all I saw was stab stab stab and more stabbing. My bad, I guess the reputable members you were stabbing earlier were in the wrong, just not you.

You only seem to be acknowledging the cons from AMD's side. So what if the extra power draw costs you a coffee or something small? Does the 970 not still have a horrible memory config, or did they fix it already? Oh wait, it was designed to be like that. Either way, it was already confirmed that the chips would be redesigned.

The 980 is crippled by its meagre 256-bit bus. Either way, everyone's requirements are different, so who are you to say what is and isn't competitive? My triple 980 setup was worse off than my trifire 290X setup. At first I thought my 980s were just throttling badly, however even without any throttling they weren't able to handle 4K as well as my 290X setup.

You can OC the 970 and 980 as much as you want, however overclocks won't save you from a hardware fault.

My pitchfork is down now, however if you want to take another stab at anyone, it's always close by.
 
You make this place sound like an old man bar where you have a bunch of grumbling old guys who dislike anything different from their long-held beliefs. It's a forum where we are talking about tech, not stabbing anyone, so chill out dude :D

Who cares about the 970 memory config? Does it get the frames stated in benchmarks? Yes. Who cares about the rest then?

If you think the 980's 256-bit bus is a problem, you clearly don't know what you're talking about. The new design in Maxwell allowed for it, and it can be OC'd to almost Titan X levels. So if the card gets great frames, why is the bus a problem...? :confused:

Also, you haven't mentioned anything about this: if the 390X is a rebrand, then what do they have to challenge the 980??
 

You are trying to debate with others yet don't understand how a bit bus works? #facepalm

It most likely won't be a rebrand. It'll probably be improved upon by jumping to a newer GCN, 1.3. It wouldn't be hard for them to do. This has been said about 10x already...
 

"The memory bus is for assisting, it doesn't actually provide additional performance, it just needs to be big enough to avoid hindering performance. nVidia has never had a card with a 512 bit memory bus, theres a reason for that, it just doesn't matter. The 970 has a 1750 MHz memory clock compared to the 290's 1250MHz memory clock which results in it having 70% of the overall memory bandwidth of a 290 so that helps to compensate for some of it.

They get benefits from a narrower memory bus, the hardware is simpler and cheaper, and there are half as many memory traces to route through the board so there are gains from going narrower if you don't need massive amounts of memory access, and considering how well it is performing even with AA enabled it doesn't seem to matter.

A memory bus is like tires for you car, there is no point putting big wide ones on a Prius that just needs to drive around a city in good weather, its a lot of extra cost that will result in no performance gains if you don't have a good application for them and the power to make use of them."

Yes because I don't know how the bit bus works :)
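
To put numbers on the figures quoted above, here's a quick Python sanity check (assuming GDDR5's 4x effective transfer rate per memory clock; the clocks and bus widths are the stock ones mentioned in the quote):

[CODE]
# GDDR5 transfers 4 bits per pin per memory clock, so:
# bandwidth (GB/s) = memory clock (MHz) * 4 * bus width (bits) / 8 / 1000
def bandwidth_gbs(mem_clock_mhz: float, bus_bits: int) -> float:
    return mem_clock_mhz * 4 * bus_bits / 8 / 1000

gtx_970 = bandwidth_gbs(1750, 256)   # 224 GB/s over a 256-bit bus
r9_290 = bandwidth_gbs(1250, 512)    # 320 GB/s over a 512-bit bus
print(f"GTX 970: {gtx_970:.0f} GB/s, R9 290: {r9_290:.0f} GB/s, "
      f"ratio {gtx_970 / r9_290:.0%}")   # -> 70%
[/CODE]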
 
Nah, you do know how to google it though.



Also, I'm gonna leave this here. Double the price for the same fps? Gee, the 980 sure makes sense for 4K.

[Images: GTX 980 SLI vs Radeon R9 290X 8GB benchmark results at 4K on an AMD FX-9590 system]

Did you not see the quote marks? No, you're too busy trying to get a one-up even though the person's point WAS WRONG and I was right. You are some of the most obnoxious forum members I've ever come across, too busy trying to defend AMD rather than actually discussing the point like an adult.

So in the fringe case of 4K gaming, it has an advantage (link please, I would like to see that it's not just drivers)... But for anyone else, stick with the 970...
 

To be honest, you should probably be clearer than just putting two quote marks around your post. Don't insult all of the forum members in one big generalization, because what you said simply isn't true. OC3D has some of the most open-minded, helpful, and mature members compared to any other forum.
 

:lol: Mate, seriously, I don't know what I'm talking about? You might want to take a look at my sig.

I have both 980s and 290Xs and have first-hand experience with them. You have already seen the benchmarks posted; are you still going to keep saying that the 290X isn't competitive and that the bus doesn't matter? OCing the memory is only going to get you so far, and obviously every chip is going to OC well, right?

Sad really.

So basically what you are saying is that it's okay to overlook the shortcomings from nVidia, i.e. the memory config etc., but it's not okay to overlook AMD's extra power consumption etc.

OK, you are definitely not biased. So much for laying down the pitchforks.

Oh, and which point of mine was wrong again? I would really like you to point that out so that there is no confusion.
 