AMD R9 290X Review

As I look back at the thread, I see you've done your fair share of anti-AMD ranting. Whatever, dude; I'm not going to argue with you. You're entitled to your opinion like everyone else, and I'm not going to try to change it.

Anyway, I stand by my prediction of some nice chunky Nvidia price cuts. :)

Did you see TTL's last video? He explains pretty well why the 290X vs. Titan argument is utterly pointless. And now that he's mentioned there probably won't be any proper third-party coolers until 2014, I really don't see a reason to put the 290X over the 780, because that's already close to the Nvidia 8xx release.
AMD pretty much wrecked themselves with that stupid reference-cooler-only release just to put themselves on even footing with Nvidia.
And if you really read my past posts, you'll know I've had both positive and negative experiences with AMD GPUs, so I was unbiased about it until I read the review.
 
What the hell does AMD mean, it "utilises every watt"? That doesn't even make sense: if it used every single watt, there would be absolutely no heat dissipated. It's simple conservation of energy; that heat has to come from somewhere.
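For what it's worth, here's the energy balance behind that point, as a rough sketch (the wattage is my ballpark for a high-end card, not AMD's figure): a graphics card does essentially no work on its surroundings, so at steady state everything it draws from the PSU comes back out as heat.

```latex
% Steady-state energy balance for a GPU drawing P_in from the PSU.
% No mechanical work leaves the card, so essentially all input power
% is dissipated as heat into the case (~250 W is an assumed ballpark).
P_{\text{in}} = P_{\text{heat}} + P_{\text{work}}, \qquad P_{\text{work}} \approx 0
\quad\Longrightarrow\quad P_{\text{heat}} \approx P_{\text{in}} \approx 250\,\mathrm{W}
```

So "utilising every watt" can only be marketing speak for performance per watt; it can't mean the watts stop turning into heat.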
 
Thank you, TTL, for the review and for getting up early to get it out to us. After reading and watching the reviews of this card, I feel let down by AMD. Before I say anything else, let me say that most of the GPUs I've owned have been AMD cards. In fact, I'm in the process of selecting parts for a new system, but I have to factor in many things before I'll purchase anything.

I'm a single father of one and also a caregiver for a family member, so I don't have the funds to just go out and get whatever I'd like. My current system is five years old and has a 5750 in it. So for me, here are the pros and cons of this card.

Pros:
Performance: At the moment I use a 32-inch Sony Bravia as both my monitor and my TV. I play a few games and I like being able to play on a large screen.

Price Point: Yeah, who wouldn't like to have a great card at a low price, especially in my position?

Cons:
Heat: Since I have to be careful with what funds I have, I don't really get the chance to water-cool my computer, even though I'd like to. So apart from an AIO cooler, I use air cooling, and I don't want something that's going to put out that much heat inside a case.

Looks: When I build a system, I want the parts I select to look nice, not just work. So unless the third-party vendors are able to get hold of this card, I'd say no to having it, even if it were given to me.

Again, thank you for the review, Tom, and I totally agree with you. Unless the vendors can fix the heat and looks of this card, I'll just get a 780.
 
What the hell does AMD mean, it "utilises every watt"? That doesn't even make sense: if it used every single watt, there would be absolutely no heat dissipated. It's simple conservation of energy; that heat has to come from somewhere.

Maybe AMD expects you to use the R9 290X to run a steam turbine ;)
 
And if you really read my past posts, you'll know I've had both positive and negative experiences with AMD GPUs, so I was unbiased about it until I read the review.
So now you're admitting to having a bias? :p

By the way, as for the 800 series: you previously mentioned six months. However, if you're to believe Mr Demerjian of SemiAccurate, it's the end of next year. (He links to another article behind their paywall.)
http://semiaccurate.com/2013/10/05/much-nvidia-pay-origin-pc-drop-amd/
Probably because AMD is about to launch the new R7/R9 cards in the same time frame and Nvidia has literally no answer until Q4/2014.
Needless to say, take it with a grain of salt, as he's been wrong before, but he's probably a semi-more-credible source. :p
 
So now you're admitting to having a bias? :p

By the way, as for the 800 series: you previously mentioned six months. However, if you're to believe Mr Demerjian of SemiAccurate, it's the end of next year. (He links to another article behind their paywall.)
http://semiaccurate.com/2013/10/05/much-nvidia-pay-origin-pc-drop-amd/
Needless to say, take it with a grain of salt, as he's been wrong before, but he's probably a semi-more-credible source. :p

Obviously I'm not unbiased anymore; I built up an opinion. That's what happens when you read a review.
And I don't think I'm going to believe someone who works for a company named "SemiAccurate". :D
How the hell is he even supposed to know that Nvidia won't have an answer to a series whose specs aren't even in the works yet, because the current one hasn't been completed? Or does he mean the current R7/R9? That would completely wreck his credibility, because the current AMD series is only on par with what Nvidia released half a year ago, so unless they've been barbecuing all summer, they should already have something new.
 
Had to be done...
[attached image]
 
I can't believe some fanboys. A guy posted on an OC3D Facebook status, and I quote: "u will never find any AMD gpu in any workstation". Erm, Tom, what does Orca have in it again? Hmmmm.
 
I can't believe some fanboys. A guy posted on an OC3D Facebook status, and I quote: "u will never find any AMD gpu in any workstation". Erm, Tom, what does Orca have in it again? Hmmmm.

And the FirePro cards by AMD, specifically designed for workstations.
 
Yeah, I know, mate. I pointed it out; not sure if he'll see it though, as I couldn't tag his name in it for some reason.

I don't know what he considers a workstation; I guess that just depends on the software you're running on it. But yeah, still, the AMD FirePro cards seem not to exist for that guy. I guess he's on the CUDA render trip.
 
I hope we see aftermarket versions before the end of this year, however unlikely that may be.

The performance is very good, but the cooler is a major turn-off.
 
Tom's Hardware, Guru3D, AnandTech, and The Tech Report all compare this card to the Titan, because the 780 isn't even near it. I mean, the 290X is more on the level of the Titan, beating it most of the time at high resolutions, and it never loses to the 780 (though maybe I only looked at my games, or games I would play). How come this review says otherwise?

I played the Battlefield 4 beta with 4x MSAA on, and the game used 3.3 GB of VRAM across my three 1920x1080 monitors; I play on a single 7970 6GB. And that was the Shanghai multiplayer map, the only map in the beta and one of ten MP maps in BF4, so I don't know if there are even bigger maps that need more VRAM. So if you go Nvidia, you can't go with the 780 and its 3 GB, or the upcoming 780 Ti (although some sites say it will get 6 GB of VRAM, just like the Titan; speaking of which, that card can play BF4 on three monitors). The only other alternatives from Nvidia are the 770 4GB versions, or you can go even lower and pick up a 760 4GB version, but I wouldn't recommend a 256-bit memory bus (which is what the 760 and 770 have). So it looks like the only Nvidia option is the Titan. AMD has the 7970 6GB, which has a 384-bit bus, or this 290X with 4 GB on a 512-bit bus. I haven't seen a 280X with 6 GB, nor do I know how much RAM the 290 will have; almost certainly 4 GB as well.

All those sites pitch this 290X against the Titan, and even with this cooler it beats it, so with better coolers it would be even easier to choose this card over an Nvidia alternative.
 

Wow, first post and already so constructive!

In case you haven't noticed, Tom's point was that the R9 290X was aimed at the GTX 780, which actually outperforms the Titan in games under certain circumstances. So just because the 290X beats the Titan sometimes doesn't mean it will also beat the GTX 780 in those same areas.
 
Obviously I'm not unbiased anymore; I built up an opinion. That's what happens when you read a review.
And I don't think I'm going to believe someone who works for a company named "SemiAccurate". :D
Good for you.
 
I'm hoping that Sapphire, or another company that knows what to do with cooling, can sort the 290X out. I'm presuming they will be able to, although it might be quite extreme (the first out-of-the-box AIO-cooled card, anyone?). If not, Gainward Phantom 780 here I come!
 
I fail to see why Tom is taking so much flak for this review. He gave the card a performance award. Yes, he spent entirely too much time harping on the crappy cooler, but he still made legitimate points; he just should have spent less time on them. :)
 
The way I see it, Mantle won't replace DirectX. It can't (technically it could, but other factors mean it can't).
Someone mentioned that G-Sync costs money but Mantle is free. Mantle is only free if you have an AMD card; otherwise you have to buy one. If you have an AMD card, G-Sync isn't really an option unless you buy an Nvidia card, and even then you'd need to buy a G-Sync monitor regardless of which card you own.
G-Sync does have the advantage that it doesn't need to be specifically supported by the game. Mantle has to be coded into the game alongside DirectX. Mantle itself doesn't need DirectX, but the developers (or rather the companies employing them) do: your game sales won't be great if you exclude every Nvidia card, every Intel 'card', and the large number of AMD cards that don't use GCN. That's eliminating a large chunk of your customer base before you even consider who actually wants to buy the game. Mantle could only replace DirectX in a world where such a large majority of customers own a recent, high(-ish) spec AMD card that the games company doesn't care about the remaining people.
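To make the "coded alongside DirectX" point concrete, here's a minimal sketch (all names hypothetical, not any real engine's or Mantle's API) of what shipping both paths means: the engine carries two complete back-ends and picks one at runtime, and everyone without a GCN card takes the DirectX path anyway.

```cpp
#include <iostream>
#include <memory>

// Common interface that every rendering back-end must implement.
struct Renderer {
    virtual ~Renderer() = default;
    virtual void drawFrame() = 0;
};

// The Direct3D path: required for Nvidia, Intel, and pre-GCN AMD cards.
struct D3DRenderer : Renderer {
    void drawFrame() override { std::cout << "Direct3D path\n"; }
};

// The Mantle path: only usable on GCN-based AMD cards.
struct MantleRenderer : Renderer {
    void drawFrame() override { std::cout << "Mantle path\n"; }
};

// Stand-in for real GPU detection (vendor/architecture query).
bool hasGcnAmdGpu() { return false; }

std::unique_ptr<Renderer> createRenderer() {
    // Mantle can never be the only back-end: most of the customer
    // base has to fall through to Direct3D.
    if (hasGcnAmdGpu())
        return std::make_unique<MantleRenderer>();
    return std::make_unique<D3DRenderer>();
}

int main() {
    auto renderer = createRenderer();
    renderer->drawFrame();  // Same game logic, two render paths to maintain.
}
```

Two code paths means roughly double the rendering work to write, test, and debug, which is exactly why a studio only bothers if enough of its customers benefit.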
Personally, and I say this owning three GPUs that support Mantle, I hope Mantle quickly dies a death and some real work is done by all GPU manufacturers to make OpenGL as powerful and efficient as possible.

Regarding the CPU side of gaming and how games are using more threads: if a game uses multiple threads but still runs the 'main' process on a single thread, using the other threads for background tasks and such, then you still need good single-threaded performance for that main thread. If Mantle can balance the work so that usage across cores is more even, then that should help.
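As a toy illustration of that pattern (my own sketch, not taken from any particular engine): the workers below can spread across as many cores as you like, but every frame still funnels through one loop, so the speed of that one core caps the frame rate.

```cpp
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

std::atomic<bool> running{true};

// Background tasks (audio, streaming, AI) parallelise nicely...
void backgroundWork() {
    while (running) {
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
    }
}

int main() {
    std::vector<std::thread> workers;
    for (int i = 0; i < 3; ++i)
        workers.emplace_back(backgroundWork);

    // ...but the per-frame work below is serial: no matter how many
    // cores sit idle, FPS is bounded by this single thread's speed.
    for (int frame = 0; frame < 100; ++frame) {
        std::this_thread::sleep_for(std::chrono::milliseconds(10)); // simulated frame work
    }

    running = false;
    for (auto& t : workers) t.join();
    std::cout << "done\n";
}
```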
Also, anyone saying AMD CPUs, specifically Bulldozer and Piledriver CPUs, are good for gaming obviously hasn't played GW2. That game hates those CPUs. My Phenom II X6 1055T @ 3.8GHz outperformed my FX-8350 @ 4.5GHz by probably close to 50%, and my 4770K @ 4.3GHz is well on its way to giving me twice the FPS of the 8350.
So I guess it depends what games you play! I believe the original Crysis only uses two cores/threads. And yes, I sometimes like to play games that were released more than four days ago, so shoot me! :)

As for the 290X, the cooler does seem off-putting. However, after the 5870 I decided not to buy AMD cards close to launch; not because their release prices are unreasonable compared to the competition (except maybe the 7990), but because AMD seem much more willing to drop their prices. So I'm going to (hopefully) run two or three 7950s for a little while and see what happens. Maybe by then the 290X will have some better third-party coolers and even more reasonable prices, plus we might see what effect Mantle has and hear more about what sort of support it will get.
 
Hmm, I'll check Crysis's core usage next time, but as I said earlier it runs fine, so it isn't an issue for me. The same goes for the older games I mentioned. I should probably mention that I play TF2 more than anything else, so I'm not someone who only plays newer games. Also, older games = less intensive, so it never really is an issue anymore, at least in my experience. Definitely right about not buying AMD cards at launch though; the price drops on my current card still haunt me in my sleep. :D
 