The AMD RX 480 may have a reference-only launch

I actually welcome the reference cards, as it will be easier to get waterblocks for them than for AIB designs :)

Doesn't really matter. I don't expect any custom-PCB cards; probably not worth it to them. It'd increase cost and take away one of the major reasons to get a 480. We might get reference-PCB cards with better components, but nothing custom beyond that.

Historically, Nvidia has kept its reference coolers on point (although I'd certainly say they've never been amazing). AMD, on the other hand...

Personally I think this is a bit of a daft idea; most people don't buy reference cards unless they like the looks or need the blower-type cooler. That said, they will probably run cooler in Crossfire than your average non-blower cooler. I'd not buy a blower card if I'm only running one, or if I'm using it in a case with decent airflow.

That said, if it is indeed true that the cooler is good, the non-reference cards could be excellent toys. I'm building my old ITX rig into a cheap gaming PC for my younger brother, and this card was what I had in mind. My 780Ti has lost performance over time rather suspiciously, and I'd rather have the same performance from a card that isn't as much of a housefire.

In short, a dumb idea, but hopefully it doesn't cause as much of a nightmare as the 290X did.


Seriously?

I don't really care what cooler it has, tbh. I'm getting the 480 so long as it reviews well; I'm GPU-less atm. The only thing I want to know is whether AIBs improve OC potential. If not, then reference it is for me; I won't wait for AIBs if I don't need to.

290X cards were amazing. Just not the reference...
 
My decision-making is my choice, and you don't have to agree with my opinion on it. I didn't ask for an argument; you invited yourself into one. This is a forum, true, but nobody here wants to argue over everything the way you tend to. There was nothing to argue over in the first place. You can have your opinion, but I didn't ask for it. I'll drop it there.

You latched onto my comment; this one is as much on you as it is on me. It always takes two to argue. Your decision-making is up to you and wasn't really up for discussion anyway.

290X cards were amazing. Just not the reference...

I don't think he doubted the capabilities of the 290X; it's only about the cooler and the reference-only release. It certainly didn't do AMD any good back then.
 
290X cards were amazing. Just not the reference...

Everyone knows the cards were amazing, especially anyone who owned a 780Ti. AMD's genius idea to sell them as reference-only for six months damaged their sales quite badly, though, and the card developed a reputation as a housefire; a successor to the GTX 480, if you will.
 
Er, no. The vast majority are actually HDMI, which is superior to DVI. It's pretty dumb to keep using DVI exclusively if HDMI is an option. It's not as if professional monitors still use DVI only; most have moved on to better standards, i.e. HDMI and DP. DVI and VGA need to go.

yeah sure, hdmi 144hz for the win......... retard.
 
yeah sure, hdmi 144hz for the win......... retard.

HDMI 1.3 does 144Hz (@1080p)...

No need to be rude, especially when you're wrong.

-edit-
The issue with HDMI and high refresh rates is that they've never been a required part of the specification, so many companies don't support them atm.
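
For anyone who wants the numbers behind that claim, here's a minimal back-of-the-envelope sketch. It assumes CVT-RB-style blanking figures (roughly 160 pixels horizontal, 35 lines vertical; real monitor timings vary), and shows 1080p at 144Hz landing just under the 340 MHz TMDS clock ceiling of HDMI 1.3/1.4:

```python
# Rough pixel-clock estimate for a video mode. Blanking figures are
# assumed CVT-RB-style values; actual monitor timings will differ.
def pixel_clock_mhz(width, height, refresh_hz, h_blank=160, v_blank=35):
    return (width + h_blank) * (height + v_blank) * refresh_hz / 1e6

# HDMI 1.3/1.4 top out at a 340 MHz TMDS clock (10.2 Gbps raw).
clock = pixel_clock_mhz(1920, 1080, 144)
print(f"1080p @ 144Hz needs ~{clock:.0f} MHz")  # ~334 MHz, just under 340
```

So the bandwidth is there; whether a given monitor and cable combination actually accepts the mode is the unofficial part.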
 
yeah sure, hdmi 144hz for the win......... retard.

Not sure what you have against NBD; however, if you were to go onto a residential street and ask people what they use for their PC, chances are it's going to be HDMI. Only the more dedicated gamers would have used DVI, and by now they'd have upgraded to a FreeSync or G-Sync display. If you look at the bandwidth for HDMI and DP vs DVI (rough numbers in the sketch below), you can see why it is a good idea to do away with DVI in favor of them.

There's nothing stopping you, or anyone else who needs DVI, from using an adapter...
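
To put rough numbers on that bandwidth point, here's a sketch using the published per-link rates. These are raw figures before 8b/10b coding overhead, so usable throughput is about 80% of each:

```python
# Approximate raw link bandwidth per standard, in Gbps.
# TMDS links carry 10 bits per channel per clock across 3 channels.
links_gbps = {
    "DVI single-link (165 MHz)":  165e6 * 10 * 3 / 1e9,      # ~5.0
    "DVI dual-link (2x165 MHz)":  2 * 165e6 * 10 * 3 / 1e9,  # ~9.9
    "HDMI 1.3/1.4 (340 MHz)":     340e6 * 10 * 3 / 1e9,      # ~10.2
    "HDMI 2.0 (600 MHz)":         600e6 * 10 * 3 / 1e9,      # ~18.0
    "DP 1.2 (HBR2, 4 lanes)":     4 * 5.4,                   # 21.6
}
for name, gbps in links_gbps.items():
    print(f"{name}: {gbps:.1f} Gbps raw")
```

Even dual-link DVI sits below HDMI 1.3, and DP 1.2 roughly doubles it, which is the whole argument for retiring DVI.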
 
Everyone knows the cards were amazing, especially anyone who owned a 780Ti. AMD's genius idea to sell them as reference-only for six months damaged their sales quite badly, though, and the card developed a reputation as a housefire; a successor to the GTX 480, if you will.

I agree it did, but then Sapphire brought out the Vapor-X and MSI the Lightning, which are awesome coolers on the 290X.
 
Asus was just absolutely lackluster with the 290X.

My favourite 290X was defo the Lightning.

Asus didn't try at all. They just stuck the 780 cooler on it, which meant only one of the heatpipes really made any contact. You know you've f**ked up when XFX outdoes you in the cooling department.
 
I would have really liked the XFX one if the VRMs were cooled. Not sure if they actually changed that and cooled them in the end, though. Lovely-looking cooler, it was.
 
Asus didn't try at all. They just stuck the 780 cooler on it, which meant only one of the heatpipes really made any contact. You know you've f**ked up when XFX outdoes you in the cooling department.

Not the first time Asus screwed up on AMD cards, either. The DCUII coolers for the 79xx series were poorly seated all the time.

yeah sure, hdmi 144hz for the win......... retard.
Insulting people doesn't add anything to your argument; it only hurts your credibility. Even worse when you're wrong.
 
Well, 144Hz over HDMI is a bit iffy atm. On the other hand, dual-link DVI and DP always do the trick. Calling people names on this forum is another story, though.
 
Everyone (including myself) went nuts for the new three-fan Strix from last generation, but it hasn't been without a few glaring issues. My ASUS Strix Fury has been very solid (72°C max inside a case), but if you look at the reviews for the same card on Newegg, it's bombed. Mine cannot overclock at all, even with extra voltage, but it's a relatively quiet cooler with only slight coil whine. When Gigabyte release their G1 Gaming version of Vega, I'll probably go back to Gigabyte. Not positive, but I like Gigabyte.

To be more on topic, I'm not in favour of AMD releasing the RX 480 as reference-only. I think that will hurt sales, but I also appreciate why they might do it. I feel it hurt sales for the Fury X (I for one didn't buy a Fury X because I didn't want a liquid cooler), but maybe the RX 480 is so feckin' good that it won't matter what cooler is on it. The only issue then is the aesthetics; not everyone wants that minimalist red look.
 
Everyone (including myself) went nuts for the new three-fan Strix from last generation, but it hasn't been without a few glaring issues. My ASUS Strix Fury has been very solid (72°C max inside a case), but if you look at the reviews for the same card on Newegg, it's bombed. Mine cannot overclock at all, even with extra voltage, but it's a relatively quiet cooler with only slight coil whine. When Gigabyte release their G1 Gaming version of Vega, I'll probably go back to Gigabyte. Not positive, but I like Gigabyte.

To be more on topic, I'm not in favour of AMD releasing the RX 480 as reference-only. I think that will hurt sales, but I also appreciate why they might do it. I feel it hurt sales for the Fury X (I for one didn't buy a Fury X because I didn't want a liquid cooler), but maybe the RX 480 is so feckin' good that it won't matter what cooler is on it. The only issue then is the aesthetics; not everyone wants that minimalist red look.

I just want the 480. If the reference card is good and has lots of OC potential, yay! If not, looks like I'll be waiting. If it's only good for light overclocking but can stay silent, I'll end up getting one for my brother; he wants an upgrade and will pay for it. Good thing too, as his 280X is getting old (still surprisingly fast for 1080p, though).
 
Does HDMI fully support 1920x1200 @ 60/120/144? I'm loath to get 1080p screens; I'm too used to the 16:10 ratio over 16:9. (Saying that, I have mine on DP anyway; I'm just curious.)
 
Does HDMI fully support 1920x1200 @ 60/120/144? I'm loath to get 1080p screens; I'm too used to the 16:10 ratio over 16:9. (Saying that, I have mine on DP anyway; I'm just curious.)

HDMI 2.0 (2.0a to be specific, I think) should do it. The main issue is that while HDMI may have the raw bandwidth, high-refresh modes are often left out of the spec, so support is hit and miss. For example, HDMI 1.3 can do 1080p at 144Hz because the bandwidth is high enough, but it's not an official requirement of the standard, so chances are you'd need something like a Korean monitor and you'd have to mess with a couple of settings in NV's control panel.
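
Running the same rough estimate as earlier in the thread for 1920x1200 (same assumed blanking figures, so treat it as a sketch, not a spec sheet): 60Hz and 120Hz fit within HDMI 1.3/1.4's 340 MHz TMDS clock on paper, while 144Hz needs HDMI 2.0's 600 MHz.

```python
# Same rough CVT-RB-style estimate used for 1080p earlier; blanking
# figures are assumptions, real monitor timings differ.
def pixel_clock_mhz(width, height, refresh_hz, h_blank=160, v_blank=35):
    return (width + h_blank) * (height + v_blank) * refresh_hz / 1e6

for hz in (60, 120, 144):
    mhz = pixel_clock_mhz(1920, 1200, hz)
    verdict = "fits HDMI 1.3/1.4" if mhz <= 340 else "needs HDMI 2.0"
    print(f"1920x1200 @ {hz}Hz: ~{mhz:.0f} MHz -> {verdict}")
```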
 
HDMI 2.0 (2.0a to be specific, I think) should do it. The main issue is that while HDMI may have the raw bandwidth, high-refresh modes are often left out of the spec, so support is hit and miss. For example, HDMI 1.3 can do 1080p at 144Hz because the bandwidth is high enough, but it's not an official requirement of the standard, so chances are you'd need something like a Korean monitor and you'd have to mess with a couple of settings in NV's control panel.

In which case, I'm glad I stuck to DisplayPort screens :)
 
Just my experience...

Aside from some very early Win10 issues and one game that refuses to cooperate (World of Warships is still pretty early in development), I've not had any horrors. Temps are a little higher on the top card (10 degrees or so) and obviously they use more power than a single card; however, they do look better as a pair, IMO.

It's also a good way to upgrade if someone has a lower-end card atm and only a couple of hundred dollars to spare: buy one now and a matching twin in a few months, when they'll probably be even cheaper. A 1070 is a big investment for a lot of folks.

That's good to hear; however, I play a lot of older games, which most likely don't have CF/SLI support. Obviously one card is still more than good enough to run those games, but I've heard it can cause stuttering, and I'm not sure I'm willing to enable/disable CF all the time.
 