AMD release RX Vega and Frontier Edition launch dates

Yeah, unfortunately for me I am waiting for Vega. I'm stuck on my 7970, which has all but completely failed. It wasn't so bad until I bought an Acer X34; now, after about 15 mins of running even just on the desktop, I start getting artefacts. It wasn't an issue when I was only running 1080p. I don't want to get a 1080 because the X34 is a FreeSync panel, so for me Vega can't come soon enough.
 
Looking at things again, I'm starting to think that it was simply Nvidia that hit it big and AMD were left clawing their way back. Our perspective has been heavily skewed, with guys like AdoredTV insisting that Vega will have to beat big Pascal, or RedGamingTech suggesting that since Vega is a year later it should theoretically be more powerful than the relevant competition. We've seen what the 1080Ti can bring to 4k gaming and now we expect that same performance from AMD at a cheaper price. The more I look at it, the more I believe such a prospect will be impossible to deliver. While my theories are pretty rudimentary, they make sense to me. If they need correcting, please correct me.

OK, so, the Fury X was around 30% faster than AMD's previous flagship, the 290X. The 980Ti was about 30-35% faster than the 780Ti. It took AMD just under two years to release the Fury X after the 290X and gain that 30% performance. That gap has increased over time, but only by about 5-8%. Compare that to the difference between the 980Ti and the 1080Ti after only 20 months: according to TPU's performance summary, the 1080Ti is 80% faster than the 980Ti. AMD can't gain anywhere near 80% over the Fury X when their previous generational jump was only 30%.

Similarly, the GTX 1080 is approximately 30% faster than the Fury X. That means that after two years AMD will have to increase their performance between architectures by more than they did last time. That should be possible with a die shrink, but maybe 30% is still all they can do and we've been spoilt by Pascal.
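
To put some rough numbers on that, here's a back-of-envelope sketch using only the percentages quoted above. The one figure that isn't from the posts is the assumption that a stock 980Ti sits roughly level with a Fury X, so treat this as illustrative rather than gospel:

```python
# Back-of-envelope relative performance, Fury X = 1.00.
# Percentages are the ones quoted above; the "980Ti ~= Fury X" line
# is my own assumption, not something stated in the thread.

fury_x     = 1.00
gtx_1080   = fury_x * 1.30        # "1080 is approximately 30% faster than the Fury X"
gtx_980ti  = fury_x * 1.00        # assumption: stock 980Ti roughly level with Fury X
gtx_1080ti = gtx_980ti * 1.80     # TPU summary: 1080Ti ~80% faster than 980Ti

# If Vega only repeats AMD's last ~30% generational jump over its own flagship:
vega_if_30pct = fury_x * 1.30

print(f"Vega at +30% over Fury X: {vega_if_30pct:.2f} (GTX 1080 = {gtx_1080:.2f})")
print(f"Gain needed over Fury X to match a 1080Ti: {(gtx_1080ti / fury_x - 1) * 100:.0f}%")
```

In other words, a repeat of the usual ~30% jump only gets Vega to roughly stock-1080 territory, and matching a 1080Ti would need the full ~80%.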

I always said I only ever wanted heavily overclocked 1080 performance, but even that might not be possible, at least at launch.
 
When the 7970 was announced someone said it was AMD's Fermi. I didn't quite get that because it was a bloody good card (far faster than the 580), but the more time goes on the more I understand it. Basically the 7970 was a pretty hefty die, full of GCN stuff that made it hot and power hungry, and AMD's promise was that it would be fantastic in the future.

And in essence? It sort of was, but isn't. I.e. they were planning far ahead, and it ended up being too far.

Bulldozer was similar: forward thinking with a lot of cores that were power hungry. Did it work? No. It was far too soon for devs to shift over to 8 cores. So Bulldozer was a flop, even though when threaded correctly it could beat a 2600k. And it remained that way; support never came. Vishera was pretty bloody good tbh, near on Westmere core performance only without HT. Again though, it needed support, and because BD was considered a flop the support never came, and if it ever did it was too little too late. It certainly wasn't good enough for devs to want to spend the time coding in support for it.

So again, money wasted on tech that was power hungry and hot and overall very disappointing. Do I even need to go through what happened with Fury X? Not really, no. Briefly: hot, power hungry, with HBM which mattered not one jot in the real world.

AMD need to stop trying to reinvent the wheel and go with what works. Nvidia did it, and I don't understand why AMD wouldn't take heed of Nvidia's success instead of carrying on making kitchen sinks that cost a fortune and, oh, "might be good in about three years' time, by which point the rest of the technology has left them in the dust".

They keep banging on about 4k. It's all we've seen of Vega, nothing more, nothing less. It's like they stupidly think the majority are on 4k already or are about to switch. Yeah, see, it doesn't work like that. So again they are forward thinking too much, and by the time people actually DO go to 4k this card will be long in the tooth, the HBM2 sat there pretty much wasting space. They've just demonstrated to the world that two Vegas are needed for 4k, so there goes your expensive HBM2.

I don't get it. They brought in Raja, who is Indian: frugal, careful, knows how to get something done as cheaply as possible, like most Indian companies. That fitted extremely well with the RX 480. Cheap, cheerful, easy to produce, cheap cooler, etc. etc.

So why the f**k is Vega the complete polar opposite? That is the part that I don't get.

AMD - throw the kitchen sink in a skip and go back to what you are known for doing best. Cheap, cheerful, fast, best bang for buck. Stop wasting your time trying to make things for a future that isn't certain.
 
I am kinda worried they are not coming out with some sort of comparison video at this point. Normally they have something, even as silly as 480s in Crossfire going up against a 1080. Either the drivers are in terrible shape at this point and it's turning into another 2900XT, or it's a really monstrous card they want to keep under wraps. I have my doubts about the last one.
 

So far they have not even shown sub-4k performance. And they've dodged the obvious too. I.e. only DX12, or Vulkan (Doom), or Crossfire, or games running at 60FPS with Vsync on. Nowhere have they shown us DX11 performance at 1080p or 1440p. They just keep insisting it's a 4k card.

Well if that truly is the case then I hope that 1% of 1% is going to equal enough sales. And that's assuming that that 1% of 1% hasn't already gone Nvidia or will want to pay the high price of HBM2.

Nvidia have nailed Pascal, even though it probably cost them very little. They have also (very cleverly IMO) made it run on regular old memory, which reduces costs, which means they can increase prices and make a killing. Something AMD can't do with Vega.
 
Does anyone have any rough figures of how the RX 580 is going in terms of sales? It could just be that AMD is happy making decent profits in the mid-range without having to pump heaps of funds into high-end niche cards.
 
Ha! That's either terrible for AMD because they have supply issues or awesome for AMD because they can't keep up with the demand. :p

That wasn't the point I was making, was it? And not the question asked. However, I would say the latter; at least they have sold out and their marketing strategy seems to be working for them.
 
Rather than quote me, fella, could you please just answer the question asked by a forum member. I don't mean to come across as rude but this is tiring. There seems to be an elitist group on this forum and it's kinda boring IMO. Kinda like school: "if you're not in their circle" you get talked over. Well, if that's the case I'm out /end

Uh what? I was elaborating on your post? Am I not allowed to do that now? And as for the quote, I don't see a question in the post. So what am I supposed to be answering? Because if that is the issue you have, I'll try my best at answering it.
 
It's the mid-range market that earns the highest revenue for GFX card manufacturers. Oh, and BTW, look at OCUK stock :D https://www.overclockers.co.uk/pc-components/graphics-cards/amd/radeon-rx-580

You're right, the mid-range card is the biggest market segment for most suppliers. I have worked for a few that have deliberately restricted the sale of new mid-range cards and limited supply to drive up demand and sell more higher-end cards.

The high-end cards are interesting to a lot of enthusiasts because they give an indication of what might be possible with the mid-range cards at a price that's palatable to a wider range of consumers; let's face it, most people don't have £800 to spend on a new GPU. The 1070 was an exciting card because it gave last-gen high-end performance for a great price. I hope that AMD can saturate the mid-level market segment with great, cheaper cards so that Nvidia will then look at its pricing structure and the playing field will be a bit more even.

Computex is great for seeing new products like RX Vega and Frontier, but it often doesn't give you any idea of the final production model's performance once the drivers have matured.
 
Yeah, unfortunately for me I am waiting for Vega. I'm stuck on my 7970, which has all but completely failed. It wasn't so bad until I bought an Acer X34; now, after about 15 mins of running even just on the desktop, I start getting artefacts. It wasn't an issue when I was only running 1080p. I don't want to get a 1080 because the X34 is a FreeSync panel, so for me Vega can't come soon enough.

I'm sorry, but what? :huh:... The Acer Predator X34 ultrawide, 34" 3440x1440p monitor is not a Freesync monitor, it's G-Sync.

https://www.acer.com/ac/en/US/content/predator-x34-series

Super-smooth
Get the buttery-smooth gameplay you've dreamed of. NVIDIA® G-SYNC™ eliminates screen tearing and minimizes stuttering for legendary PC gaming.
 
I'm sorry, but what? :huh:... The Acer Predator X34 ultrawide, 34" 3440x1440p monitor is not a Freesync monitor, it's G-Sync.

He is talking about the XR341CK, which is a FreeSync monitor with a 40-75Hz range I believe. Or 30-75. Not quite sure. So what he meant to say was XR34 instead of X34, but I was also confused like you. I had to Google it to find a 34in Acer FreeSync monitor.
 
Uh what? I was elaborating on your post? Am I not allowed to do that now? And as for the quote, I don't see a question in the post. So what am I supposed to be answering? Because if that is the issue you have, I'll try my best at answering it.

My apologies, NBD, I totally misread your post. (I broke my monitor glasses by falling asleep with them on, so I have to use my close-work ones for the time being.)

Anyway, on topic, you could be right about the miners buying up all the 480/580 cards. I do hope Vega will be a success for AMD. As for me, I don't early adopt anyway; I normally wait a year after they are out and buy one at a cheaper price. I was going to wait for Vega to come out before upgrading, but it's too far away for me, and having a 13-year-old daughter that wants clothes every 5 minutes gets expensive :D.
 
He is talking about the XR341CK, which is a FreeSync monitor with a 40-75Hz range I believe. Or 30-75. Not quite sure. So what he meant to say was XR34 instead of X34, but I was also confused like you. I had to Google it to find a 34in Acer FreeSync monitor.

Yeah spot on, small typo on my part :S
 
TBH I thought the reason I could not find any 500-series cards in stock was supply issues, but I thought it was a bit strange because it was easy to get hold of the 470 and 480. Didn't realise the mining craze was back, but good for AMD. That also explains why that crazy board that ASRock are going to release was shown, haha.
 
He is talking about the XR341CK, which is a FreeSync monitor with a 40-75Hz range I believe. Or 30-75. Not quite sure. So what he meant to say was XR34 instead of X34, but I was also confused like you. I had to Google it to find a 34in Acer FreeSync monitor.

Ohh right, I see... Yeah, I was like "what is he on about here? :huh:..." :p
 
They have also (very cleverly IMO) made it run on regular old memory, which reduces costs, which means they can increase prices and make a killing. Something AMD can't do with Vega.

That is one question I have asked and have not gotten an answer on. For a normal chip that is not using Infinity Fabric to stitch multiple chips together, the newest GDDR5 would have been plenty and would have lowered the cost. When you get to using IF and stitching parts together, then HBM makes sense, because you want the speed, just like in Ryzen.

I think the main reason we are seeing it now is because AMD paid for all the R&D on HBM1 and got first pick at the fab for the launch of HBM2. They were able to soak up most of the inventory to keep it out of Nvidia's hands. I have not seen any numbers, but I also think they got a bit of a price break for the money they invested in HBM1.
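
For a rough sense of the bandwidth side of that argument, here's a quick sketch. Peak bandwidth is just bus width times per-pin data rate; the bus widths and rates below are ballpark public spec figures (and the Vega one is the pre-launch number), so treat it as illustrative rather than exact:

```python
# Peak memory bandwidth (GB/s) = bus width (bits) x per-pin data rate (Gbit/s) / 8.
# All figures below are ballpark public specs, not measurements.

def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return bus_width_bits * gbps_per_pin / 8

cards = {
    "Fury X (HBM1, 4096-bit @ 1.0 Gbps)":     (4096, 1.0),
    "GTX 1080 (GDDR5X, 256-bit @ 10 Gbps)":   (256, 10.0),
    "GTX 1080Ti (GDDR5X, 352-bit @ 11 Gbps)": (352, 11.0),
    "Vega FE (HBM2, 2048-bit @ ~1.9 Gbps)":   (2048, 1.89),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gbs(bus, rate):.0f} GB/s")
```

Which is the point really: a wide GDDR5X bus lands Nvidia in the same ballpark as HBM2 without the exotic packaging, so the HBM2 spend only really pays off if you need it for something like the IF multi-die route.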
 