AMD release RX Vega and Frontier Edition launch dates

Not themselves, but allow someone else to.

/long post, hope you got your reading specs on lol.

Many years ago (nearly ten, woohoo, my ten-year anniversary of being clean!) I used to code/design front ends for an emulator. That's about the simplest way I can describe it. I worked alongside a genius, who released fully working emulators he had coded from scratch. We had what we called a "scene" and I used to spend around 25-30 hours apiece drawing, lighting, coding in coordinates and giving these things away for free.

I guess because the top dog coder guy was a close personal friend I was considered his spokesperson.

That, dude, is about where the normal-sounding bit of it takes a U-turn.

As I said, I was basically dedicating my life to this "scene". I spent all of the time my wife was working busting my ass on these things. Consider that each front end could have 250 lamps. That means you need to draw the artwork in Photoshop (or clean up some real photos) and then darken the main image before lighting up 250 lamps manually. That means drawing around each one, going to Filter > Lighting Effects, picking an omni lamp, changing the settings for each lamp and then moving onto the next. So yeah, 25-30 hours per front end. I reckon I turned out around 500. Some were obviously easier than others but yeah. With any spare cash I had I bought resources (artwork on fliers etc). Here's what I got for it.

* People calling me the C word all of the time.

* Only one guy ever did something kind in return for all of my free work. That was during 8 years of it.

* I was constantly being nagged, asked and even threatened to tell people when the next release was coming. And if it was late or my mate decided they were being anuses he would not release it and I would get the full blame. I've had threats, death threats, threats of people breaking into my house and stealing the code, and so on. I also had one guy phone my place of work trying to get me sacked because I called him a name.

* Private info and beta versions were constantly leaked, defaced and so on.

* I was the scapegoat. So if my mate had an axe to grind he would make sure no one got any updates and of course, I was the stupid idiot on the receiving end as I was considered the spokesperson.

Now I know, this has nothing to do with AMD at all. However, I have seen human nature at its worst. So yeah, this is emulating gambling machines not graphics cards but it's pretty much the same thing. Dealing with a few thousand aholes who want it all now.

If, for example, you told someone Vega was being hard-launched and in stock on the 30th June (just an example date) then they would forget that in about two seconds and start demanding benchmarks. Then they would accuse you of doctoring the benchmarks, only running them in favour of your hardware, etc.

I PMed Wraith last week when I wasn't feeling too good (thank you for reaching out to me, see how much better that is than banning me?) and said this. I was being a bit philosophical at the time...

Sometimes my wife and I play this game. What you do is go on the internet and visit a forum/Youtube (whichever one you like) and say something that is not 100% either true or technically accurate. Then you start your stop watch and see how long it takes for some knobend to come along and correct you.

Now, wouldn't it be better if people did their real life stuff as quickly as they take care of that?


Or something along those lines. It pretty much goes "You can't please all of the people....".

Now over the years this "scene" I was involved in started demanding more and more from me. It was possible to actually lock and encrypt your front ends. But people whined because they could not change keyboard shortcuts or edit my work into their own and make changes. So I tried to appease them by basically giving them away unlocked. I wasn't happy about it AT ALL because there were several penii being penii but I did it hoping it would fix things.

It didn't. Then it was just something else. Now I used to love that "scene" and I nearly had a nervous breakdown trying to make everyone happy. It never worked. There was always someone starting crap or doing something bad to bring it down.

Example. One guy on there proved over several years that he could be trusted about as much as a perforated bucket. He leaked private software time after time after time. Now there was one rule you did not break in this "scene" and that was you don't talk about MPU5 (it was a brand new technology) and you certainly don't leak it out. That would cause serious legal problems for the main coder (not my mate, some other guy).

So what happened? Well, firstly someone leaked the forbidden tech and tried pinning it on me (even though I had emails that proved my innocence, because I had an older version). Then the same person basically leaked all of the source code to all of it (which was the legal problem for the owner) and then some bright spark discovered there were a couple of hundred lines of code (nothing in the grand scheme) from MAME, so basically dobbed him into the MAME team who then threatened legal action unless he made it all open source (which he had vowed he would never do).

So there, that is what happens when you try and appease the masses. Whilst the leaks involved in PC tech are far less damaging (and tbh sometimes planned IMO) they are still bad if they are real actual leaks. They can paint the wrong picture of an incomplete tech/product etc. Yet still people do it.

Doom 3 - leaked. Half-Life 2 MP? Leaked. You can sorta understand why Gabe keeps HL3 so well protected...

Human nature dude. All human nature. I've learned the hard way, but hey I have learned. I walked away in 2008 and never looked back ever. I sometimes check the forums (they are mostly gone or very quiet now) and there are some signs of life, but just the same old crap going on that I used to get all of the blame for.
 
I really wanted to wait for Vega but I need my PC to be up and running before it's gonna be released. I'll be picking up an EVGA 1080 Ti FTW3 or a 1080 FTW2. I wish them well and hope Vega ends up killing it.
 
TL;DR? Here is a shortened version, penned by a pal of mine this morning.

No, AMD said that they would be discussing Radeon RX Vega at SIGGRAPH in late July/early August - not that it would be launching. Scuttlebutt below decks suggests actual availability will be considerably later, with as few as 16,000 enthusiast-grade units available at launch and mainstream pushed through to 2018.

Can't attest to the accuracy of the latter claims, but can confirm that AMD has most definitely not stated a launch date.


Yet everyone is now assuming it will launch then.
 
/long post, hope you got your reading specs on lol.

Thanks for the comment. I haven't been a part of the hardware scene for nearly as long as other chaps here have (Ivy Bridge/Kepler/Southern Islands is the time I started getting back into PCs after a few years of hiatus) so I don't know as much as I'd like. It's good to hear about this stuff.

I'm not nearly as jaded as you are, though, as I've not experienced the kind of deception or debauchery you have. I've led a relatively solitary life. I'm a Jehovah's Witness with friends who have looked after me and helped raise me over the last 20+ years and have never let me down beyond the usual human error. I'm also considered disabled by the state so I don't have a full time job. In that way I've not been thrown into the deep end like others have. School was tough, but school is always tough. I can be quite gullible and naive, but also a little delusional about the 'real life' others have to live. I may not like life and have little interest in living, but I don't have a bad life. It's a weird contradiction.
 
https://www.youtube.com/watch?v=W6Digv4mJi8

Cold light of day. TWO Vega GPUs manage 81 FPS with near perfect Crossfire scaling @ 4k. A game that (in recent times at least) is notoriously easy to run.

Remember I called it - 1070 or thereabouts.
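For anyone who wants to sanity-check that, here's the rough back-of-envelope I'm doing. It's a sketch only: the scaling factor is my assumption from the "near perfect scaling" claim, and the ~65 FPS single-card 1080 Ti figure is the one I mention further down, from memory.

```python
# Back-of-envelope check on the Prey demo numbers. Assumptions (mine, not AMD's):
# "near perfect" CrossFire scaling of somewhere around 1.8-2.0x, plus a rough
# ~65 FPS single-card 1080 Ti figure for the same game/settings (IIRC-grade).

DUAL_VEGA_FPS = 81.0     # from the demo video, two Vega cards @ 4k
GTX_1080_TI_FPS = 65.0   # rough single-card figure, from memory

def single_gpu_estimate(dual_gpu_fps, scaling):
    """Estimate one card's FPS from a dual-GPU result and an assumed scaling factor."""
    return dual_gpu_fps / scaling

for scaling in (1.8, 1.9, 2.0):
    est = single_gpu_estimate(DUAL_VEGA_FPS, scaling)
    print(f"scaling {scaling:.1f}x -> single Vega ~{est:.0f} FPS "
          f"(~{est / GTX_1080_TI_FPS:.0%} of the 1080 Ti figure)")

# Lands around 40-45 FPS, i.e. roughly 60-70% of the 1080 Ti number,
# which is 1070/1080 territory rather than 1080 Ti territory.
```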

Thanks for the comment. I haven't been a part of the hardware scene for nearly as long as other chaps here have (Ivy Bridge/Kepler/Southern Islands is the time I started getting back into PCs after a few years of hiatus) so I don't know as much as I'd like. It's good to hear about this stuff.

I'm not nearly as jaded as you are, though, as I've not experienced the kind of deception or debauchery you have. I've led a relatively solitary life. I'm a Jehovah's Witness with friends who have looked after me and helped raise me over the last 20+ years and have never let me down beyond the usual human error. I'm also considered disabled by the state so I don't have a full time job. In that way I've not been thrown into the deep end like others have. School was tough, but school is always tough. I can be quite gullible and naive, but also a little delusional about the 'real life' others have to live. I may not like life and have little interest in living, but I don't have a bad life. It's a weird contradiction.

I visited southern Ireland when I was 16. I had a girlfriend who lived there. I remember at the time laughing at how backward they were and how innocent everything seemed.

Wish I'd stayed there tbh. Mind you, her brother would have killed me if he knew lmao :D
 
If that's all AMD are offering, the Fury X would be considered more of a success. It at least competes with the 980Ti now and handily beats the 980.

I still don't know why you keep pressing that Vega will only offer 1070-level performance. It makes zero engineering or marketing sense.
 
If that's all AMD are offering, the Fury X would be considered more of a success. It at least competes with the 980Ti now and handily beats the 980.

I still don't know why you keep pressing that Vega will only offer 1070-level performance. It makes zero engineering or marketing sense.

Maybe they are hiding true performance so they could surprise everyone.

Edit: Or maybe it is just not that good.
 
Maybe they are hiding true performance so they could surprise everyone.

Edit: Or maybe it is just not that good.

Yeah, it's quite possible they don't want to allow Nvidia the chance to drop the price of the 1080Ti or 1080 until Vega is out and ready, by which point it would be too late for new buyers to buy from Nvidia. Consumers would see that Vega performs better for less money and buy it, leaving Nvidia with no choice but to drop the prices and hope people will still buy their GPUs (which of course they will, but a lot fewer of them).

Or maybe it is just not that good.

/funnynotfunny
 
OK so the guys broke it down today. They said that at ultra the 580 in Crossfire was X FPS slower with the same settings. With everything taken into account a single Vega *could* be 46% faster than a 580.

AGF - the calculations were done ages ago and that was the prognosis. Given I don't take guff as reality I've just stuck to that. I do believe that at least one of them will be the 1070 equivalent.

Fury X was by no means bad at all IMO. It wasn't much slower at high res than a 980Ti. It was just far too expensive, and didn't have enough physical VRAM. Now had they used some sort of GDDR, made it easily available and nailed the 980Ti to a post when it came to price? It would have been another 290 success.

Instead they made it weird, too expensive, hard to get, and attached a boat anchor to it. Right about now is the time that the pumps are starting to fail, it would seem, leaving people without a card and no real way of fixing it. Which is lame.

And they're using HBM2 for this one. FFS, the memory is not the most critical part. But that will make it too expensive, too hard to get hold of etc and etc.

I wish to god they would just slap some GDDR on it and price it correctly. Then we'd be onto a winner.
 
If they do offer 1070-equivalent performance for less money it will be an absolute boom. That is the most popular segment of the market, and the place for the biggest profit.
 
If they do offer 1070-equivalent performance for less money it will be an absolute boom. That is the most popular segment of the market, and the place for the biggest profit.

HBM2 is the fly in the ointment. It's a serious bone of contention. Say they launch a Vega that's as fast as a 1070, but then need to charge 1070+ money for it.

It's just really, really stupid. Who is going to buy a 1070 for more than 1070 money this far into its launch? All Nvidia need to do is drop the price and AMD are screwed. And they can, because their GPU is much cheaper to produce.
 
OK so the guys broke it down today. They said that at ultra the 580 in Crossfire was X FPS slower with the same settings. With everything taken into account a single Vega *could* be 46% faster than a 580.

AGF - the calculations were done ages ago and that was the prognosis. Given I don't take guff as reality I've just stuck to that. I do believe that at least one of them will be the 1070 equivalent.

Fury X was by no means bad at all IMO. It wasn't much slower at high res than a 980Ti. It was just far too expensive, and didn't have enough physical VRAM. Now had they used some sort of GDDR, made it easily available and nailed the 980Ti to a post when it came to price? It would have been another 290 success.

Instead they made it weird, too expensive, hard to get, and attached a boat anchor to it. Right about now is the time that the pumps are starting to fail, it would seem, leaving people without a card and no real way of fixing it. Which is lame.

And they're using HBM2 for this one. FFS, the memory is not the most critical part. But that will make it too expensive, too hard to get hold of etc and etc.

I wish to god they would just slap some GDDR on it and price it correctly. Then we'd be onto a winner.

46% above an RX 580 is only slightly faster than an overclocked 1070. That means that within a two-year window, with a huge clock speed increase (400MHz at the minimum), a die shrink, an improved GCN design, HBC, and 8GB of HBM2, they have only managed a 25% increase over the Fury X. Yeah, I don't buy it. The 400MHz clock speed increase alone should be enough to match a 1080.

If it's true, I will have no reason to upgrade, especially if the price rumours of $600 are true. Goodbye AMD. I'll wait for Volta, sell my Freesync panel, buy a 2080 and a Gsync panel. Out of principle I may even wait for Cannonlake instead of going Ryzen. AMD would be clearly incapable of delivering the products I want. I'm not loyal to a fault. I bought a Fury and a 1440p Freesync panel because 1) I couldn't afford a 980Ti and a Gsync panel, and 2) because a Fury/Freesync setup was better value than a 980/Gsync setup and was better suited to the higher resolution. I have no reason to believe AMD cannot deliver something at least on par with those demands.
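For what it's worth, those two ratios at least hang together. A quick sketch using nothing but the rounded figures above (so treat the percentages as approximations, not data):

```python
# If Vega ~= 1.46 x RX 580 (the "46% faster" estimate) and also ~= 1.25 x Fury X
# (the "only 25% over Fury X" complaint), the implied Fury X vs RX 580 gap is:

vega_over_580 = 1.46
vega_over_fury_x = 1.25

fury_x_over_580 = vega_over_580 / vega_over_fury_x
print(f"Implied Fury X advantage over RX 580: ~{(fury_x_over_580 - 1) * 100:.0f}%")

# ~17%, which sounds about right for Fury X vs a 580 at higher resolutions, so the
# 46% figure really would mean only ~25% over Fury X - which is exactly why I don't buy it.
```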
 
46% above an RX 580 is only slightly faster than an overclocked 1070. That means that within a two-year window, with a huge clock speed increase (400MHz at the minimum), a die shrink, an improved GCN design, HBC, and 8GB of HBM2, they have only managed a 25% increase over the Fury X. Yeah, I don't buy it. The 400MHz clock speed increase alone should be enough to match a 1080.

If it's true, I will have no reason to upgrade, especially if the price rumours of $600 are true. Goodbye AMD. I'll wait for Volta, sell my Freesync panel, buy a 2080 and a Gsync panel. Out of principle I may even wait for Cannonlake instead of going Ryzen. AMD would be clearly incapable of delivering the products I want. I'm not loyal to a fault. I bought a Fury and a 1440p Freesync panel because 1) I couldn't afford a 980Ti and a Gsync panel, and 2) because a Fury/Freesync setup was better value than a 980/Gsync setup and was better suited to the higher resolution. I have no reason to believe AMD cannot deliver something at least on par with those demands.

TBH fella, the very fact they had to use two GPUs to run a relatively simple game is damning. And rather telling...

Surely they would want to boast how one card can run it at that speed?

The 1080Ti, IIRC, runs around 65 FPS at 4k max on one card.

Yeah, 1080Ti this ain't... And the price? lolz.

I hate to say it but the way things are looking right now? I was right all along. Sadly.

If they do offer 1070-equivalent performance for less money it will be an absolute boom. That is the most popular segment of the market, and the place for the biggest profit.

Then Nvidia simply reduce the price of the 1080 to the same as the Vega, drop the price of the 1070 down to lower than the Vega and AMD lose. It can't go like that.

What they need is a cheap card. Period, end of story etc. Cheap = victory (see also RX 480). Expensive = risky. Even more risky when Nvidia have been charging £600 for a £350 card for about a year.

They can easily swat aside any expensive effort from AMD and still make a sh**load of cash.
 
Was the Prey demo using Vega Frontier Edition?

Also, the Threadripper CPU might be causing issues with the demo. I know Prey is optimised for Ryzen, but the 1080Ti scores could have been with a 7700K which could be adding frames.
 
TBH fella, the very fact they had to use two GPUs to run a relatively simple game is damning. And rather telling...

Surely they would want to boast how one card can run it at that speed?

The 1080Ti, IIRC, runs around 65 FPS at 4k max on one card.

Yeah, 1080Ti this ain't... And the price? lolz.

I hate to say it but the way things are looking right now? I was right all along. Sadly.

I'm not sure that they were trying to demonstrate the performance of RX Vega in that demo; they were more trying to showcase the multi-GPU performance of the Threadripper platform. Vega was merely a footnote of the presentation; the main focus was all about Threadripper and X399.

Having said that, Alien, where are you getting these predictions from? I know you've been saying all along "around 1070 performance according to some guys". Are you able to post a link to your sources please? You may have already posted one in some other thread but I missed it. Or are you getting this from sources closer to the source? ;) haha
 
I'm not sure that they were trying to demonstrate the performance of RX Vega in that demo; they were more trying to showcase the multi-GPU performance of the Threadripper platform. Vega was merely a footnote of the presentation; the main focus was all about Threadripper and X399.

Having said that, Alien, where are you getting these predictions from? I know you've been saying all along "around 1070 performance according to some guys". Are you able to post a link to your sources please? You may have already posted one in some other thread but I missed it. Or are you getting this from sources closer to the source? ;) haha

I can't link you to it, no. It was all said in private email. There's also no point in me telling you the source of it either, as people would either not believe me or I would get someone in trouble.

I have been involved with the PC scene for a very long time. Pretty much solid since 2007, every day. As such I have met a few people who really know their onions. Scientists, mathematicians. Remember - it's all science. You cannot defeat solid science. And whenever a company tapes out a product and gives details it's always possible to use science to predict the performance. Clock speeds are a variable, temps are a variable, but the rest is all scientific.
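To give a rough flavour of the kind of maths involved (spec-sheet numbers plus a rumour-era guess at Vega's clock, so treat it as a sketch, not my source's workings):

```python
# Crude spec-based estimate: FP32 TFLOPS = 2 ops per FMA x shader count x clock (GHz).
# Raw TFLOPS is only a first-order proxy for gaming performance, but it's the kind of
# starting point you can work from before clocks, temps and drivers muddy the water.

def fp32_tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0

cards = {
    "Fury X   (4096 SP @ 1.05 GHz)":            (4096, 1.05),
    "GTX 1070 (1920 CC @ ~1.68 GHz boost)":     (1920, 1.68),
    "GTX 1080 (2560 CC @ ~1.73 GHz boost)":     (2560, 1.73),
    "Vega     (4096 SP @ ~1.50 GHz, rumoured)": (4096, 1.50),
}

for name, (shaders, clock) in cards.items():
    print(f"{name}: ~{fp32_tflops(shaders, clock):.1f} TFLOPS")

# On paper that puts Vega well clear of a 1070 - the open question is how much of it
# turns into frames, which is exactly where the clock-speed/temps caveat comes in.
```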

There are only a couple of times that they have not called it accurately. Well, they did, but there was a twist in the tale. The first time was Kepler. They predicted how good/bad (at the time it was bad) Kepler was going to be on the given clocks. Then all of a sudden Smoke (some guy who overclocks and was apparently testing for Nvidia) posted on Facebook something about a 30% BIOS. He's not English, so it wasn't in perfect English. A week or two later Nvidia released Kepler and, quelle surprise, it was running at 30% higher clocks than even they predicted.

I would surmise that after the fail of Fermi they were very apprehensive about clock speeds. IIRC Kepler was going to clock in at around 600-700MHz. As it turns out it was much higher than that. I guess they won the lottery, or just got paid back for not making the same mistake they made with Fermi (no tanks here).

So yeah, when a company throws a curve ball like that it's quite hard to predict where the performance will end up. However, reduce those clock speeds to the speeds that were leaked? Spot on.

As for the rest, like release dates and news of big things (like memory price hikes)? I've already explained that. I have a mate living out in Taiwan who works for Asus. I uploaded his pics from Computex the other day, noting of course how empty it was (because the public had not arrived yet) and how he got to meet Linus etc (not the first time :) )

Obviously working for Asus doesn't just mean he gets info from Asus. He is also a journalist and writes columns for magazines, so it's his job to know what is selling, who is selling it, sales figures etc etc.

You don't have to believe anything I say at all. Sometimes it will be spot on, sometimes not. But it's usually true. And I was told 1070 performance (which looking back over a year ago would have been awesome, only the 1080 was better) but now it's looking even worse with the 1080Ti being out there. At that time the 1070 was demanding £450, so AMD could have done a lot worse.

As time progresses AMD are losing ground. And HBM2 will only slow them further. Remember, it's a big die with the memory on it. If anything is malfunctioning on that die the whole lot has to go in the trash can. It cannot be binned or whatnot for other lower-end GPUs because it doesn't function correctly, so it's game over. Hence the high price.

And that annoys me greatly. It's like someone saying "Here, look at this cheap car. It's a lovely little car. However, we've put diamond-encrusted wheels on it that can't be changed, so the car costs £1m".

Was the Prey demo using Vega Frontier Edition?

Also, the Threadripper CPU might be causing issues with the demo. I know Prey is optimised for Ryzen, but the 1080Ti scores could have been with a 7700K which could be adding frames.

Couple of things. Firstly, it was running at 4k. I remember once running GTAV on a 3970X in a board with broken power phases so it would only run at 1.1GHz (the CPU, not its usual 4.9GHz). Anyway, I lost about 3 FPS in GTAV. That is how important the CPU was in that game at 4k. Even if the 1080Ti was running with a faster CPU you still have to question why AMD chose that game.

And it's probably because that is the only game in a *really* long time that supports Crossfire "to the metal", as it were. They made sure Bethesda (who they are now in cahoots with) put in the best support and that it works. And they were so proud of it they paid certain websites to "review" the 580 in Crossfire in that game. There was even a disclaimer saying that AMD had done so.

So they are cherry picking their best game with their best CPU to show off. However, if that is the case then why are they using two Vega GPUs to get only 81 FPS? Even if they are Frontier Editions it doesn't take the scientists among us long to basically decode the video, break it into logic and then figure out how they are performing.

I will send you a link to this "review" so you can see it for yourself. I can't post the link here, Tom doesn't like the place. Be sure to read the comments posted after too, as it's where the 46% comes from (over the 580).
 

Thanks for the explanation man, totally understand that you can't link or reveal your source if it's private correspondence. And I get that the performance of a GPU can be calculated by knowing a few of the specifications; my professional background is embedded systems, which is part of the reason I was so interested in learning more. Anyway, thanks again. As much as I would like for you to be wrong I have a sinking feeling that you are not.
 
Thanks for the explanation man, totally understand that you can't link or reveal your source if it's private correspondence. And I get that the performance of a GPU can be calculated by knowing a few of the specifications; my professional background is embedded systems, which is part of the reason I was so interested in learning more. Anyway, thanks again. As much as I would like for you to be wrong I have a sinking feeling that you are not.

I would also love for me to be wrong. I'm not really waiting for Vega but tbh I was kinda hoping it was where I would go next.

Someone said earlier today on another forum that they reckon AMD have been preoccupied putting Ryzen into the next-gen consoles and that may be why Vega has slipped. Apparently the 580 was going to be entry-level Vega and would come first, but instead they just rebadged the 480. This could be rather telling tbh. I mean, it makes sense and all. I guess they felt they had to do something.

Pretty obvious though that Vega is their lowest priority right now.
 