AMD RX Vega 64 and Vega 56 Review

I don't think miners are to blame for anything here. The supply was AMD's issue, not theirs. From what I've read, miners didn't jump all over these at all, not even close. The profit margins on power-hungry GPUs like Vega would be absolute crap, even if they did mine at higher hash rates. If anything, prospective GPU buyers should be thankful Vega is in such short supply, since that saves you from making a terrible purchasing decision by buying that stupid junk. ;)
 
Supply doesn't seem to be that big of an issue. From what I gather, the initial batches got screwed up during shipping. 10k at launch is quite a bit.

If/when that initial supply lands and we see them stay in stock for a while, then we know it's not miners. If we see them sell out again, we know it is miners. Very few gamers are gonna buy these.
 
I can't see a lot of ANY one group buying these. The price puts them out of the "common sense" realm for both miners AND gamers. Unless you *really* hate Nvidia and just want anyone else, I can't see these things being hot sellers. I almost smell a sort of strategic shortage from AMD, thereby giving the impression that these things are in heavy demand, being sold out everywhere. But that might be the tinfoil hat talking. :)
 
It doesn't really matter, does it? All that will happen is the pig miners will just buy them all cheaper. It actually makes no sense, because now if someone thinks "Hey, I can afford Vega and it's worth that much", they can't buy one anyway.

Seriously, Vega is like Cartmanland..

"So much fun at Cartmanland, but you can't come!"

I love you for this. I properly laughed.
 

Haha, that ep is one of my all-time faves. So many Cartman classics...

You bet your fat clown ass it does! :D Oh wait, no, the fat clown ass was from Ladder to Heaven.

A where were yow? when dey built that ladder to heaven? did you feel like cryin'? or did you think it was kinda gay?

hahahahahaha !!!
 

:p

It also reminds me of the water park episode where any more pee in the pool and the whole world would explode.
 
Good video from Gamers Nexus talking about HBM vs GDDR5.

https://youtu.be/p9ih5vmcDEk

Apparently GDDR5 would be about $120 cheaper per card for the same amount of RAM, and it looks like AMD was kind of forced to stick with HBM for Vega because of power draw. HBM draws significantly less power than GDDR5, and Vega would suck down way too much juice without it.
 

The main thing that Gamers Nexus misses here is that while they are looking for capacity parity with HBM2, what they should be looking for is bandwidth parity, i.e., enough bandwidth to let the GPU run well.

Costing the capacity alone is not good enough when it is the bandwidth that is the limiting factor. There is a reason why Nvidia used GDDR5X over GDDR5. Then also consider the recent reports of memory overclocking improving RX Vega performance, i.e., Vega likes good memory bandwidth.

Long story short: the pricing comparison needs to be done using a configuration that matches bandwidth, not capacity.
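To put some rough numbers on the bandwidth-parity point (a back-of-the-envelope sketch using the commonly published figures: Vega 64's 2048-bit HBM2 at roughly 1.89 Gbps per pin, versus a typical 8 Gbps GDDR5 part on a 256-bit bus; peak bandwidth is bus width divided by 8, times the per-pin data rate):

```python
def peak_bandwidth_gbs(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * pin_rate_gbps

# RX Vega 64: two HBM2 stacks, 2048-bit bus, ~1.89 Gbps per pin
hbm2 = peak_bandwidth_gbs(2048, 1.89)   # ~484 GB/s

# Typical high-end GDDR5 config: 256-bit bus at 8 Gbps per pin
gddr5 = peak_bandwidth_gbs(256, 8.0)    # 256 GB/s

# Bus width that 8 Gbps GDDR5 would need to match the HBM2 figure
needed_bits = hbm2 * 8 / 8.0            # ~484 bits -> in practice a 512-bit bus

print(f"HBM2: {hbm2:.0f} GB/s, GDDR5 (256-bit): {gddr5:.0f} GB/s, "
      f"GDDR5 bus needed: ~{needed_bits:.0f} bits")
```

A 512-bit GDDR5 bus (as on the R9 390) means more memory chips, more board area, and more power, which is exactly the trade-off a capacity-only cost comparison hides.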
 
I reckon this will all be a storm in a teacup come November, when the AIB cards are out. If you're a gamer then Nvidia is your card; if you're more towards compute then Vega will be the card. Personally I will be getting a Vega, if only because Nvidia are showing some of Intel's arrogance with regard to development and pricing. AMD need to be kept in the game, or the two gorillas will tell you to grab your cheeks and bend over while they take a run at you...

I do find it a bit nauseating watching reviewers get all high and mighty about pricing when they don't personally pay for any of the nice kit they play with. Anyone with multiple 1080s lying around on shelves is taking the P massively, tbh (thinking of J2C, for example).
 

Reviewers get all high and mighty because they are simply defending us as consumers. As for your second comment: yes, reviewers do get nice kit to play with, but they are just products (or samples) supplied by the manufacturers for the sake of the reviews, and eight times out of ten the manufacturers either want them back or they are sent on to the next reviewer.

As for seeing hardware "lying around", it's partly by design (a sort of advertising, if you will) and partly so that the reviewer has a catalogue of hardware they can refer to in future reviews. It's not that the reviewer is throwing it in our faces, singing "look at what you don't have and I get to keep, it's all mine, check out my precious" like some creepy Gollum. I'm pretty sure TTL has covered this a few times in his subscriber videos.
 
Yeah, I agree with Wraith. I would rather they argue our case than keep saying "Well, I think this £1000 CPU is great value for money!". It starts to get annoying, especially when you have people like Linus and Jay continually showing off expensive hardware they either got for free or bought with their enormous piles of cash. The computer hobby is expensive and snobby enough as it is!

It's like when the Ryzen 3 1200 was reviewed. Most of them just labelled it crap. Custom PC gave it a gold award. That is how it bloody should be; we don't all have Ryzen 1600 cash!

I think Adored said it best: AMD used reviewers. Literally used them to make their product look good, then pulled a Frito Bandito number on the prices.

If no one says anything, AMD will think they did the right thing. It's about time we bloody stood up and said "This is BS", because if the reviewers don't, what does that leave us? A voice on a quiet forum? A video on YouTube that gets 5 views because it doesn't involve four Titan Xp in quad SLI?

This whole racket (and it is just that) is rotten all the way down to the reviewers, and it's up to them whether they decide to be rotten or not. I hope Jay gets his call-out from Adored about being two-faced and it sinks in, I really do.
 
The main thing that Gamers Nexus misses here is that while they are looking for capacity parity with HBM2, what they should be looking for is bandwidth parity, i.e., enough bandwidth to let the GPU run well.

Costing the capacity alone is not good enough when it is the bandwidth that is the limiting factor. There is a reason why Nvidia used GDDR5X over GDDR5. Then also consider the recent reports of memory overclocking improving RX Vega performance, i.e., Vega likes good memory bandwidth.

Long story short: the pricing comparison needs to be done using a configuration that matches bandwidth, not capacity.

The point of the video was just about cost: HBM2 is so expensive, which is why the cards are expensive. There is no data on how to cost a bandwidth-equivalent configuration; it's hard to do.
 
Tom, you've hit the nail on the head with this one, I couldn't agree more.

This year was an upgrade year for me. Needing to replace my i5-2500K, I pre-ordered a Ryzen 7 1800X, which meant I suffered for a few months, but it was nice seeing AMD stick it to Intel and drive a complete shift in the market.

I decided to stick with my Crossfire Sapphire R9 290 Tri-X OC until AMD went ahead with Vega, and possibly upgrade to a single card since multi-card setups aren't getting love any longer. Unfortunately I had to take another road: since Vega didn't deliver on any front when compared to the competition, I went ahead and ordered myself a GTX 1080 Ti. I wasn't even interested in waiting for aftermarket cards. At the prices AMD are charging for reference cards, the aftermarket pricing will be atrocious at best, giving Nvidia a clear edge.

Sorry AMD but I don't think you've managed this one correctly. As a neutral buyer on all fronts, I'm passing on this one.
 

This perfectly encapsulates the general consensus of the 'average' tech-fiend and gamer right now who are neutral in the fight between Nvidia and AMD. Good comment.

After watching Gamers Nexus' video on the cost of Vega, it's clear that decisions made years ago have been detrimental to AMD, in the past, right now, and for the foreseeable future. They need a huge change, or else Fiji and Vega are the kind of GPUs we'll continue to get from AMD: hot, power hungry, expensive to design and produce, slow to market, and only scraping by in raw FPS. I have high hopes for Navi, but only because of Ryzen's Infinity Fabric. And even then I still think Volta will be more powerful and more efficient. Intel are only behind by a few months (according to Coffee Lake leaks), and likely had the hardware to beat Ryzen but chose to milk consumers for another generation or two.
 

Indeed, Intel got too greedy. I was in the market to upgrade for a year, around when Broadwell-E was released. I need the threads for content creation, but the prices Intel wanted were just too steep for me. For the time being I will be sticking with AM4; it's working quite well for me.

In my eyes, Intel got lazy and figured they had this under control. I think there were even some official statements from Intel saying they weren't worried about the situation. Once Ryzen was released, it seems all hell broke loose at Intel, and they're still in a continuous scramble, with an increased release rate, to recover. This despite the release issues with Ryzen; it took me a good couple of months to get the system stable.

Nvidia on the other hand kept delivering with progress year after year instead of waiting for the competition to deliver something.

Time will tell what Navi will bring, but for the time being, the only fair argument I hear in favour of AMD is "Freesync is cheaper", which it is. Other than that, why bother?
 

The Freesync vs G-Sync value debate is no longer as valid, in my opinion. G-Sync monitors may be €200 more expensive than their Freesync counterparts, but Nvidia's GPUs (in the high-end sector) are significantly better. Not only that, but they come out quicker and with fewer teething problems; they come out exactly when consumers want them. That makes G-Sync the safer bet. If you have the money, buy G-Sync/Nvidia; don't bank on Radeon/Freesync. That's my personal opinion. I wish I had bought a 980/G-Sync setup and then upgraded to a GTX 1080, instead of a Fury/Freesync setup. At the time the Fury/Freesync setup was slightly better value for money and offered more performance, but the 980 Ti offered more performance for the money than the Fury X, and the 980 Ti has been replaced for months now, while the Fury X has been a good card but not a good purchase. That's a lot of potential time enjoying a graphics card gone to waste. Sadly I couldn't afford the 980 Ti/G-Sync setup, but if I could have, I would have.
 

So far I have not had the opportunity to try out G-Sync for myself; however, I've been sent a Freesync monitor to review (I just started doing reviews for the local community here in Malta) and should be sent a G-Sync monitor as well.

To be honest, I wasn't much impressed with Freesync, and if G-Sync performs much better I may end up changing monitors altogether. I don't mind paying the premium if something just works.

What saddens me the most at this point is that I've seen some posts locally from "elitists" looking to buy Vega 56 cards instead of 1070s to get into Freesync. I don't know what the state of Freesync 2 is, but I have my doubts it will be amazingly better.
 
From what the majority of people I've spoken to say, G-Sync and Freesync are pretty much the same for the average user. G-Sync is obviously better in every way besides cost, connectivity (Freesync works over HDMI), and the fact that Intel supports it (not much of a selling point for us), but for the most part they're a very similar experience. If you have a high-end graphics card (1070/980 Ti/Fury X/Vega 56 or above) and a 1440p/144Hz adaptive-refresh-rate monitor, you'd be hard pressed to tell the two apart.
 

Not all Freesync monitors use HDMI for it; some of them require DisplayPort. For instance, the one I'm using right now, the AOC G2460PF, will only do Freesync via DisplayPort.
 