AMD Releases Additional Benchmarks for their Radeon VII GPU

It's a decent increase, but it's just not good enough. I expected better. Had they priced it at $600 instead of $700, that would have made a real difference, but right now it seems to me that AMD is content with safely being No. 2.

And what the heck is up with that cooler design?! It's certainly not winning any beauty contests, and I have my doubts about its efficiency with so much of the sides being closed off.
 
There are a few ways of looking at this.

1. It's very poor.
2. AMD finally reached 1080ti performance, but at a price.

And so on. I quite like the cooler; it reminds me of the 7990. What I don't like is the price, because for gaming that massive increase in HBM bandwidth will have little to no effect.

Like I said before, the sooner Vega goes away, the better for AMD. It wasn't very good from the off.
 
It's a decent increase, but it's just not good enough. I expected better. Had they priced it at $600 instead of $700, that would have made a real difference, but right now it seems to me that AMD is content with safely being No. 2.

To be fair, being safely No. 2 in a small fraction (i.e. the high-end portion) of the PC gaming market is far from a big deal when they've got their fingers in so many pies (some a lot more profitable, and most reaching much larger audiences). They're still by a very long shot the most used GPU manufacturer for gaming (and will continue to be until at least 2025-2030, by the looks of the next-gen consoles), so it makes sense they're not targeting the lowest-volume portion of the gaming market with any big hitters.
 
From the FPS numbers it looks like most of them are at 4K ~ultra settings (FO76 is a particularly VRAM-heavy game from what I've heard, which may explain that jump).
 
And are these numbers at 4K, or at what resolution exactly? I can't find the answer in the article.

AMD specifies 4K Ultra settings for Strange Brigade and Battlefield V. Details for the others are missing from the press release.
 
The 1080Ti was launched at $700 on March 10, 2017 and ended up being roughly 20-30% faster than a Vega 64 (with a few outliers).

Radeon VII will be released Feb 7, 2019 for $700 and will be roughly 20-30% faster than Vega 64 (with a few outliers).

Underwhelming indeed.
 
Was anyone actually expecting AMD to have a flagship to compete with the 2080 Ti? Let's be realistic, they are still getting over the past 5 years of having no money.

Most people were expecting AMD to come out with a mid-tier Navi card, so I'm reasonably impressed they just dropped a high-end card without the silly hype train of the first Vega.

If the Radeon 7 had dropped at £600, I would be 100% game; the value proposition would be there, so the price is a little disappointing. But at least AMD can compete with 90% of Nvidia's product stack... by all means spend £1.2k on a graphics card though, lol.
 
Was anyone actually expecting AMD to have a flagship to compete with the 2080 Ti? Let's be realistic, they are still getting over the past 5 years of having no money.
No, but AMD has occasionally delivered decent price-to-performance. A $500 2080 competitor would've been a game changer. But obviously that 16GB doesn't come cheap, and you do need it for the bandwidth. If they had added compression tech to Vega along with the node shrink and could have gotten away with 8GB, that could've been interesting.
 
16Gbps/pin GDDR6 should be available from both companies by the time Navi launches. That would give a traditional 256-bit bus on their upper-midrange (X80/70-tier) cards 512GB/s and allow 4GB or 8GB of memory, i.e. exactly half Vega II, slightly more bandwidth than Vega I/RTX 2080, and exactly double the bandwidth of the RX 480/580. I don't really see any other logical memory configuration for a top-end Navi part assuming it's a Polaris 10 replacement; presumably the 70-tier card would use 14Gbps memory instead, and a Polaris 11 replacement would maintain the 128-bit bus for half that. Essentially, GDDR6 will allow bandwidth to naturally double at each product tier anyway (though compression tech still progresses alongside to make better use of the bandwidth).
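For anyone who wants to sanity-check those numbers, here's a minimal sketch of the peak-bandwidth arithmetic. The Navi bus width and data rates are the assumptions from the paragraph above, not confirmed specs:

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte.
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gb_s(16, 256))   # hypothetical 16Gbps GDDR6 Navi -> 512.0 GB/s
print(peak_bandwidth_gb_s(14, 256))   # 14Gbps GDDR6 (RTX 2080)        -> 448.0 GB/s
print(peak_bandwidth_gb_s(8, 256))    # 8Gbps GDDR5 (RX 480/580)       -> 256.0 GB/s
print(peak_bandwidth_gb_s(2, 4096))   # ~2Gbps HBM2 (Radeon VII)       -> 1024.0 GB/s
```

Which lines up with the claim above: 512GB/s is a bit ahead of the RTX 2080, exactly double the RX 480/580, and half of Radeon VII.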

Of course I'd expect the RTX cards to also get a bunch of new 16Gbps SKUs once those GDDR6 modules are available, presumably around midway through the year.
 
Was anyone actually expecting AMD to have a flagship to compete with the 2080 Ti? Let's be realistic, they are still getting over the past 5 years of having no money.

Most people were expecting AMD to come out with a mid-tier Navi card, so I'm reasonably impressed they just dropped a high-end card without the silly hype train of the first Vega.

If the Radeon 7 had dropped at £600, I would be 100% game; the value proposition would be there, so the price is a little disappointing. But at least AMD can compete with 90% of Nvidia's product stack... by all means spend £1.2k on a graphics card though, lol.

I think if they released an 8GB version it would probably be cheap enough to hit a lower price point and sell more.
 
I think if they released an 8GB version it would probably be cheap enough to hit a lower price point and sell more.
They can't make an 8GB version of Vega 7 without cutting off half the memory bandwidth, controller, interposer, etc. With 4 stacks (a 4096-bit bus) of HBM2 you have 16GB-32GB options using current stacks. If they'd stuck with only two stacks they'd be stuck with the same bandwidth as Vega 1 while having hungrier cores and lots of bandwidth-intensive uses nerfed.
 
They can't make an 8GB version of Vega 7 without cutting off half the memory bandwidth, controller, interposer, etc. With 4 stacks (a 4096-bit bus) of HBM2 you have 16GB-32GB options using current stacks. If they'd stuck with only two stacks they'd be stuck with the same bandwidth as Vega 1 while having hungrier cores and lots of bandwidth-intensive uses nerfed.

No. You simply use less memory per stack.
 
No. You simply use less memory per stack.
As I said, not possible (I mean, we wouldn't be having this discussion if such a common-sense solution were possible). HBM2 ships in a minimum stack height of 4-Hi (I don't think anyone actually made any 2-Hi HBM2 beyond demonstrations; it's certainly not in any kind of volume production, and it seems to have been dropped from the HBM2 spec entirely), and that height is more or less necessary to keep the 8 channels well fed, which means a minimum of 4GB per stack, and therefore a minimum of 16GB if you want a 4096-bit bus.
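A quick sketch of why that arithmetic forces 16GB, assuming 4-Hi stacks of 8Gb dies as described above (the per-die density is my assumption, not stated in the post):

```python
# Each HBM2 stack exposes a 1024-bit interface; a 4-Hi stack of 8Gb dies is 4GB.
STACK_BUS_WIDTH_BITS = 1024
MIN_STACK_CAPACITY_GB = 4  # 4 layers * 1GB per assumed 8Gb die

def hbm2_config(stacks: int) -> tuple[int, int]:
    """Return (total bus width in bits, minimum capacity in GB) for a given stack count."""
    return stacks * STACK_BUS_WIDTH_BITS, stacks * MIN_STACK_CAPACITY_GB

print(hbm2_config(4))  # (4096, 16) -> Radeon VII: the full bus needs at least 16GB
print(hbm2_config(2))  # (2048, 8)  -> dropping to 8GB halves the bus, back to Vega 64 territory
```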
 
Was anyone actually expecting AMD to have a flagship to compete with the 2080 Ti? Let's be realistic, they are still getting over the past 5 years of having no money.

Most people were expecting AMD to come out with a mid-tier Navi card, so I'm reasonably impressed they just dropped a high-end card without the silly hype train of the first Vega.

If the Radeon 7 had dropped at £600, I would be 100% game; the value proposition would be there, so the price is a little disappointing. But at least AMD can compete with 90% of Nvidia's product stack... by all means spend £1.2k on a graphics card though, lol.

I wasn't expecting a flagship to compete with the 2080Ti, no. I wasn't expecting a $700 GPU that won't actually cost $700 (just my prediction) to equal another overpriced $700 GPU either (that also doesn't actually cost $700). The issue isn't just AMD or Nvidia though. A lot of it has to do with currency conversion rates, expensive coolers, new taxes, and dealer markup (Brexit and Trump basically). $700 GPUs have been around for a long time. But back when the 780Ti was released, $700 was £550-600 (or about €650-700). Now a $700 GPU would cost me €800-900. Add in the huge price increases for higher-end models (3 fans) that I would be more interested in, and the card is no longer even remotely good value.

If Radeon VII actually came out at £600 with AIB partners releasing their versions for £600-700, that wouldn't be terrible. I mean, there was a period of a couple of months when you could get a 1080Ti for that money, but obviously Pascal is old tech and Radeon VII will be the better card. But £600 won't happen. Neither will £650. Or £700. Or possibly even £750. I expect initial OCUK prices to be closer to £800. That's what most RTX 2080s cost. AMD sees Nvidia getting away with it, so they likely want a piece of that pie. They'll sell quite well. If 2080s are selling at that price/performance ratio, Radeon VII will too.
 
The issue isn't just AMD or Nvidia though. A lot of it has to do with currency conversion rates, expensive coolers, new taxes, and dealer markup (Brexit and Trump basically).

While Trump may have hurt the prices, everything else is getting cheaper, so really it evens out for us Americans, though I do not believe the same can be said about Brexit and whatnot.
 
While Trump may have hurt the prices, everything else is getting cheaper, so really it evens out for us Americans, though I do not believe the same can be said about Brexit and whatnot.
Trump's trade war with China increased the cost of finished electronics SKUs across the whole globe due to how much of it flows between the two countries during the production process, with the BOM cost of some parts prior to manufacture rising 25-30%.

But yeah, Brexit has led to both price increases and a devaluing of the £, amongst other things, so that certainly has the larger impact on this side of the Atlantic.
 