AMD's Radeon Navi RX 3080 XT is rumoured to challenge the RTX 2070 for $330

Nope, not at all. I'm saying they're going to undercut Nvidia's pricing rather than take fat margins. There's no doubt these cards will be notably cheaper to produce than TU106 cards (RTX 2060/2070), but AMD need market share and high-volume products in this market, not $200 margins on low-volume products. AMD say at every GPU launch that ~$300 is the only price point really worth caring about when it comes to sales.
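To put some rough numbers on that margin-vs-volume trade-off, here's a minimal back-of-envelope sketch. All the figures are made-up assumptions for illustration, not AMD's actual costs or volumes:

```python
# Illustrative margin-vs-volume arithmetic. Every number here is a
# hypothetical assumption, not a real cost or sales figure.

def gross_profit(price, unit_cost, units):
    """Total gross profit for a card sold at `price` with a given unit cost."""
    return (price - unit_cost) * units

unit_cost = 180  # assumed build cost per card (hypothetical)

# Fat margin, low volume: price high, sell few.
fat = gross_profit(price=480, unit_cost=unit_cost, units=100_000)

# Thin margin, high volume: undercut at $330, sell many more.
thin = gross_profit(price=330, unit_cost=unit_cost, units=400_000)

print(f"fat-margin strategy:  ${fat:,}")   # $30,000,000
print(f"thin-margin strategy: ${thin:,}")  # $60,000,000, plus 4x the market share
```

Under those assumed numbers the thinner margin makes more money *and* buys market share, which is the point being made above.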
 
The way prices are going, I will most probably go back to console gaming in the next year or so. I'll ride out the 1080 Ti and then buy the next-gen PS5 or similar for quick and easy 4K gaming.

Gaming on PC is great fun, but for now the prices are out of hand.

*I doubt we will ever see the new AMD card for $330... and here in Spain it would be around €100 more expensive than that.
 
If you think you are going to get RTX 2070 performance (currently about $480) for $330, you're a dumbass. Literally.

There's only so many times the GPU market can keep knocking out the same sort of performance for the same price. Or rather, the same performance for more money. Eventually everything goes stale and it all just collapses. It happened in the 1980s and it will likely happen again: the early home computers came along, got no better, people got sick of the same systems, and then the whole thing folded.

Also, I think Vega is a lot better than most people give it credit for.

Saw somewhere in this thread that the 2060 is faster than the 56. That depends on your testing methodology. This was a problem with the FX CPUs, too. If you compared them in highly threaded apps, doing what they were designed to do, you basically had i7 950 per-core IPC without HT, but with 8 cores. If you could actually use them they were good CPUs, and they had instructions that cheaper Intels lacked, i.e. the stuff you needed to run VMs and so on.
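A rough sketch of that FX argument: similar per-core IPC, twice the cores, so throughput scales in threaded workloads but looks weak in lightly threaded ones. The scores and scaling factors below are illustrative assumptions, not measured benchmark results:

```python
# Back-of-envelope for the IPC-vs-cores point above. All numbers are
# illustrative assumptions, not real benchmark data.

def threaded_throughput(per_core_score, cores, scaling=1.0):
    """Aggregate throughput if a workload scales across all cores."""
    return per_core_score * cores * scaling

i7_950  = threaded_throughput(per_core_score=100, cores=4, scaling=0.95)
fx_8core = threaded_throughput(per_core_score=100, cores=8, scaling=0.85)  # shared FPUs hurt scaling

print(f"i7 950 (4 cores, HT not counted): {i7_950:.0f}")
print(f"FX 8-core:                        {fx_8core:.0f}")
# A lightly threaded game only ever sees one or two of those cores,
# which is why the same chip can look slow or fast depending on the test.
```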

In good "proper" DX12 games* the 56 is quite a chunk faster than the 2060. In fact, it nips at the heels of the 2070 more often than not. Most people put this down to the games being optimised for AMD, but I've noticed a large hole in that theory: roughly half of these supposedly AMD-favouring games are actually Nvidia titles (like BF5, SOTTR and others).

I've also noticed that all of the games that "favour" AMD are also good with DX12, i.e. you get more performance using DX12 than you do using DX11. And to add to it, all the games that run better on Nvidia actually perform worse in DX12 than in DX11.

What does that mean to me? That the game has not been coded specifically for DX12: it's either been patched in, or the engine was not designed solely with DX12 in mind. And the games that do perform well on AMD? Well, they're all from large software houses with shorter dev cycles than most.
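If you want to sanity-check that pattern yourself, the test is simple: tabulate each game's DX12 gain over DX11 and see which way it points. A minimal sketch, with placeholder fps numbers rather than real benchmark data:

```python
# Hypothetical fps numbers purely to illustrate the check; swap in real
# DX11/DX12 benchmark results for whatever card you're testing.
games = {
    #  name:                (dx11_fps, dx12_fps)
    "Game A (runs well on AMD)": (62, 74),
    "Game B (runs well on AMD)": (55, 66),
    "Game C (runs well on NV)":  (88, 81),
}

for name, (dx11, dx12) in games.items():
    delta = (dx12 - dx11) / dx11 * 100
    verdict = "proper DX12 path" if delta > 0 else "likely bolted-on DX12"
    print(f"{name}: {delta:+.1f}% -> {verdict}")
```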

*As explained above, I don't think there's any such sorcery as "optimised well for Vega"; I think Vega is simply better at DX12 than Nvidia's older Pascal and low-end Turing cards. After all, Vega is a DX12 architecture and was designed as such ages ago. That's why it gets hot and uses more power: it's got the balls for DX12.

Which you can understand if you look back at the FX series, because that was very forward-thinking tech as well. Sadly AMD like to think too far ahead, and that hasn't worked very well for them yet. I mean crap, even Nvidia said the 7970 was AMD's Fermi, and they've stuck to their guns ever since.

This is also where the "fine wine" saying comes from, though personally I don't think it's the card or drivers improving so much as AMD's bet on future tech eventually paying off.

If I weren't fussed about RT as it stands, and I was thinking ahead, I would rather have the 56 over the 2060. Quite simply, when we eventually DO live in a world where everything is DX12 and the DX11 dev cycle has ended and been put to bed, it's the better card.

You need to remember, Turing happened for a reason, and it was not just RT. Basically, Pascal cards were beginning to show their weakness, and Nvidia knew that in two gens' time AMD would be leading. After Fermi, Nvidia made entirely different GPUs. The GTX 280 was a massive hot pig, and so was the 480, so they cut the core size down and then realised just how high a smaller core could clock on a die shrink.

Since then? Spoon-feeding us tiny little dies because DX12 didn't matter yet. AMD, on the other hand, chose the other path: keep making massive hot GPUs and hope that DX12 would come along and justify it.
 
Yeah, that's the thing: Turing dies/cards are massive and expensive because they had to become large, relatively unspecialised, general-compute-focused architectures, so they would hold up as more games leant on the additional features and paradigms that become widespread with low-level APIs. Turing is, for all intents and purposes, a far less efficient architecture (in terms of transistor count/cost per unit of performance) than Pascal, Maxwell and Kepler when it comes to DX11.
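A rough illustration of that efficiency point, using the public transistor counts for GP104 and TU106; the relative DX11 performance figure is a ballpark assumption on my part, not a benchmark result:

```python
# Perf-per-transistor back-of-envelope. Transistor counts are the public
# figures; relative DX11 performance is an assumed ballpark (2070 ~= 1080).
cards = {
    #  name:              (transistors_billions, relative_dx11_perf)
    "GTX 1080 (GP104)": (7.2, 1.00),
    "RTX 2070 (TU106)": (10.8, 1.05),  # assumed roughly 5% faster in DX11
}

for name, (xtors, perf) in cards.items():
    print(f"{name}: {perf / xtors:.3f} perf per billion transistors")

# GP104 comes out well ahead per transistor, which is roughly what
# "Turing is less efficient at DX11" means here.
```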

They went that route because they needed a mobile-optimised architecture; Kepler/Maxwell were essentially ground-up mobile designs. Let's not forget Nvidia spent around a decade trying to get their Tegra chips into a Nintendo handheld, originally aiming for the 3DS, but they couldn't make the cut on power consumption, and arguably their largest failure of the first half of the decade was their inability to land any real Tegra design wins despite throwing billions at it. You could say it's paid off with their laptop market domination, though.

Essentially, AMD need to make their architecture a bit more Maxwell-like (small, specialised for graphics, so cheaper and more efficient), and Nvidia had to make theirs more GCN-like (wide, flexible, jack-of-all-trades, with few or no fixed paths).
 
Fermi just wasn't really necessary at the time. It was hands down the king of folding proteins, but that was about all it was truly excellent at. Kinda like how AMD's cards now are awesome for mining coin, because they are big, fat, heavy GPUs.

Fermi taught Nvidia a lot. It also brought them back down to earth for a while. I still think Kepler was, in £ per perf at launch, one of the best values Nvidia ever offered. The funny part is that ATI's 5000 series was everything Fermi wasn't, so whilst one company was struggling to sell its massively expensive tech for decent coin, the other was planning to do the reverse lol.

I always knew that when Nvidia returned to the hefty table, things were going to get expensive. And the worst part is that, given the lead they have, there's no incentive to drop prices like they had to with Fermi. I've noticed the 2060 has gone UP in price to £330 at entry level, rather than the big price drops I was expecting.

As I say, something has to give. When people buy a GPU and then, two years later, look to upgrade only to find they can't beat what they have in £ per perf, they simply won't buy. The crash in the '80s was caused by incremental bumps at £300 a pop. Parents simply wouldn't pay it, so we ended up stuck for years. God, I got so sick of my Spectrum, but there was nothing worth spending on, so I could definitely see my mother's point. And when the big systems came (Atari ST and Amiga), by god they were far too expensive for her to afford.

It's funny, because I used to sink every penny of my spare cash into PC stuff. Over Xmas I ended up buying top-end peripherals. A couple of weeks ago there was literally nothing at all worth buying, so I got a PS4. That is how bad it is right now. A lifelong PC gamer ends up buying his nemesis because things are really that boring.
 
I'm looking forward to seeing what AMD has coming, but the Vega 56 is tempting... or should I say two of them :D

Idk if it's just me, but does anyone else get the feeling that the 16** Turing cards from Nvidia are just rebranded, repackaged mining cards? I have a strong feeling that's what they were originally going to do with them, but they repackaged them as cheap knock-offs instead.

I know flagship GPUs can be pricey, but I don't like Nvidia's prices atm, so AMD is where I want to go if they bring good value and DXR. If they do that, they have me sold.
 