Watch AMD's Radeon RX 6000 Series Reveal Here

So slightly turning down a few badly optimised settings, whose visual effect you can't notice anyway, means you should buy a completely different class of device, one that might well not meet your requirements the way a PC does in terms of game support, setup, or other uses?

I think it has been long enough now for the high-end, expensive cards to be 4k capable. All of them. Especially with two 4k consoles on the horizon for much less money.

Their "as fast as a 2080Ti" was fun and all, but sadly it's only as fast at 1440p. Not 4k. For that you need a 3080, and the price jumps by £300+.

Like I have said, in my opinion this card should have had more VRAM and been fully 4k capable, both now and for a while after the next-gen slop starts to arrive with double the minimum spec (because that is what has happened in terms of console power). For a while we've been pretty spoiled, because the consoles only had something like a 580-ish in them. Maybe closer to a 570? But either way, I am sure you get my drift. I.e., minimum specs for PC gaming have pretty much been in line with the consoles, only on a PC you have to step down a resolution or two to do what the console is doing. Which is all fine.

However, you then take the next-level performance of the XBSX and you pretty much double all of those specs, because that is what the XBSX and PS5 do over their older counterparts.

What do you think will happen in the world of PC gaming then? Do you think they will hold back their console versions to spare PC owners? Or pump in hundreds more hours to make games more efficient on PC? Will they f**k.

"Console Ports" (which every one knows they are not, if they know how they are actually coded) used to be total gash. Like, when the XB360 was around the "ports" being handed over were absolutely bloody awful. Stuff like GTA4 which wouldn't even run properly on Quad SLi, and stuff where they used to release the texture packs after because they crippled performance (Sleeping Dogs, Crysis 2 etc) were terrible.

That all improved when they switched to x86 consoles, which has seen a massive uplift in PC gamers, mostly because the games don't run like doodoo any more. However, when you double the cost of entry in performance terms, you can expect to see many cards (like the 580, etc.) falling by the wayside.


And IMO (and in the opinion of many others) 8gb of VRAM is not going to be enough. If you take every game into account as I type this, it's already not enough. That is not going to improve.
 
If 11G is enough but 8G isn't, dropping textures from Ultra to Very High is enough to compensate, and the difference in visuals is negligible. Buying a mid-range card and being unwilling to drop settings from "all max" is bone-headed, and it's also the reason we can't have games that are future-proof in terms of graphical settings the way Crysis was. Though Crysis got punished hard when CPU single-threaded performance stopped improving and, as a result, it became limited by an archaic graphics API.

Not to mention the 6800 won't do 4k60 maxed across the board either, but you can stick to ultra textures. Which is something, but doesn't make or break a card.

But they're both definitely 4K-capable cards, and choosing between them is largely a question of how important DLSS and RT performance are to you. Reviews will make us wiser in that regard; maybe there will be caveats, such as running a system which isn't capable of PCIe 4.0.


Edit: How can you link reviews where the aggregate performance is exactly the same as the 2080 Ti's at 4K and then call the card not as fast? :lol: https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/35.html
 
Looz, I am not disagreeing with you, mate. At all.

I'd just prefer it to have had what I consider enough VRAM to actually do 4k. Like, 10gb at the actual selling price of £529 would be fine IMO. Or even £550. I really wouldn't mind paying a small amount more for much better support and longevity of the card.

So it should have been: 3070 10gb, 3080 12gb, 3090 whatever the hell they like.

When AMD are equipping their lower-end cards with 12gb, that says a lot. They will know much more about what next-gen titles are going to do on a PC. 10gb on the 3080 is more than enough... for now. That will very probably change over the next couple of years.

If you want the card for right this minute, don't care about having to dial back settings, and will buy the very next card Nvidia line up for you, then fair play. Hats off to you. However, I think they have done this deliberately, and it's a shame. That's all. Many seem to agree with me too. Why have the horsepower needed for 4k, yet stop the card going the distance because it just won't have the VRAM?
 
But who in the real world actually forces themselves to play all games at max settings?

Surely "Not a 4K card" means "You can't have a good experience at 4K output res with this card and contemporary games" rather than just "It takes more of a performance hit on 4K than some other over provisioned cards do"


I play all my games at max settings but then I play at 3440x1440 and not 4K :D
 
I play freak widescreen at 3840 x 1200 now, LOL. :D Slightly fewer pixels than 3440 x 1440. I've run the gamut from adopting 4k early (in 28" 60Hz form, ugh) to ultrawides, and I think 3440 x 1440 in HDR would be the perfect 'sweet spot' for me, and these new AMD GPUs will eat that no problem. Heck with 4K! I cannot wait to see what one of these puppies will do with a proper water block.
 
I am biased, but unless you need DLSS, there is no reason in my view to buy an Nvidia card.

16gb of VRAM on all three cards, 12gb on the lower end, and the real low end most likely 8gb.

My mind was made up weeks ago, but now it's even easier to stick to my choice. I'm most likely going for the 6800XT in early December. My expectations of getting one are pretty low; I do feel they will sell out the same as Nvidia's have. Maybe come January I'll have better luck, but I'm unsure at the moment.

I don't think either brand is a bad choice, as everyone wants different things, but personally I'd much rather have 16gb of VRAM and know the card will have some legs on it for a fair few years than try to get an Nvidia card with far less.

I think Nvidia really dropped the ball this generation, and I can see them racing to get a 4000 series out the door in the next six months. It's not that they are bad cards now; it's that in maybe a year's time they won't be.
 
Unless Nvidia are already in an agreement with TSMC (which I am sure we would know about), I can't see how much they can do. Once a chip is taped out it usually takes two years to reach market. Whether they could fast-track something, especially with what's going on? God knows.

It's definitely the hare and the tortoise all over again. I just hope now that AMD can stay on this path.

I would wait for reviews, as the 6800 may overclock very well and be like the 5700 when compared to the XT.

If AMD have left some more in the tank, then that could be a very interesting card.
 
I've not written off the 6800, but the 6800XT is a lot closer to the top, only 8 CUs less, while the 6800 is 20 less. I guess it really depends on how much of a difference that makes, and we'll see in the reviews, but I feel what might set them apart more is the DXR side of things.

I kinda feel Nvidia already had something prepped for TSMC before they switched to Samsung. No real way of knowing, but they sure have stiff competition now.

I don't know how much is left in the tank, as you put it, but the power draw is low enough that there must be some headroom.

The way I see it, at least on specs, the 6900XT and the 6800XT are like the 5700XT and 5700: they are very close in specs, and to me that makes more sense. Though I guess the dies are speed binned, so the 6900XT gets the best chips they can make at the moment. It really does seem that they are all very similar and just speed binned, with even the 6800 being part of that process.

So it's kinda like 5700XT, 5700XL, 5700. It kinda looks like that, if you get my point.
 
Overclocking on the reference cards might be a bit of an issue if the card is already drawing up to 300 watts through its two 8-pins, which leaves only the roughly 75 watts available through the PCIe slot, and that doesn't give a lot of headroom. One 8-pin connector is good for 150 watts, so yeah, it might choke quickly.


On AIB cards, with possibly more power plugs, there is no doubt it could do a lot better if the silicon has room to give.
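To put rough numbers on that power budget (just a sketch: the 150 W per 8-pin and 75 W slot figures are the usual spec limits, and the 300 W stock board power is only the figure assumed above; real cards can pull slightly outside spec):

# Rough sketch of the power budget described above (Python).
EIGHT_PIN_W = 150          # spec limit per 8-pin PCIe power connector
SLOT_W = 75                # spec limit for power drawn through the PCIe slot
STOCK_BOARD_POWER_W = 300  # assumed stock draw of the reference card

def oc_headroom(num_eight_pins: int = 2, stock_w: int = STOCK_BOARD_POWER_W) -> int:
    """Watts left before hitting the in-spec connector + slot budget."""
    budget = num_eight_pins * EIGHT_PIN_W + SLOT_W
    return budget - stock_w

print(f"In-spec budget: {2 * EIGHT_PIN_W + SLOT_W} W")   # 375 W
print(f"Headroom over stock: {oc_headroom()} W")         # 75 W

So on those assumptions the reference board has roughly 75 W, about 25%, of in-spec wiggle room before the AIB cards with extra plugs come into it.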
 
I don't think so. I gave the short answer before, but the longer answer goes into more depth.

Firstly, it obviously has double the amount of the same type of VRAM, so that costs money. However, as noted yesterday by Hardware Unboxed, the 3070 performs up to 38% slower than the 3080 at 4k. Those 4k results are hampered by VRAM limitations, though: either the VRAM isn't enough (see Doom Eternal) or it hasn't got enough bandwidth for 4k. When not hampered by those issues, the 3070 is only 27% slower than the 3080.
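As a toy illustration of what those percentages mean in frame rates (the 100 FPS baseline is purely an assumption for the arithmetic; the 38% and 27% figures are the Hardware Unboxed numbers quoted above):

# Toy arithmetic only: baseline_3080_fps is an assumed figure, not a benchmark result.
baseline_3080_fps = 100.0

fps_vram_limited = baseline_3080_fps * (1 - 0.38)   # ~62 FPS where VRAM/bandwidth bites
fps_unconstrained = baseline_3080_fps * (1 - 0.27)  # ~73 FPS where it doesn't

print(f"Extra frames lost to the VRAM/bandwidth limit: {fps_unconstrained - fps_vram_limited:.0f} FPS")

In other words, on that assumed baseline the VRAM/bandwidth ceiling costs roughly another 11 FPS on top of the gap the 3070 would have anyway.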

Which basically means that the 3070 is not a 4k card. It costs (I just checked the only source I can find) £529. So if Nvidia were meeting their MSRP, I would likely agree with you that the 6800 costs too much. However, I find £529 for a 1440p card (because it is not a true 4k card, as can be seen by Nvidia showing it at 1440p in their reviewer's guide) too much. Sure, it is faster than the 2080 Super and costs a bit less, and sure, it performs as well as a 2080Ti at 1440p (but not 4k). However, there were already two fantastic 1440p cards (the 2070 Super and 2080 Super). If you desperately need the extra performance at 1440p right now, it may be worth buying. Buying it as a 4k card, though, would be very risky.

So that brings us back to the 6800. This will not suffer those issues. It has more memory bandwidth and twice as much VRAM. Thus, it *is* a bona fide 4k card, which the 3070 could easily have been had Nvidia not derped the VRAM. I guess they did not want it hurting their 3080 sales, but we all know how much those cost.

So for me? I would rather have seen the 3070 with at least 10gb of VRAM, making it a proper all-out 4k card, even if it cost what the 6800 does. Or, in other words, I would rather buy the 6800.

Apparently GN say the reason it's that expensive is that AMD decided to use SAM on it. To me though, like I say, I just believe it's a proper 4k card where the 3070 is not.

I just wish Nvidia had released it with more VRAM as a proper 4k card. 4k gaming was supposed to be getting much more affordable, and now they have gone and done this. 3080s cost about £800 at the moment, which is no cheaper than many past 4k cards. In fact, in the case of the 1080Ti, it's considerably more.

I also have a feeling Nvidia will ditch Samsung and go back to TSMC as quickly as they can, which concerns me if the drivers need to be different from the Samsung Ampere ones. Either way, they cannot just continue on with Samsung, because if AMD drop prices after launch they will be absolutely screwed.

Remember that Steve also showed a benchmark at 4k with the 3070 in Doom Eternal using Ultra texture settings rather than Ultra Nightmare, and the performance issues (which were still only minor really) disappeared. I recognise you have a particular view on this, but going by how many people cannot tell the difference visually between the highest texture setting and one or two notches below in Doom Eternal, I think the 3070 is perfectly fine for 4k if you're willing to make one or two insignificant sacrifices. And given that the 3070 is not actually capable of running all games at max settings at 60 FPS anyway, I think it's unreasonable to say the 3070 is not a 4k card because of VRAM, and would be if it had 2GB more, while ignoring its raw compute power and its inherent limitations. We've established you're not willing to make any sacrifices when it comes to VRAM, and that's fine. I would be, and a lot of others would be too. So I'd say the 3070 is a 4k-capable card for those who are OK to make a few small compromises. If you're not, it's not the card for you.

Then again, I think you're right about the 6800 being better at 4k and therefore better value. It performs better, has more VRAM, and more bandwidth. If I gamed at 4k or was going to and didn't have a huge budget, I'd definitely favour the 6800 over the 3070. I'd still prefer the 6800XT, but if the 6800 was already on the edge of my max budget, I'd still squeeze for the 6800 over the 3070, for all the reasons you gave.

The 3070 is probably the card I'd get for 1440p gaming though, because the 6800 doesn't offer enough features or value over it to make it worth it. Or the 6700XT or 6700 might be even better suited for 1440p, because the 3070 is arguably OP at that resolution for a guy like me who doesn't care about cranking every setting to the max in every single game (even though I don't play every single game). ;)
 
These GPUs are being produced on a very mature node, with a well-established relationship between the two companies. The bus width is only 256-bit. I'm not sure how much complexity Infinity Cache adds, but I wager the yields will be pretty good for these dies. They're using GDDR6, which is cheaper, and considering the design familiarity I'd say AMD have got power delivery pretty sorted too. Robust power delivery means no complex, expensive PCB designs.

I think AMD stand to make good profits from this line up if the performance is there. They have priced it competitively but probably have room to drop prices if Nvidia try something.

Sorry if the post is a bit incoherent, it was a bit of a stream of consciousness.
 

The 6800, by all accounts, seems to be quite a bit faster than the 2080Ti, in the benchmarks shown at least. The 3070 is level pegging at 1440p but falls off at 4k. The 6800 won't do that, though.

Honestly to make it shorter? 10gb. 10gb would have been the sweet spot for the 3070 and I would have nothing at all to complain about.

It's not just Doom btw.

Even Wolfenstein 2 uses more than 8gb at 4k. So does Shadow of Mordor, and that is years old now.

It won't get better, mate. Game devs now have double the VRAM for console games; what do you think that will do for PC gamers? Most probably it will double the price of entry (figuratively speaking, of course).
 

You're just repeating the same arguments at this point.

Saying it's "not just Doom" doesn't mean it's the majority, or even a significant minority. Are you really basing your purchases on 0.1% of the gaming catalogue? If you are, that's cool. But why presume everyone else will? It ignores the fact that consumers can choose to turn one or two superfluous settings down to stay within their VRAM limit, as you may have to do in some games at 4k anyway. You act as if people are buying SLI GTX 770s for modern games. You're taking a small handful of adjustable instances and blowing them out of proportion.

The console argument is just your guess. I explained in another post a few weeks ago what happened with the previous generation. When the PS4 and Xbox One came out with their 8GB of collective VRAM, what happened? Did all games suddenly jump to 5-6GB of VRAM utilisation? No, not even at 4k. For years, literally years, games still only needed 3-4GB of VRAM optimally, even at 4k. Benchmarks across the board PROVE this. It wasn't until very recently that games actually started to draw upwards of 8GB of VRAM. The consoles had more VRAM than most PC gamers, yet games still ran perfectly well on 4GB cards. There were a small handful of outliers, and in those cases you could just drop the textures slightly, see no visible difference, and gain an extra 5-10% performance.

And that's a whole other point: the performance you lose by not being in the VRAM sweet spot is not huge. In some cases, such as at 4k in one or two titles, the limited VRAM of cards like the Fury X and GTX 980 caused an unpleasant experience. But again, outliers. I don't base my decisions on the 0.1%, especially when I can reduce that percentage even further by turning down one or two superfluous settings. And not everyone even plays Wolfenstein or Doom Eternal! You buy the card that suits your needs, not what someone on the Internet tells you to buy for reasons that impact them and them alone.

I guess I'm repeating myself as well at this stage. :D

That said, I do think VRAM usage will increase over the next two years, more than the increase we saw from the previous-gen consoles. I just don't agree with your gloomy outlook. I've seen very little evidence to suggest a 3070 will become obsolete within a year. Seeing as it's better suited to 1440p already, if there are one or two titles with higher VRAM demands, just do what PC gamers have been doing for 30 years and play with the game settings until you find your sweet spot. That's what they're there for.
 

Then my viewpoint is buy a console.

As for whether I would be bothered about a game I absolutely do play all of the time? Yes, yes I would be bothered about it. Especially given the 3070 costs £550 for a pre-order at the time of typing this, and the rest of the kit costs about a grand (including chair, monitor, etc.), so yeah, I would not want to compromise. That is why I game on a PC, so I don't have to compromise. When I do? I sit on the sofa and play Xbox.

As for blowing it out of proportion? Oh man. If you had been burned like I have on cards that were supposedly absolutely fine and had more than enough VRAM, you would get it. That is what I am trying to stop happening to other people.

I could sit here and go on about it all day. 8gb? The 1070 had 8gb years ago. Why have they seen fit not to upgrade that? Etc., etc., yada yada.

It's one of those things I guess you need to experience and be burned by before it p*sses you off enough to consider it an important factor.

But at £100 more than the cost of an entire Xbox Series X, I would not want to compromise, thank you, no.
 
If you can get away with using a console, then great, alien, go and do it. But telling other people to buy a console when they need a GPU is like telling someone who needs a new truck to go and buy a Vauxhall Corsa, because it does everything you personally want anyway, costs less, and is easier to maintain.

If you get wee'ed off at turning down settings and that makes you feel burnt, then fine, base your decisions on that and buy devices that have all the settings already turned down for you. But it means your advice on this topic is irrelevant to the 99% of PC gamers who do this day in, day out. Most PC gamers don't spend over £200 on their GPU and still enjoy their experience.
 

Firstly, people don't "need" a GPU. If they do, they have issues I can't solve.

The new consoles are set to be 4k, meaning they will offer a very similar experience to a GPU costing more, with reduced settings of course. This isn't about cars.

As for turning down settings? I tried that on my Fury X. It didn't work. The textures at 4k were simply too large to fit in the VRAM buffer, so the card would black screen. I could have dropped the res, of course. However, after spending £1200 on two, I think that would have been a bit crap, given I had a 4k monitor at the time.

I don't think my advice is irrelevant, but thanks anyway. People are free to do exactly as they wish.
 
Come on man, seriously? In this age of WFH? My point is not that your advice won't apply to anyone; it's that everyone's situation is different, and that you can give guidance without telling people they should take a wildly different route to the point where it's likely irrelevant to them.

Telling other people they don't need a GPU is like telling them they don't need a certain type of screwdriver: how can you know what tools they need to make their bread? There are so many fields that benefit from GPGPU now. How do you even know that upgrading the home PC and games they already own isn't a cheaper route than buying into a new ecosystem? For sure, give guidance and advice, but saying "just buy a console" genuinely doesn't help anyone, in the same way that telling someone who wants a gaming laptop to buy a desktop PC instead is useless advice, and repeating it over and over again in several threads doesn't help any more.

To be honest, at this point it doesn't really look like you're giving advice at all, especially given where you're saying it. It just seems like a load of choice-supportive-bias-influenced rants.
 

Maybe this is where all this comes from. The Fury X really was VRAM limited, and those who were convinced by AMD's seemingly legitimate spiel must have been peeved. I don't wanna be a peehead and rant and rave about the same thing. I'll be honest, it drives me bonkers seeing the same arguments repeated over and over again when they don't make sense. But maybe this isn't even about that; maybe it's more about how AMD messed up big time and put a great many consumers in a difficult position, all because of VRAM.
 

Actually, that is not where it all comes from. It happened to me with the GTX 470 too, a card I retired for no other reason than it not having enough VRAM.

I'm not going to keep doing this to death. I am allowed, nay, entitled to my opinion and that's it. I didn't say anywhere that anyone should listen to me, nor do I care whether they do or don't. It seems to be an opinion shared by many reviewers, but it was mine before they even said anything. We are about to take a big leap in gaming for many reasons, and I just don't think 8gb will be enough for 4k, which is what this card should be aimed at based on its tier of performance. I don't find that unreasonable.

I also feel, as I type this (again, with no crystal ball), that it's overkill for 1440p. That's all. I am allowed that opinion too.

I have often said I would rather have seen this card with more VRAM at a higher price, and I retain that opinion. Probably because I don't often upgrade my GPUs, so even as a paid-for safety net I would have preferred it to be there. Again, I am entitled to feel that way.

As for AMD? TBH, it wasn't AMD who made those claims. It was everyone with their opinions. And their opinions turned out to be wrong. No, the memory itself being faster does not mean it can break the physics of how games load into and out of VRAM and make magic things happen. That is the crap that was being spread around at the time by reviewers: "It's more than enough because it's so fast!" Well, guess what? That opinion was wrong.

TBH, if people didn't continue arguing with me over it, then I would stop posting about it. It just seems that everyone (not including you in this) just wants to be right when, TBH, they have no idea whatsoever what the future holds. And it is because of this that I find the stingy amount of VRAM this card possesses to be just that: stingy. Especially when, going back a few generations, the cards had the same amount.
 