Watch AMD's Radeon RX 6000 Series Reveal Here

Nice to see such a competitive product. It's not quite enough to make me cancel my 3080 order, but depending on reviews I might order a 6800 XT alongside it and keep whichever card arrives first.
 
Looking like the 6800 XT for me unless the price conversion sucks. Tbh the prices were a little off from my expectations, but not massively, and the 6800 XT seems to be the best pick of the three for me; it was at least spot on with what I expected.
 

The 6800 XT was exactly where I expected, really.

The 6800 is more than I expected. However, the 6800 has 16GB, so it's much more viable as a 4K card, especially going forward, so the price doesn't really surprise me. It's basically slightly faster than the 3070 and has double the VRAM, and VRAM is what already holds the 3070 back at 4K, so I would rather pay a little more and feel safer when the next-gen console drivel drops.
 
The 6900 XT sharing the 300W TDP with the 6800 XT while rocking the same clock speeds raises questions about binning and, as such, availability. Though I suspect AMD has followed the 3000 series launch closely and wouldn't want to find themselves in a similar PR catastrophe.
 
All depends on how these prices convert to £s, but they are in the right ballpark. The only one of the three I feel is slightly more than I expected is the 6900 XT, but in fairness it is at least $500 cheaper than the 3090 lol
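For a rough idea, here's the back-of-the-envelope conversion I'd do (quick Python sketch using the announced US MSRPs; the ~0.77 USD-to-GBP rate and 20% VAT are just my assumptions, since US prices exclude sales tax and real UK retail pricing will vary):

```python
# Rough UK price guess from a US MSRP (US MSRPs exclude sales tax, UK prices include VAT).
# Assumptions: ~0.77 USD->GBP exchange rate and 20% UK VAT.
USD_TO_GBP = 0.77
UK_VAT = 0.20

def estimate_uk_price(usd_msrp: float) -> float:
    """Convert a US MSRP to an approximate UK retail price including VAT."""
    return usd_msrp * USD_TO_GBP * (1.0 + UK_VAT)

for name, msrp in [("RX 6800", 579), ("RX 6800 XT", 649), ("RX 6900 XT", 999)]:
    print(f"{name}: ${msrp} -> roughly £{estimate_uk_price(msrp):.0f}")
```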

With Cyberpunk being pushed back a few more weeks, it gives me some extra time to save for the 6800 XT. It's by far the best priced of the three cards, and the 3070 has very much been slapped about a bit now: the 6800 might only be £50 more, but that's for 8GB more RAM :)

Oh, and they barely spoke of it, but there's some kind of DLSS-style option coming. The fact they didn't go into details means it could be a ways off from being implemented.
 
The price of the 6800 is too damn high! I can see Nvidia responding with a 3070 Ti for $550 and completely nullifying what could be Big Navi's best value card. I know the 6800 will have more VRAM, but if I were in the market for a GPU in the £500-600 category, AMD haven't brought anything to the table that Nvidia can't respond to with aplomb.

The 6900XT is a surprise. $1000 is a lot, but I guess it's cheaper than the 3090. Again, it leaves room for Nvidia to come out with a 12GB 3080Ti for $850 that offers better value for money. Don't know why AMD are leaving Nvidia room to do that, but whatever; I'm sure they have their reasons.

The 6800XT is their best card in my opinion. Cheaper than a 3080, performs the same, more VRAM, lower power draw, probably better availability. Nice win there. If they can get a steady supply of $650-700 variants from themselves and AIB partners, they'll take a vast amount of sales from Nvidia. Even without upscaling it's a good card.

Speaking of which, they're working on upscaling technology, but I was disappointed to see it absent. DLSS 2 is a big seller for Nvidia. It's their biggest selling point in fact, and AMD didn't have a proper retort for it. Hopefully it'll come this year with driver updates.

Clock speeds are slightly lower than expected. But it doesn't matter; performance is where it should be and power draw looks reasonable.

Overall the biggest disappointment was the price of the 6800. I don't know what they're doing there. The lack of upscaling is not that big a deal because we know it's coming. But pricing the 6800 so high allows Nvidia too much room to wiggle their way back into the hearts and minds of consumers. I can see a lot of people who were waiting for RDNA2 before buying a 3070 now being happy to pull the trigger on one. AMD haven't really offered an apples-to-apples alternative, at least from what I can tell.
 
I'm pretty impressed by these. The 6900XT giving you basically 3090 performance but at $500 less?!?! That's pretty huge! The 6800XT being $50 cheaper than the 3080 but with the same performance is nice.

I'm not that big a fan of the $570 price of the 6800 though. I get that it's faster than a 3070, but I think if they'd priced it at $500 it would've sold faster than probably anything we've ever seen before.

I'm also very impressed with Dr. Lisa Su. She took over a company that was headed for ruin, but she's turned it around and put it on top of not one but TWO fronts which were once ruled by companies with much bigger bankrolls. She needs to get into politics. I'd vote for her for president lol.

That said, I think I'll end up with a 3070 or 3080. I really like my G-Sync monitor and it's only a year and a half old, so I'm not ready to give up on that $500 investment. Also, I just like G-Sync better. I had a FreeSync monitor before this one and this one just seems smoother. I know that could be placebo or the result of a lot of other things, but to my eyes, at least right now, I prefer G-Sync.
 
I think AMD has plenty of wiggle room to counter a possible Super series with a price drop, since, for example, they're using cheaper GDDR6. Though there's a possibility that the 128MB L3 is fairly expensive to etch, etc.

What will also be interesting, and we certainly won't hear about it from AMD, is performance over a PCI-E 3.0 link. We do know that the Navi cards with only 8 PCI-E lanes suffered greatly on PCI-E 3.0, and it's not unimaginable that these much more powerful cards would face similar issues even at x16.
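For reference, the theoretical link numbers are easy to work out (quick Python sketch; these are just the standard per-lane PCI-E rates after 128b/130b encoding overhead, not measurements):

```python
# One-way theoretical PCI-E bandwidth: 8 GT/s (gen 3) or 16 GT/s (gen 4) per lane,
# with 128b/130b encoding, divided by 8 bits per byte to get GB/s.
PER_LANE_GBPS = {"3.0": 8 * (128 / 130) / 8, "4.0": 16 * (128 / 130) / 8}

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    return PER_LANE_GBPS[gen] * lanes

for gen in ("3.0", "4.0"):
    for lanes in (8, 16):
        print(f"PCI-E {gen} x{lanes}: ~{link_bandwidth_gbs(gen, lanes):.1f} GB/s")
```

Which is roughly why the x8 Navi cards hurt so much on 3.0 boards: they were stuck at about half of what a full 3.0 x16 link offers.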

I hope their driver stack is rock solid now, though, and since the same architecture is used in the consoles, I have higher hopes of them achieving that.

Currently it does seem that 6800 XT is great for higher refresh rate 1440p, where DLSS is less relevant, and 3080 maybe has the edge overall at 4K.

@MacLeod Your better experience with G-Sync is explained by the stricter requirements for the badge. There are plenty of great FreeSync monitors, but garbage hardware can also wear the badge without issue as long as it sticks to the minimum specs. So studying reviews is required (though IMO buying anything expensive should be thoroughly researched anyway).
 
The price of the 6800 is too damn high!

I don't think so. I gave the short answer before but the longer answer is more in depth.

Firstly, it obviously has double the amount of VRAM (of the same GDDR6 type), so that costs money. However, as noted yesterday by Hardware Unboxed, the 3070 performs up to 37% slower than the 3080 at 4K. Those 4K results are hampered by VRAM limitations: either the VRAM isn't enough (see Doom Eternal) or it hasn't got enough bandwidth for 4K. When not hampered by those issues, the 3070 is only up to 28% slower than the 3080.
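(Quick aside on the percentage framing, since "X% slower" and "Y% faster" aren't the same number; trivial Python, nothing measured by me:)

```python
# If card A is X% slower than card B (A = B * (1 - X)),
# then B is 1/(1 - X) - 1 faster than A.
def equivalent_faster(slower_fraction: float) -> float:
    return 1.0 / (1.0 - slower_fraction) - 1.0

for pct in (0.28, 0.37):
    print(f"3070 being {pct:.0%} slower == 3080 being ~{equivalent_faster(pct):.0%} faster")
```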

All of which basically means that the 3070 is not a 4K card. It costs (just checked the only source I can) £529. So if Nvidia were meeting their MSRP I would likely agree with you: the 6800 costs too much. However, I find £529 for a 1440p card (because it is not a true 4K card, as can be seen by Nvidia pitching it at 1440p in their reviewer's guide) too much. Sure, it is faster than the 2080 Super and costs a bit less, and sure, it performs as well as a 2080 Ti at 1440p (but not 4K). However, there were already two fantastic 1440p cards (the 2070 Super and 2080 Super). If you desperately need the extra performance at 1440p right now, it may be worth buying. Buying it as a 4K card, though, would be very risky.

So that brings us back to the 6800. This will not suffer those issues: it has more memory bandwidth and twice as much VRAM. Thus, it *is* a bona fide 4K card, which the 3070 could easily have been had Nvidia not derped the VRAM. I guess they did not want it hurting their 3080 sales, but we all know how much those cost.
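The raw numbers back that up, assuming I've got the specs right (256-bit at 14Gbps on the 3070 vs 256-bit at 16Gbps on the 6800, before you even count the 128MB Infinity Cache):

```python
# Raw GDDR6 bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(f"RTX 3070: {gddr6_bandwidth_gbs(256, 14):.0f} GB/s, 8GB")   # 448 GB/s
print(f"RX 6800:  {gddr6_bandwidth_gbs(256, 16):.0f} GB/s, 16GB")  # 512 GB/s, plus 128MB Infinity Cache
```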

So for me? I would rather have seen the 3070 ship with at least 10GB of VRAM to make it a proper all-out 4K card and cost what the 6800 does. Or in other words? I would rather buy the 6800.

Apparently GN say the reason it's that expensive is that AMD decided to use SAM on it. To me, though? Like I say, I just believe it's a proper 4K card where the 3070 is not.

I just wish Nvidia had released it with more VRAM as a proper 4K card. 4K gaming was supposed to be getting much more affordable, and now they have gone and done this. 3080s cost about £800 at the moment, which is no cheaper than many past 4K cards; in the case of the 1080 Ti it's considerably more.

I also have a feeling Nvidia will ditch Samsung and go back to TSMC as quickly as they can, which concerns me if the drivers need to be different from the Samsung Ampere ones. Either way, they cannot just continue on with Samsung, because if AMD drop prices after launch they will be absolutely screwed.
 
Pretty hot take to call the 3070, which can maintain 4k60 at high settings in most modern games, "not a 4K card".
 
I'm not going to spend hours and hours going through it all again.

8gb is not even enough now, and you can't even get one yet. What do you think will happen when devs get access to double the amount of VRAM they have now?

But the simplest way of putting it? why was it not marketed as a 4k card? the reviewer's guide is for 1440p which shows it in the best light. There's a reason for that.
 
I'm not going to spend hours and hours going through it all again.
I have a feeling you will.
8gb is not even enough now, and you can't even get one yet. What do you think will happen when devs get access to double the amount of VRAM they have now?

But the simplest way of putting it? why was it not marketed as a 4k card? the reviewer's guide is for 1440p which shows it in the best light. There's a reason for that.
It is enough, but they'd rather have you pay the premium for 3080.
 
4K done via DLSS (i.e. rendered at 1440p) is no more fake (and arguably less fake) than the tricks used to produce 4K output in many great-looking console games, and it seemingly could be more or less equivalent in principle to what the Xbox Series will use. So I wouldn't say 8GB is totally not enough for a good 4K experience, just maybe not enough for a perfect native 4K experience for an extended time period.
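To put a number on why that internal render resolution matters (quick Python, just raw pixel counts, ignoring the cost of the upscale pass itself):

```python
# Pixels shaded per frame at the internal render resolution vs native 4K output.
def pixels(width: int, height: int) -> int:
    return width * height

qhd = pixels(2560, 1440)   # 3,686,400
uhd = pixels(3840, 2160)   # 8,294,400
print(f"A 1440p internal render is ~{qhd / uhd:.0%} of the native 4K pixel load")  # ~44%
```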
 
It's not enough. End of.

the VRAM allocation makes most sense for 1440p gaming. 8GB of GDDR6 memory is fine for 1440p ultra settings right now, though at 4K we are starting to see one or two titles eat more than that.

https://www.kitguru.net/components/...s/nvidia-rtx-3070-founders-edition-review/31/

Now we can argue about the nature and choice of an 8GB GDDR6 framebuffer, but in most use cases, it's enough, and we understand the choice made here; NVIDIA needs to keep that bill of materials healthy, as otherwise, this 499 USD card would have easily been 650 USD based on that 16GB decision. With future games in mind, this will turn out to become a WQHD (2560x1440) resolution card


https://www.guru3d.com/articles_pages/geforce_rtx_3070_founder_review,32.html

At the end of the day, there is only one topic to be viewed critically with the GeForce RTX 3070: the memory configuration. At 8GB, it is on par with the GeForce GTX 1070, which was released four years ago in June 2016. As of today, 8GB for WQHD, the primary resolution of the GeForce RTX 3070, is usually still enough; only Ghost Recon Breakpoint wants more at this resolution in our test course. But after four years at this level and with next-gen consoles on the doorstep, the likelihood that more memory will be useful in the future for the highest and very highest details in WQHD has increased significantly at the end of 2020. This is even more true for UHD. Apart from the memory, the GeForce RTX 3070 does everything right.



https://www.computerbase.de/2020-10/nvidia-geforce-rtx-3070-test/4/#abschnitt_fazit

should you ever feel VRAM is running out, just sell the RTX 3070 and buy whatever card is right at that time

https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/41.html

The only potential gotcha is the card’s 8GB of memory. For the vast majority of games available today, 8GB should be adequate with maximum image quality, even at high resolutions, but moving forward that 8GB of memory may require some image quality concessions to maintain smooth framerates.

https://hothardware.com/reviews/nvidia-geforce-gtx-3070-ampere-gpu-review?page=4

you only have 8GB on the RTX 3070. But this is enough for the current and even next-gen games coming if you're only playing at 1080p, 1440p, and 3440 x 1440

https://www.tweaktown.com/reviews/9...unders-edition/index.html#Whats-Hot-Whats-Not

Some might have wanted to see more than 8GB of GDDR6 memory on the GeForce RTX 3070, but that shouldn’t be an issue on current game titles for 1440P gaming. If a game comes out in the future that needs more than 8GB for ultra image quality settings then the solution would be to just change the game settings

https://www.legitreviews.com/nvidia-geforce-rtx-3070-founders-edition-review_222986/15

the amount of memory on the RTX 3070 is also a point of discussion, because in 2020 the release of a video card of more than 500 euros with eight gigabytes, the same amount of video memory as its two predecessors, makes us frown somewhat

https://tweakers.net/reviews/8274/n...-edition-high-end-prestaties-voor-minder.html

Nvidia never tires of emphasizing that the memory capacities of the RTX 30 graphics cards were chosen deliberately, as they are sufficient. Our observations and measurements say otherwise; we don't look at the margin, after all. In fact, the GeForce RTX 3070 8GB shows symptoms of memory deficiency a little more often in the WQHD intended for it than the GeForce RTX 3080 10GB does in Ultra HD. Both models are supported by gracious streaming systems, which address the graphics memory up to a percentage upper limit and also simply omit texture details. We are really not telling loyal PCGH readers anything new here. Let me tell everyone else: what is still enough today may be too little tomorrow. We are about to see a new generation of consoles launch, which will bring new multiplatform games with new graphics and new standards. Although new technologies such as Variable Rate Shading and DirectStorage are in the starting blocks to increase efficiency, experience shows that it will take a long time for these ideas to penetrate the market. We are talking about years, not months. On top of that, most developers do not have the time to optimize their work for months, so there will always be games that simply run poorly. PC gamers like to deal with this problem with strong hardware, but the GeForce RTX 3070 lacks this buffer. Memory hogs like Horizon: Zero Dawn, Ghost Recon Breakpoint, Wolfenstein Youngblood and a few others are already showing where the journey is headed. Ray tracing makes things worse due to the increased memory requirement.

https://www.pcgameshardware.de/Gefo...-Ti-Release-Benchmark-Review-Preis-1359987/4/




Best one for me was tpu, just punt it when it runs oot!:p

Quoted from another forum. Nearly every tech press rep disagrees with you, as does HWUB.

https://www.youtube.com/watch?v=UFAfOqTzc18&ab_channel=HardwareUnboxed


At 14:30:

"Increasing the resolution to 4k removes any and all CPU bottlenecks. And here we can see the 3070 is between 19 and 37% slower than the 3080. Though when not limited by VRAM capacity it's only up to 28% slower"


So there you have it, 24 hours of info in one post.

If you are still a "non-believer" then it's like arguing over whether Covid is real and whether 5G sets fire to your testicles. IE - whatever, mate.
 
But who in the real world actually forces themselves to play all games at max settings?

Surely "Not a 4K card" means "You can't have a good experience at 4K output res with this card and contemporary games" rather than just "It takes more of a performance hit on 4K than some other over provisioned cards do"
 
But who in the real world actually forces themselves to play all games at max settings?

Right. Then we get into the "drop the settings!" argument.

However, if you do that? You may as well buy an XBSX, as it will be doing the exact same thing for 4K. IE, a £529 card with £500 more gear added to it to make it run, then £x for a monitor, keyboard, mouse, desk, chair, mouse mat etc. etc.

Dropping settings is compromising. If you are gonna do that? Get a bloody console.
 
Slightly turning down some badly optimised settings whose visual effect you can't notice anyway (let's be honest, every game has them, usually a few) means you may as well buy a completely different class of device, one that could very well fail to meet your requirements from a PC in game support, setup, or other uses, could cost you more than a GPU upgrade for your existing system, and would cause you to lose your entire library?
 