AMD's RDNA 2 Silicon, CU counts and clock speeds reportedly confirmed

I'm genuinely excited for this launch. I'm really hoping AMD will have something that revolutionizes the graphics market the same way they did with the CPU market.
 
If these numbers are correct, no wonder Nvidia rushed the 30 series out. I'm really looking forward to the independent reviews. This could be a big game changer in the GPU market.
 
Can't say I really see the appeal of them, TBH.


No DLSS equivalent, and this will be AMD's first-generation ray tracing hardware; I'm willing to bet real money it will be slower than Nvidia's second-generation Ampere ray tracing hardware.



If all you care about is basic rasterisation performance, then fine, I guess; I'm sure they'll be a match for, or very close to, the 30xx cards at a (albeit not much) cheaper price.


...but I've grown to like my DLSS and ray tracing bells and whistles, and I wouldn't really contemplate moving to a card without those features, or with slower performance from the AMD equivalent.
 

Well, on the DLSS front, AMD have been working with both Microsoft and Sony to implement an open standard for Windows and the upcoming consoles based on Microsoft's DirectML API, i.e. direct machine learning, which is all Nvidia's DLSS really is; Nvidia just have their own version of it so they can control it more.

Ray tracing will likely be game dependent, but I'm thinking it will be between Turing and Ampere as far as performance goes. Or they could surprise us and it may be on par with Ampere; time will tell.
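For anyone unfamiliar with what "ML upscaling" means in practice: the basic contract is just "render at low resolution, reconstruct a higher-resolution frame". Here's a toy single-channel bilinear upscaler as a sketch of that contract — DLSS-style techniques replace this fixed filter with a trained network (fed motion vectors and history), and this is purely illustrative, not how DirectML or DLSS work internally:

```python
import numpy as np

def bilinear_upscale(frame, scale=2):
    """Toy bilinear upscaler for a single-channel frame.

    DLSS-style reconstruction swaps this fixed filter for a trained
    neural network, but the input/output contract is the same idea:
    low-res frame in, higher-res frame out.
    """
    h, w = frame.shape
    # Sample positions in source coordinates, clamped at the borders.
    ys = np.clip((np.arange(h * scale) + 0.5) / scale - 0.5, 0, h - 1)
    xs = np.clip((np.arange(w * scale) + 0.5) / scale - 0.5, 0, w - 1)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]   # vertical blend weights
    wx = (xs - x0)[None, :]   # horizontal blend weights
    top = frame[y0][:, x0] * (1 - wx) + frame[y0][:, x1] * wx
    bot = frame[y1][:, x0] * (1 - wx) + frame[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

The interesting part of DLSS (and presumably any AMD equivalent) is everything this sketch leaves out: temporal accumulation, motion vectors, and the trained network that decides how to blend.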
 

I think this is the case.

AMD will have an alternative to DLSS that will be an open standard that is slightly inferior but still absolutely sufficient, just like FreeSync is to G-Sync. And their ray tracing performance will be adequate, but still behind Ampere; I can imagine RT will be on par with the 2080 Ti. Seeing as that technology will be in the consoles, I can imagine it'll be enough for the majority of gamers and for the few games that do and will support RT.
 
DLSS is a big selling point for Nvidia, but it works only in a few games, so it would be rendered useless if AMD comes up with a solution that runs in every game. That would be a huge win for them.
 
It's as I expected; I've been looking forward to these cards for a long time. I spoke of 80 CUs a fair while ago, even before the leakers started talking about it.

As for DXR and DX12 Ultimate, both AMD and Nvidia will have fully featured support. Sure, Nvidia have had a head start, with hardware out for a few years now as well as their own tools and supported software versions, but AMD have the same options to implement their own in the same way; it's only code. Add to that the things they have worked on with Microsoft and Sony directly, and it should be very decent.
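Worth noting what DXR actually standardises: the API dispatches rays and the hardware accelerates traversal and intersection tests, and that core primitive is vendor-neutral. As a purely illustrative sketch (a toy CPU version, nothing like the real BVH-based hardware path), the fundamental operation is a ray-primitive intersection test:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance t to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t >= 0,
    assuming `direction` is normalised. RT hardware accelerates vast
    numbers of tests like this (against triangles, via a BVH).
    """
    # Offset the ray origin so the sphere sits at the coordinate origin.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None  # hit behind the origin doesn't count
```

Whether it's RTX hardware or RDNA 2's ray accelerators doing the work, a DXR title is ultimately issuing these intersection queries through the same API.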

Until the 28th we don't know their real plans, software support and so on, or whether current DXR games will get patched to take advantage of the RX 6000's abilities, but I'd expect they will, at least in some form.

I think these cards will release in the week starting 9th Nov onwards, in line with the consoles. They will have better supply because I think the cards are pretty much ready; the drivers I'd expect are also pretty ready, though they're bound to still be tweaking and testing them.

AMD have done really well at keeping quiet, for the sole reason that they are not letting the AIBs in on much. They will make the cards, send them to the AIBs, and only then give the AIBs the solid info, drivers and such.

Keeping them out of the loop for so long means they have kept the bulk of the info in house; we really don't have much to go on just yet, just a lot of rumours and tiny leaks.

I think my choice is going to be easy. I somewhat made it last week, but now, rather than 3080 or RDNA 2, I feel it's really come down to AMD being the choice for me regardless.

I don't expect them to be cheaper, and I have a feeling the top card might actually be a lot faster than even I expect, so it could be priced above the 3080 by a fair way.

Hopefully it won't disappoint, though. I'm still looking forward to the CPU offerings on the 8th, not because I'll buy one, but it's an option to upgrade at a later date; still happy with my Ryzen 1700 at the moment.
 

AMD have good form on things like that, though. Their image sharpening was better than Nvidia's, TressFX was better than HairWorks, and so on. They have even made their own API.

I'm not terribly fussed about RT, at least this gen. When things are fully traced, maybe it will get my attention more.
 

Same, I honestly could not give less of a poo about ray tracing. We have real-time local reflections; OK, they don't give a mirror image of what's around, but they still look good without the huge performance impact. And standard ambient occlusion methods have come a long way: they look good and, again, have no massive performance impact.

The only thing I care about with RDNA 2 is raster performance for my Index, for that 144Hz goodness ^_^
 
To see AMD's ray tracing at work, just watch some of the console trailers; we've seen plenty of it now. It looks decent, maybe just more subtle, but looking at some of them you can't say those games look bad by any means.

DXR matters to me, but I don't need some marketing-speak version like RTX; as long as what I get has the API support, I couldn't care less :D
 
Drivers have always been AMD's weak point; if they can nail those alongside the hardware, I'm quietly optimistic they will have a strong line-up this time around. Factor in, too, whether they can keep their power consumption lower than some of the Nvidia options...

Oh and availability in numbers at launch (here's hoping anyway).
 

I'd hope we'd have a choice.

Go AMD for a DLSS equivalent in ALL games with a good boost to performance.

Go NV for DLSS in some games with a larger boost to performance.

That's how I can see it going.
 
It would be very impressive if AMD have managed to train neural networks that can perform DLSS-like upscaling on general game input data, but it's definitely not impossible given enough time, money and resources. With the might of Microsoft and Azure behind them, I guess they might have had a chance to have a good crack at it. Though with all of Nvidia's resources, I don't think there's any doubt they're aiming for that eventually with DLSS too, probably as a fallback option from DLSS-trained games at first.
 
I thought not requiring game-specific training was one of DLSS 2.0's marketed features?
It's been rumoured as a possible goal for DLSS 3.0, but besides that, nah: for the user, DLSS 2.0 was essentially just DLSS 1.0 but good, though it's easier for devs to implement.
 
Oh, fair enough. The DLSS 3.0 rumours must be about changes on the integration side, to make it, practically speaking from a user's perspective, one implementation for all games.
 