AMD demos RDNA 3 GPU and Ryzen 7000 series processor running Lies of P

A 50% increase is quite impressive if it's for rasterization. If it's for RT performance and they just aren't saying so, it's not as impressive, considering how bad they were at RT before.


Unless AMD have another 9700 Pro moment, they'll always be playing catch-up to Nvidia with ray tracing.

To be fair, until RT becomes essential (i.e. in every game), none of that matters.

If I drop resolution/settings I can max it out on the AMD card I have. Sure, it would perform far worse than Nvidia, but when there are only about three titles that actually do RT well, it really doesn't matter.

Refusing to pay £55 for DL2, I found a naughty copy the other day. Installed it, and once again the RT effects are just full of grit and artefacts. So it makes me wonder whether it's really worth bothering with at all.

It apparently makes a huge difference in Spider-Man. So I transferred that to my 2080 Ti rig on the TV, and honestly? I didn't notice any difference. I deliberately did not enable it on the desk rig either, hoping to see a massive difference. The fact is, when you are flying along on the web you are focused on where you are going, not what is passing you by.

Even without RT the game looked absolutely gorgeous.

I think the main difference this round will be price. Apparently AMD have been able to cram a lot more into a smaller die, shrinking the Infinity Cache because it didn't make enough of a difference, and thus will be able to compete on what counts: price.

And this round? Well, mining has gone, so they are going to have to behave themselves.

From all accounts this could be AMD's Maxwell moment, and that's friggin HUGE.
 
Unless AMD have another 9700 Pro moment, they'll always be playing catch-up to Nvidia with ray tracing.

AMD just needs more software engineers, using them to create features in tandem with GPU design like Nvidia does. That's really their only downfall.
 
Unless AMD have another 9700 Pro moment, they'll always be playing catch-up to Nvidia with ray tracing.
Don't worry. They don't have to play catch-up with their pricing.

If AMD aims to have a Maxwell moment, as AlienALX put it, they're going to have to offer much more aggressive price points, which I doubt will happen. Don't forget how relatively cheap the GTX 970 and the GTX 980 were.

Believe it or not, dude, a lot of retail pricing comes down to what the cards cost to make.

I know we all have these grandiose ideas about what GPUs should cost, but the fact is that the cost of making them has gone up, whether we like it or not.

Many said that Turing was totally taking the pee. And compared to Pascal? It was. However, what people failed to realise was that the core size increased massively, and RT has added all sorts of complexity that, again, means the dies need to be huge.

That's the path Nvidia have taken, whether we like it or not. And they have to sell for a profit, which in their case is a 60% margin. That won't change.

Remember, these price cuts you are seeing at the moment on all GPUs are not coming from Nvidia. It is the board partners, who bought the kits (die and VRAM) from Nvidia, trying to clear their stock. So they are the ones taking the loss.

From what I understand, Navi III is going to be cheaper to make. At the end of the day, the smaller the die, the cheaper it is to produce, both in terms of materials used and success vs failure rates. The more dies you get out of a wafer, the cheaper you can sell them. When you rely on large monolithic dies, every failed die costs you a much larger chunk of the wafer.

That is why, historically, AMD's cheaper GPUs were cut in a diamond shape: it's for cost reasons. You get more dies out of a circular wafer, with fewer of them falling off the edge into oblivion (wasted sand, basically).
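The wafer economics above can be sketched with some rough back-of-the-envelope maths. This is purely illustrative: the 300 mm wafer size is standard, but the die areas and defect density below are made-up numbers, not actual AMD or Nvidia figures, and the formulas are common textbook approximations (dies-per-wafer with an edge-loss correction, and a simple Poisson yield model).

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Common approximation: wafer area / die area, minus the partial
    # dies lost around the circular edge of the wafer.
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def yield_rate(die_area_mm2, defects_per_mm2):
    # Poisson defect model: probability a given die has zero defects.
    # Bigger dies are exponentially more likely to catch a defect.
    return math.exp(-die_area_mm2 * defects_per_mm2)

# Hypothetical comparison: a small die vs a large monolithic die,
# assuming a 300 mm wafer and 0.1 defects per cm^2 (0.001 per mm^2).
for area in (300, 600):
    total = dies_per_wafer(300, area)
    good = total * yield_rate(area, 0.001)
    print(f"{area} mm^2 die: {total} candidates, ~{good:.0f} good dies")
```

With these assumed numbers, halving the die area roughly triples the number of good dies per wafer, because you get more candidates out of the circle *and* each one is more likely to be defect-free. That is the cost argument for smaller dies in a nutshell.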

I think the "Maxwell moment" has a lot more meaning to it than just a huge performance uptick. I think it means AMD are able to make small, high-clocking dies that perform their balls off. That means they can sell them cheap, and Nvidia, with their RT dreams, will struggle to match their prices.

It all remains to be seen, of course, but if you don't give a stuff about RT because there's hardly anything to give a stuff for? Then it could work out very lucrative for AMD.