3DMark results for AMD's rumoured RX 5600 XT have appeared online

Hoping we eventually get results for AMD's BIG Navi card ^_^

Yeah, I want to see a BIG Navi card. Better still, I want to see a big RDNA 2 card. With Hardware RT and VRS it could really make Nvidia look bad. Then again, Nvidia only needs to move to 7nm to get a big performance boost.
 
That's impressive. And Time Spy doesn't always favour AMD. Vega 56 is generally about on par with a 1070Ti in games, which is about 12-15% faster than both the 1660 Super and 1660Ti. If these come in at a good price (I'd say $230-250), they're going to do well. I imagine they're going to be $250-280 though.
 
It doesn't favor AMD because for the last 10 years it was coded specifically for Nvidia. No one is breaking world records with AMD cards. Their driver team also needs to step up.

I really hope there will be a BIG AMD card. It would change so many things for the better. But they need to have a proper release and not make a mess of it like they did with every Ryzen launch, Vega, Radeon VII, etc...
 
AMD don't need huge cards. They never make money from them.

All they need to do is just continue with what they are doing now.

Make consoles which devs will support (they have no choice unless they want to lose access to millions of dollars in sales) and continue on making these devs get the games to target the hardware.

4k 30 solid is now a given on the XB1X. Next mission? 4k 60 in the next gen. Maybe throw in a bit of RT? Not sure yet if that'll be the Xbox Series X and PS5 or whether it'll be the gen after.

But the fact is nothing more than matching hardware should be needed, plus maybe 20% to make up the PC difference.

Personally I think AMD are in a much better position than Nvidia to dictate now. Both of the big consoles. That's 2-1 and that's assuming Nvidia are in every gaming pc on the market which of course they aren't.

RTX will die out, I can assure you of that. TBH it's not even very good. The best display of RT I've yet seen is that Unreal demo. It's amazing, and it ran better on the Vega 64 than my 2070s. And that's telling about the core design. I think Nvidia need to stop lopping bits off their cards to sell as something special, and concentrate on whatever AMD decide to use in the consoles. Or they'll die out, unless they can find some other market to tap into.

Sure Nvidia are holding all the cash right now but guess what? AMD hold the cards. Just like they did with Ryzen.

Consoles are also becoming more like PCs with every gen, and eventually people will realise how much they're paying to build a PC from parts compared with buying this small box that basically does it all and costs way less for the price-to-performance.

Most people jumped to PC gaming a few years ago to dodge high-priced console games. But in the last two years they're now the same price and loaded with bugs and crap to get them working.

Consoles have gotten so good with the pro and 1x that the next gen actually excites me more than pc gaming. Mostly because for the whole time DX12 has been out all of the improvements have been happening on consoles.

RDR2 runs solid 4k 30 on XB1X and upscales at times to almost 6k. That's actually sick for the power it has, and makes the mind boggle as to how the next gen consoles will be.

PC? Incremental boring stale overpriced market. Where's the fun in that?
 
I agree with most things that you have said, Alien. AMD doesn't need a 2080 Ti killer, but they need something that will compete with the 2070 Super. They are in a good position with consoles but that doesn't generate a large profit margin. They need to be competitive in the GPU market, both in gaming and production.

What I think you are wrong about is RT. It is not going to die. It will be the future of games. Hardware just hasn't caught up with the requirements to run it. Nvidia 3000 cards will be much more optimized for RT with many more RT cores. Next-gen AMD cards will have RT and consoles will support it. It is here to stay. And it has so many advantages over current "cheating" techniques.

What will happen is that consoles will have limited RT support. Pretty much how it is on PC today. Few reflections here and there, maybe some shadows and it will run in 4K-ish resolution at 30-60 FPS. On the other side the PC will have full featured RT effects with all the bells and whistles just because of the raw horse power.

Take Red Dead Redemption 2 as a present-day example of what will happen with RT in a year or two. On consoles it runs with effects on the lowest PC settings, and some are even lower than you can get them on PC. On the PC version of the game you can crank them up to 11 and give your computer a heart attack.

Edit: It is not just games that will push RT forward. Nvidia's OptiX API is used for accelerating Cycles in Blender on RTX hardware.
 
Yeah, now we know RT will be in almost every major console and graphics card launch from 2020 onwards, it's clear what direction the industry is taking. It's orders of magnitude less work for devs to use RT for things like reflections than the old methods, and now they'll actually be accurate. RT is the only way we can begin to close the uncanny valley; it's the only way we could do accurate skin subsurface scattering and such to stop humans looking plastic. There are lots of subtle ways to use RT that haven't been explored in gaming yet.
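The "orders of magnitude less work" point for reflections is easy to see in the maths: a ray-traced reflection is just the camera ray mirrored about the surface normal and traced again, instead of screen-space tricks and pre-baked cube maps. A minimal Python sketch of that mirror step (purely illustrative, not any engine's actual code):

```python
def reflect(d, n):
    """Mirror an incoming direction d about a unit surface normal n.

    This is the whole trick behind RT reflections: r = d - 2*(d.n)*n,
    then trace the ray r from the hit point and shade whatever it hits.
    """
    k = 2.0 * (d[0] * n[0] + d[1] * n[1] + d[2] * n[2])
    return (d[0] - k * n[0], d[1] - k * n[1], d[2] - k * n[2])
```

A ray coming straight down at an upward-facing floor, `reflect((0, -1, 0), (0, 1, 0))`, bounces straight back up as `(0, 1, 0)`. Screen-space methods instead have to guess what the bounced ray would see from pixels already on screen, which is why they fall apart at screen edges and miss off-screen objects entirely.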
 
AMD have cards that compete with the 2070 Super now.

Not in everything, but as I pointed out above it won't be long before we make the full switchover to console engines and they'll perform just as they do now in DX12, better than Nvidia.

With each console generation Nvidia are losing their grip. Mostly because like I said, AMD hold all the cards.

If the Vega 64 I had beats the 2070s in that unreal benchmark how does that bode for Nvidia when their RT hardware sat stagnant?

In other words, which do you think has more chance?

1. RT that runs on every system, including AMD GPUs and the two new consoles.

2. Specific PCs with specific hardware on a 1 out of 2 choice (soon to be three with Intel entering the market).

It's pretty obvious what's going to win really isn't it?

This gen of consoles will bring almost absolute parity with the PC.

Nvidia must be crapping themselves.
 
The way the DirectX 12 DXR extension layer works (it's used for almost every "RTX" game so far, and was developed with AMD and Intel too) means the companies can take quite varied hardware approaches. GPU manufacturers wrap the low-level code up in drivers, and these are exposed in a fairly hardware-agnostic way to game developers (though obviously individual code paths for different archs are still better for optimisation). So I think it's likely both companies will be bringing strong evolutions in their RT units, and in the scope of them, with each generation for quite a while yet, as RTRT ASICs are far from a mature area of development like traditional GPUs, which hit the "accelerator wall" that most ASICs hit with IPC progression a long time ago.
 

You got it wrong, mate. RT isn't Nvidia-exclusive like PhysX was. It is option 1 for all. DXR, or RT on Vulkan, is the same as any platform before (DX9, 10, 11, 12). It just runs better on a 2080 Ti than it runs on a 5700 XT. You won't need Nvidia-specific hardware. Nvidia was just the first to make cards that can run DXR.

I gave you the example of Red Dead Redemption 2. Any game today runs better on a 2080 Ti than it runs on any console. It will probably be the same in the future. The only difference is that it will be coded in DXR instead of DX11.

Nvidia isn't crapping themselves. They are laughing their asses off and stretching their wallets because they have working hardware, drivers, SDKs and they are just perfecting it while AMD and Intel have maybe a few barely working prototypes in the lab.

Sadly... RT could probably make Nvidia even more powerful than they are now.

Edit: And also a very important thing... Even though DirectX is an open standard, every single line of code that was ever written for RT was done on Nvidia's hardware, with Nvidia's drivers, with help from Nvidia, with Nvidia's money. And then AMD comes to, let's say, the Unreal devs and says: optimize for our hardware that isn't that strong and drivers that aren't that stable, and we can't give you that much money.
 
*But you need to anyway if you want access to the home console market. Seems like neither company has much reason to worry financially about RT; GPU progression had slowed to a crawl over the last 5 years anyway.
 
Yes, they will be optimized, but not to the point where RTX is ignored like Alien suggested. Games will run great for a time and then they will do what they do today: lower the details, render at a lower res, then upscale and say "See, it runs at 4K." That won't touch Nvidia.

It has slowed down because no one would dare to release a game that doesn't run on consoles. But I think we will have much bigger performance jumps from generation to generation of GPUs because RT has just started evolving.
 

In 99% of the things you do and play with your GPU the tensor cores do absolutely nothing. In the UE demo they do absolutely nothing. In fact, when that demo utilises GCN it provides better results than a 2070s because the tensor cores are sat doing nothing.

Nvidia showed us all how crap our 10 series cards were for RT in the method that requires tensor cores but I think you missed my point.

Whatever method is chosen for RT on consoles, I can guarantee you one thing: it will work on both platforms at once and it will work on AMD. That's not an assumption, that's fact, as there is no other way. The consoles will be using Navi.

So, let's say they use that UE method, right? That means tensor cores will be sat doing nothing, as I proved when benching my Vega 64 against my 2070s, and it's also why the 2070s lost.

AMD make tank GPUs with tons of GCN units which until DX12 were a waste of time. However, they knew years ago what was going into consoles and they also knew which APIs and so on they would use.

That means they'll also know what RT they'll use, and I bet it will be a method that shows up the problem of putting dedicated crap on your GPUs. If no one codes for it, it doesn't do anything.

If you really think Nvidia are not quietly sh****ng themselves, with yet another huge player about to enter the arena to steal their sales, flops in the electric car sector and nothing else but the Nintendo Switch, then you're nuts.

The next gen of consoles will bring full parity with PCs. Maybe not at ridiculous levels of detail etc., but much, much closer than now. What you also need to remember is that you could probably buy the top-spec Xbox Series X and PS5 for the same price as whatever stupid prices Nvidia will be charging for their top-end card.

I promise I won't remind you of yet another Shield flop. Honest.
 

I think you are mixing up apples and oranges. Tensor Cores are used for deep learning. They are there because Turing is re-purposed from Volta. Nvidia tried to use them for DLSS but didn't quite succeed. And you are right: 99% of the time they do absolutely nothing at all.

Real-Time Ray Tracing Acceleration requires dedicated hardware. In Nvidia's case RT Cores. They have enabled RT on CUDA mostly because of developers who don't need dedicated hardware to test the feature. It is possible to emulate RT on standard CUDA and Stream Processors but it is nowhere near as effective as it is on dedicated hardware.
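To see why dedicated hardware matters here: the inner loop of any ray tracer is millions of tiny tests like ray-triangle intersection, which RT cores run in fixed function while general shader cores would have to grind through them one instruction at a time. A rough Python sketch of that single test (the standard Möller-Trumbore algorithm, illustrative only):

```python
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore: distance along the ray to the triangle, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:          # ray is parallel to the triangle's plane
        return None
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, p) * inv      # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tvec, e1)
    v = dot(dirn, q) * inv      # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv        # distance along the ray to the hit point
    return t if t > eps else None
```

Firing a ray from (0, 0, -1) along +z at a triangle in the z = 0 plane returns the hit distance 1.0. A single frame of RT effects needs this test (plus the tree traversal that feeds it) run hundreds of millions of times, which is the gap dedicated units close.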

The next-gen AMD GPU architecture that will support RT and go in consoles will have dedicated hardware for running RT instructions.

Don't get me wrong, I like consoles. But I don't think you can run the same games on both platforms. There are games that are suitable for consoles, and others that are suitable for PC. They shouldn't be compared or mixed.

I don't think new consoles would be the downfall of Nvidia or PC. Actually, the new consoles could make a bigger gap between the two platforms. Game engines have pretty much reached the limit of what is possible with rasterization and "cheats" to achieve visuals. With consoles bringing RT to everyone, game graphics will evolve so fast. We may even see double the RT performance, from hardware and software optimizations, with each generation of GPUs from AMD, Nvidia, or Intel. And if consoles keep their 7-year renewal cycle... Yeah...
 
Every pc game I've played in the past 3 years has been a console game in drag.

The only reason that maybe you don't know this is because you don't have consoles? IDK.

Most pc games have hardly anything in the menu now with regards to settings and nearly all show you a big long intro before you can even access those settings. The reasons are simple, they're console games.

All sorts of tricks have been added to make aiming easier on a controller and all of this is in the pc version, natch.

So as far as games go they are all console games and have been since consoles went X86.

Maybe you should pick up an Xbox One X and see where all of the progression has been going for the past 5 years, because it wasn't on PC.

Yup, consoles really are that good now.
 
AMD's patents for hardware RT acceleration units were filed in 2017, after Nvidia, Intel, AMD and Microsoft had worked together to formalise the DXR spec to ensure all their software was cross-compatible.

I think you're mixing up UE (which uses DXR/hardware-accelerated RT) and the CryEngine demo that had some RT tricks that could work without hardware, but only for super simple, barely animated scenes with a tiny render distance and only one or two moving objects. Practically, any game that relies meaningfully on RT will still need hardware acceleration.

AMD's method of acceleration, which they have detailed extensively, would not work on current 5000 series cards. It requires dedicated RT hardware within the texture units for BVH acceleration, besides a complete rework of the cache hierarchy, amongst other things.
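For context on the BVH part: a BVH (bounding volume hierarchy) wraps the scene's triangles in nested boxes, and traversal is mostly ray-versus-box tests deciding which branches can be skipped. That per-node test is the classic slab method; a minimal Python sketch (illustrative only, not AMD's actual hardware design):

```python
def ray_aabb_hit(orig, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    inv_dir holds 1/direction per axis, precomputed once per ray.
    A BVH traversal unit runs this test at every node to decide
    whether a whole subtree of triangles can be skipped.
    """
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - orig[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - orig[axis]) * inv_dir[axis]
        # Keep the tightest overlap of the three slabs.
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far
```

On a diagonal ray from the origin (`inv_dir = (1, 1, 1)`), a box spanning (1, 1, 1) to (2, 2, 2) is hit, while one off to the side at (2, 5, 2)-(3, 6, 3) is rejected, pruning everything inside it without ever touching its triangles.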

You're right about consoles though: they have meant GCN has become amazingly well optimised on PC. I still play new games fine at 1080p on a HD7870XT that was £120 in 2012, thanks to the consoles. And next-gen RT will one day be incredibly optimised too, thanks to devs squeezing the most out of RDNA2's hardware RT units.
 
I am yeah. Doh /facepalm.

That's what happens when you're rushing around getting ready to leave for Christmas and posting on a forum at the same time.

See you all in the new year :D
 