Is this the future of AMD's RDNA architecture?

Read more about Project Scarlett's use of "dedicated ray tracing cores".

Nope. They'll just move on to the next gimmick. First they tried to appropriate physics to run on their GPU only instead of the CPU as god had intended. Then they tried to appropriate adaptive sync and market it as something unique to their hardware and now they're trying the same with ray tracing.
They know that it won't last. But gamers are dumb creatures who keep falling for Nvidia's scams and will continue to do so in the future. Nvidia even commits fraud every couple of months and no one seems to care. It is remarkable what they've been able to get away with.
Once consoles get this Nvidia are screwed.
I've mentioned over the past few months just how impressive I've found consoles to be, and it made a lot of sense why PC gaming felt so stale and held back: the huge progress consoles made after the Xbox 360, progress I had not witnessed.
I had a toss-up over which format to buy BL3 for, and I ended up going with the Xb1x. Why? 4K 60, apparently. This is the first time in my life that I've ever deliberately chosen to buy a game on console rather than PC. It was the same price, and I can sit on my sofa using my 65" TV. Why would I want it on PC?
I'm not joining the Epic hate campaign, but if I can't have guaranteed cloud saves and am thus limited to one PC, then I'm just going to buy it on console.
Anyway, I digress. Sorry.
As soon as RT comes along on consoles, IMO the PC is going to be in big trouble. Nobody, not even those who bought them, likes the 20-series prices, and this will just hammer that home.
Also, whilst I'm here ranting and moaning: I heard Epic Games were charging devs far less than Steam. I thought this would be good for gaming, but prices have increased hugely.
Good PC games are now £50-£60, yet there's no license fee. It was actually cheaper for me to buy for the 1X, tbh.
This, IMO, was the main reason why PC gaming made a resurgence, but that will soon die out if the prices become a joke.
But yeah, fundamentally PC and the fixed home consoles use almost exactly the same hardware architecture/tech now, just with a different hierarchy. That hierarchy does mean consoles will always be able to deliver better value for money with exactly the same tech, albeit at the cost of modularity/upgradability: consoles can use large APUs with GDDR memory thanks to their soldered connections, whereas sockets ruin this but, most would agree, are fundamental to enthusiast PCs. Discrete hardware will never be able to compete on value.
As for dumb? I'd say I'm anything but. That's why I'm back on consoles. Because that's where the real progress has been happening. Progress I find far more exciting than gimmicks.
You're fooling yourself, man! Consoles don't do 4K 60fps; they do 1080p that can dynamically scale up to something like 1600p when you're looking into the sky, and then that image is upscaled to 4K. You'd notice the downgrade in IQ if you weren't sitting far away from the screen.
You can achieve the same thing on PC using AMD's or Nvidia's filter library: enable the sharpen filter, then go into your game's settings and set something like 75% rendering resolution. Even up close to the monitor you'll have a hard time noticing the difference, and you can go lower if you output to your TV and sit far away from it; you won't notice the difference, the same way you don't when playing on a console. That's upscaling (a rough sketch of the idea follows below).
I personally don't usually like consoles because, to keep framerates up, they also tend to give up on anti-aliasing, and I hate aliased edges. But yeah, you can get the same or even better performance than consoles with a low-end PC by using the same tricks consoles do. The output is 4K in resolution, but it was rendered at less than 4K, and that's why it has good performance.
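Not from the post above, just a rough CPU-side toy of the trick being described: render at a reduced internal resolution, upscale to the output size, then run a sharpen pass to hide the softness. Real games do this on the GPU (e.g. AMD's RIS or Nvidia's Freestyle sharpen filter); the greyscale image and 75% scale factor here are purely for illustration.

```cpp
// Render low, upscale, sharpen: a toy greyscale version of console-style
// upscaling. C++17; no external libraries.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

using Image = std::vector<float>; // row-major greyscale, values in [0,1]

// Bilinear upscale from (sw x sh) to (dw x dh).
Image upscale(const Image& src, int sw, int sh, int dw, int dh) {
    Image dst(dw * dh);
    for (int y = 0; y < dh; ++y)
        for (int x = 0; x < dw; ++x) {
            float fx = (x + 0.5f) * sw / dw - 0.5f;
            float fy = (y + 0.5f) * sh / dh - 0.5f;
            int x0 = std::clamp((int)std::floor(fx), 0, sw - 1);
            int y0 = std::clamp((int)std::floor(fy), 0, sh - 1);
            int x1 = std::min(x0 + 1, sw - 1), y1 = std::min(y0 + 1, sh - 1);
            float tx = std::clamp(fx - x0, 0.0f, 1.0f);
            float ty = std::clamp(fy - y0, 0.0f, 1.0f);
            float top = src[y0 * sw + x0] * (1 - tx) + src[y0 * sw + x1] * tx;
            float bot = src[y1 * sw + x0] * (1 - tx) + src[y1 * sw + x1] * tx;
            dst[y * dw + x] = top * (1 - ty) + bot * ty;
        }
    return dst;
}

// Unsharp mask: out = in + amount * (in - blurred(in)). This is roughly the
// idea behind the driver-level sharpen filters mentioned above.
Image sharpen(const Image& src, int w, int h, float amount) {
    Image dst(w * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f; // 3x3 box blur around the pixel
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    sum += src[std::clamp(y + dy, 0, h - 1) * w +
                               std::clamp(x + dx, 0, w - 1)];
            float v = src[y * w + x];
            dst[y * w + x] = std::clamp(v + amount * (v - sum / 9.0f), 0.0f, 1.0f);
        }
    return dst;
}

int main() {
    const int outW = 1920, outH = 1080;  // use 3840x2160 for the 4K case
    const int inW = outW * 3 / 4, inH = outH * 3 / 4; // "75% rendering resolution"
    Image frame(inW * inH);
    for (int y = 0; y < inH; ++y)        // checkerboard stand-in for a rendered frame
        for (int x = 0; x < inW; ++x)
            frame[y * inW + x] = ((x / 8 + y / 8) % 2) ? 0.8f : 0.2f;
    Image up = upscale(frame, inW, inH, outW, outH);
    Image out = sharpen(up, outW, outH, 0.6f);
    std::printf("rendered %dx%d, presented %dx%d (%zu px)\n",
                inW, inH, outW, outH, out.size());
}
```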
but it does seem a very strange coincidence that stuff like Crossfire and SLI support seemed to fall off a cliff when the Xbone and PS4 came out. Why? Because they are x86, so they had to do 0 work to get it running on PC, compared to before where they had to do the work.
I think the drop in popularity of mGPU use that led to the drop in focus was a combination of two things, both of which came before the current-gen console launch and were already seen back then as the nail in the coffin for a fading technology. First, truly large-die GPUs had started to exist with Kepler (Titan) and GCN (Hawaii), so mGPU's "sensible price range" increased significantly beyond what most people spend on PCs (combined with the removal of SLI headers on Nvidia's mid-range-and-below cards). Second, all the research from around 2012 onwards into frame timings, and the frame consistency and latency issues inherent to AFR setups, effectively killed its popularity for mid/low-end cards, which is where those issues were exacerbated (a toy illustration of this is below). Though indeed there was more optimism regarding the viability of some alternatives to AFR for mGPU use back then, these all seemed to fall short or require very specific games in practice.
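To make the frame-pacing point concrete, here's a toy simulation (my own invented numbers, not from any real benchmark) of two GPUs doing alternate-frame rendering: throughput doubles, but the gaps between finished frames alternate between tiny and huge, which is the micro-stutter that research measured.

```cpp
// Toy AFR micro-stutter model: two GPUs, each taking 30 ms per frame, fed by
// a CPU that can submit every 2 ms. Average frame time halves versus one GPU,
// but delivery alternates short/long. C++17, no external libraries.
#include <algorithm>
#include <cstdio>

int main() {
    const double renderMs = 30.0; // per-GPU render time for one frame
    const double cpuMs = 2.0;     // CPU needs 2 ms to prepare each submission
    double gpuFree[2] = {0.0, 0.0};
    double cpuTime = 0.0, lastDone = 0.0;
    for (int f = 0; f < 9; ++f) {
        int gpu = f % 2;                                // alternate frames
        double start = std::max(cpuTime, gpuFree[gpu]); // wait for CPU and GPU
        double done = start + renderMs;
        gpuFree[gpu] = done;
        cpuTime = start + cpuMs;
        if (f > 0) // prints 2 ms, 28 ms, 2 ms, 28 ms, ... (average 15 ms)
            std::printf("frame %d delivered after %.0f ms\n", f, done - lastDone);
        lastDone = done;
    }
}
```

The average says 66 fps, but half the frames arrive 2 ms after the previous one and half arrive 28 ms after, which looks far worse in motion than a steady 33 fps.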
RTX is one tech available on a small amount of cards supplied by one manu. Once Intel enter the market RT will become even more of a hassle to make games for. And as I said, the last PC exclusive (because Control isn't even an exclusive, even though it supports RT) was Crysis, and look how long ago that was.
I have to disagree with this one. When Intel and AMD bring gaming-orientated raytracing support to GPUs, it will be through the DXR API that they and Nvidia developed exactly for this purpose, and that all currently available raytracing games use. Microsoft took extreme care with DXR to make it completely hardware agnostic, to the degree that you could run the API on the CPU right now if you wanted, although you'd get a very slow slideshow in all but basic few-polygon demos. There will need to be some new code paths, without a doubt, but that would be a fraction of the work of a full implementation, and the similarities between Turing and RDNA are now numerous, as is the overall approach to accelerating raytracing. Given the engineers leading Intel's GPU development, it's unlikely their arch will be a world apart either, especially as diverging would so significantly hinder uptake with modern APIs.
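For what it's worth, here's roughly what that hardware-agnostic design looks like from the developer's side: DXR support is a standard D3D12 feature query, with no vendor-specific API involved, so an engine picks its code path the same way on any GPU. A minimal sketch (Windows only, link against d3d12.lib):

```cpp
// Probe for DXR support through the standard D3D12 feature check.
// Any vendor that reports a raytracing tier runs the same DXR calls;
// everything else gets the raster fallback.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device5> device; // ID3D12Device5 carries the DXR entry points
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("no suitable D3D12 device");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts, sizeof(opts))) &&
        opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::puts("DXR tier reported: take the raytraced code path");
    } else {
        std::puts("no DXR tier: rasterise (or use WARP for a slow software run)");
    }
    return 0;
}
```

The same check is what lets the "very slow slideshow" case work: the WARP software adapter reports a raytracing tier too, it's just nowhere near real-time.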
You missed my point entirely. Only their gimmick doesn't work when it costs twice as much as a full gaming console.
Where's 3D Vision? And PhysX? And so on? They're gone because they didn't catch on (as money spinners, and you can add G-Sync to that also).