DXR Ray Tracing Support is coming to Pascal and GTX Turing

Great news!

This will hopefully drive the tech to more developers, though the performance impact will be interesting to see.
Maybe my 1080 Ti can match an RTX 2070 with ray tracing on!
 
While it depends a lot on the implementation, I think Pascal's roughly 1.5x to 3x performance hit from enabling DXR in games so far means it's still kinda useless below a 1080 Ti, besides demonstrating why Nvidia didn't open it up originally. To put that in concrete terms, a game running at a locked 60 fps would drop to roughly 40 fps at a 1.5x hit and 20 fps at 3x. That will hold at least until games start shipping lighter DXR implementations, which to be fair should come as the tech matures and developers work out where the "bang for buck" lies in visual impact versus performance cost. But that maturation will also bring better use of the heavyweight ray-tracing effects, so there will presumably still be a pretty clear gap in the type and number of effects that are feasible on accelerated hardware.

I think this also makes it pretty likely AMD went the route of making their shaders better at the compute workloads these semi-software DXR layers use, rather than building dedicated execution units for it. Realistically they had to wait until the tech matured somewhat and lighter implementations started appearing before that was a viable route. (I wouldn't be hugely surprised if Navi launched with these features arriving later in a driver update; the Radeon VII now has reasonably well-accelerated DXR fallback-layer support, but it's clear work still needs to be done on the software side.)
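For context on why the API can't tell you any of this: a game detects DXR through the standard D3D12 feature query, sketched minimally below (the real D3D12_FEATURE_D3D12_OPTIONS5 check; device creation and the rest of the setup are omitted). As far as I know, the tier it reports is the same whether the driver runs rays on dedicated RT cores or emulates them in compute shaders, so a Pascal card looks identical to an RTX card at the API level and only the frame time gives it away.

```cpp
#include <d3d12.h>

// Returns true if the D3D12 device reports any DXR support at all.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Pascal's driver-based path reports TIER_1_0 just like RTX cards do;
    // nothing in this query says whether RT cores are actually present.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```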
 

It would be nice, for once, for a GPU to be pushed to its absolute limit before it's tossed in the bin. RT, imo, should learn to walk before it runs. If Pascal can do RT, it should do it for as long as it can, until it's literally on its proverbial knees.

But it never happens.

The Xbox 360 was pushed beyond its limits right until the very end. It would be nice if PC hardware was treated like that.
 
But if you had tried to use Pascal for the early implementations, the result would have been so far from realtime you'd barely be able to assess it. There's a reason devkits are often more powerful than the release consoles: it helps to get early builds of a game to something workable while the assets and settings are still unoptimised, since optimisation on a wide scale can't really start until you have something playable. It's a chicken-and-egg situation. If RT had launched on Pascal back when BFV was the only game with it, a couple of months ago, the result would have been much the same for usability, except maybe on a Titan V. Even now it's only really usable in BFV's well-optimised and fairly light version of RTX, compared to, say, Metro Exodus.
 
If I had bought a Turing card, I would be spitting feathers at this news.


Yeah, this is kind of a kick in the bollocks, but because I had to RMA my newly bought 1080 last October and the 2070 was only 20 more, I went for that instead. Given that I sometimes stream, which NVENC is amazing for, it's not a total train wreck on my end.


Buuut yeah, it's still a bit of a "wtf, Nvidia" moment, though I'm not entirely surprised either.
 