World of Tanks' DirectX 11 raytracing solution is more impressive than you think

Looks cool, and a subtle way of doing it. A lot of games using RTX features seem to have made it ALL about the raytracing: millions of god rays, every slightly reflective surface turned into a mirror, etc.
It's only a matter of time before dedicated RT hardware is redundant and it's all done in software.
 
It's only a matter of time before dedicated RT hardware is redundant and it's all done in software.
Doing the BVH tree construction on the CPU like this does still take a lot of work off the GPU (essentially letting it skip some redundant tasks), but I don't think you could practically push it beyond the scale of hundreds of thousands of rays. Hardware acceleration of this stage can increase that to billions of rays at the moment, so it's a really stark difference. They've done very well to optimise their implementation to the point where those few hundred thousand rays can make a noticeable difference, but it's a super simplified, specific implementation (i.e. 1 ray/pixel shadows just on the tanks, with some denoising filters to clean up the image).
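
To make that split concrete, here's a rough sketch of the CPU-built stage: a simple median-split BVH over triangles, plus the any-hit shadow query a 1 ray/pixel effect needs. To be clear, this is absolutely not Wargaming's code, every name is made up, and I've put the traversal on the CPU too just so the snippet runs on its own; in the game that part is the GPU's job, and production builders use SAH or refit last frame's tree rather than a plain median split.

#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct V3 { float x, y, z; };
static V3 sub(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static V3 cross(V3 a, V3 b) { return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x}; }
static float dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static V3 vmin(V3 a, V3 b) { return {std::min(a.x,b.x), std::min(a.y,b.y), std::min(a.z,b.z)}; }
static V3 vmax(V3 a, V3 b) { return {std::max(a.x,b.x), std::max(a.y,b.y), std::max(a.z,b.z)}; }

struct Tri {
    V3 a, b, c;
    V3 centroid() const { return {(a.x+b.x+c.x)/3, (a.y+b.y+c.y)/3, (a.z+b.z+c.z)/3}; }
};
struct Node { V3 lo, hi; int left = -1, right = -1, first = 0, count = 0; }; // left < 0 => leaf

// Slab test: does the ray cross the node's bounding box within (epsilon, tmax)?
static bool hitBox(V3 o, V3 d, V3 lo, V3 hi, float tmax) {
    float t0 = 1e-4f, t1 = tmax;
    auto axis = [&](float ox, float dx, float lx, float hx) {
        float ta = (lx - ox) / dx, tb = (hx - ox) / dx;
        if (ta > tb) std::swap(ta, tb);
        t0 = std::max(t0, ta); t1 = std::min(t1, tb);
    };
    axis(o.x, d.x, lo.x, hi.x); axis(o.y, d.y, lo.y, hi.y); axis(o.z, d.z, lo.z, hi.z);
    return t0 <= t1;
}

// Moeller-Trumbore ray/triangle test; any hit before tmax means "occluded".
static bool hitTri(V3 o, V3 d, const Tri& t, float tmax) {
    V3 e1 = sub(t.b, t.a), e2 = sub(t.c, t.a), p = cross(d, e2);
    float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return false;
    float inv = 1.0f / det;
    V3 s = sub(o, t.a);
    float u = dot(s, p) * inv;
    if (u < 0 || u > 1) return false;
    V3 q = cross(s, e1);
    float v = dot(d, q) * inv;
    if (v < 0 || u + v > 1) return false;
    float hit = dot(e2, q) * inv;
    return hit > 1e-4f && hit < tmax;
}

struct BVH {
    std::vector<Tri> tris;
    std::vector<Node> nodes;

    // CPU-side build: bound the range, then split at the median centroid on
    // the longest axis. This is the cheap strategy; real builders do better.
    int build(int first, int count) {
        Node n; n.lo = {1e30f, 1e30f, 1e30f}; n.hi = {-1e30f, -1e30f, -1e30f};
        for (int i = first; i < first + count; ++i) {
            n.lo = vmin(n.lo, vmin(vmin(tris[i].a, tris[i].b), tris[i].c));
            n.hi = vmax(n.hi, vmax(vmax(tris[i].a, tris[i].b), tris[i].c));
        }
        int idx = (int)nodes.size(); nodes.push_back(n);
        if (count <= 2) { nodes[idx].first = first; nodes[idx].count = count; return idx; }
        V3 e = sub(n.hi, n.lo);
        int ax = e.x > e.y ? (e.x > e.z ? 0 : 2) : (e.y > e.z ? 1 : 2);
        auto key = [ax](const Tri& t) { V3 c = t.centroid(); return ax == 0 ? c.x : ax == 1 ? c.y : c.z; };
        std::nth_element(tris.begin() + first, tris.begin() + first + count / 2,
                         tris.begin() + first + count,
                         [&](const Tri& l, const Tri& r) { return key(l) < key(r); });
        int l = build(first, count / 2), r = build(first + count / 2, count - count / 2);
        nodes[idx].left = l; nodes[idx].right = r;
        return idx;
    }

    // Shadow query: walk the tree and return true on the FIRST hit found.
    bool occluded(V3 o, V3 d, float tmax) const {
        std::vector<int> stack{0};
        while (!stack.empty()) {
            const Node& n = nodes[stack.back()]; stack.pop_back();
            if (!hitBox(o, d, n.lo, n.hi, tmax)) continue;
            if (n.left < 0) {
                for (int i = n.first; i < n.first + n.count; ++i)
                    if (hitTri(o, d, tris[i], tmax)) return true;
            } else { stack.push_back(n.left); stack.push_back(n.right); }
        }
        return false;
    }
};

int main() {
    BVH bvh;
    bvh.tris = {{{-1, 2, -1}, {1, 2, -1}, {0, 2, 1}}}; // one "tank" triangle overhead
    bvh.build(0, (int)bvh.tris.size());
    V3 point = {0, 0, 0}, toLight = {0, 1, 0}; // ground point, straight-up sun
    std::printf("shadowed: %s\n", bvh.occluded(point, toLight, 1e30f) ? "yes" : "no");
}

The nice property for shadows is visible in occluded(): the first hit ends the ray, with no closest-hit sorting at all, which is a big part of why this particular effect is so cheap.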

Still, this is a nice example of how a very simple use of RT can create geometrically accurate effects (proper per-pixel shadows rather than shadow-map approximations) at a very low cost in games, I think. A sign of the tricks to come when the consoles get hardware acceleration and an arms race begins over who can squeeze the most out of the hardware.

[This makes me think though, maybe Intel aren't going to have BVH construction acceleration units as part of their upcoming 1st gen GPUs after all. It would put more load on the CPU, but it could still work in games with RT shading acceleration, which they seem to have already indicated is on board. I'm sure they wouldn't see another little nudge in games' CPU performance requirements as a particularly bad thing, and they could use this method via a DXR layer too.]
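
[A side note on the DXR point: the API wouldn't stand in the way of that, since an app only ever sees which raytracing tier a device exposes, never where the driver actually does the acceleration structure build. A minimal sketch of the standard capability check, assuming you already have a D3D12 device (nothing here is Intel-specific, and the interpretation in the comments is my speculation):

// The raytracing tier says nothing about HOW the driver implements
// BuildRaytracingAccelerationStructure -- fixed-function units, GPU
// compute, or a CPU-side build like the one speculated about above.
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts, sizeof(opts))))
        return false;
    // A driver servicing builds on the CPU could plausibly still report
    // D3D12_RAYTRACING_TIER_1_0 here, as long as the API semantics hold.
    return opts.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

So a first-gen Intel part with RT shading units but no build hardware could still look like an ordinary DXR device to games.]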
 
[This makes me think though, maybe Intel aren't going to have BVH construction acceleration units as part of their upcoming 1st gen GPUs after all. It would put more load on the CPU, but it could still work in games with RT shading acceleration, which they seem to have already indicated is on board. I'm sure they wouldn't see another little nudge in games' CPU performance requirements as a particularly bad thing, and they could use this method via a DXR layer too.]

You could be partly right here. Intel have been keen to say that they want to cover the entire GPU range eventually, so the potential for two different product lines may well exist. We could see the low/mid range targeted as purely rasterised, to compete against the 16 series/XT type cards, and another range with the additional hardware to compete against the 20 series and beyond. I'm hoping Intel don't go balls to the wall on their first gen and instead put a big focus on the "tock" side this time round; just don't give us tocks for the next 5 years lol.

On the whole RTRT subject: there's no argument that it will one day either be entirely a software workload or GPUs will evolve to integrate it on the fly within their main processors. I'm sure Nvidia will want this; it's cheaper to make software than hardware. Dedicated hardware is nearly always the first iteration of anything like this, until it gets refined and integrated. Take almost any example: basic GPUs are now the norm inside CPUs, PhysX originally needed an Ageia card, etc. This could well be years down the line, or we'll see what AMD has up their sleeve, as they see a lot of potential in cloud processing for this type of workload. Provided you have a good connection to the server, I suppose that's a decent way of doing it, depending on how they charge for it of course...
 