Ray tracing will be required by AAA games in 2023

Yeah, I think that's reasonable. It'll be a supported feature in most new hardware, including mainstream devices, from 2020, and almost certainly in most of the AAA games coming out "Holidays 2020". Meanwhile there are lots of optimisations you can do to bring RT performance a little closer to traditional shading performance if you're willing to abandon traditional rendering for certain elements entirely (e.g. by switching to voxel modelling for all static geometry).
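
To illustrate the voxel point, here's a rough sketch (not from any particular engine; the grid layout, names and sizes are all made up) of why voxelised static geometry is cheap to trace: there's no BVH to build or walk, just a fixed-cost 3D DDA step per cell in the Amanatides & Woo style:

```cpp
// Toy voxel-grid ray march: each step is a couple of compares and adds,
// no tree traversal and no acceleration-structure rebuilds for static scenes.
#include <cmath>
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

struct VoxelGrid {
    int nx, ny, nz;             // grid resolution
    float cell;                 // world-space size of one voxel
    std::vector<uint8_t> solid; // 1 = occupied, 0 = empty

    bool occupied(int x, int y, int z) const {
        if (x < 0 || y < 0 || z < 0 || x >= nx || y >= ny || z >= nz) return false;
        return solid[(z * ny + y) * nx + x] != 0;
    }
};

// March a ray (origin o assumed inside the grid, unit direction d) and return
// true at the first solid voxel, writing its coordinates to hx/hy/hz.
bool traceVoxels(const VoxelGrid& g, Vec3 o, Vec3 d, int& hx, int& hy, int& hz) {
    int x = int(std::floor(o.x / g.cell));
    int y = int(std::floor(o.y / g.cell));
    int z = int(std::floor(o.z / g.cell));

    int stepX = d.x > 0 ? 1 : -1, stepY = d.y > 0 ? 1 : -1, stepZ = d.z > 0 ? 1 : -1;

    // Distance along the ray to the next grid plane on each axis.
    auto boundary = [&](float p, float dir, int i, int s) {
        float next = (i + (s > 0 ? 1 : 0)) * g.cell;
        return dir != 0 ? (next - p) / dir : INFINITY;
    };
    float tMaxX = boundary(o.x, d.x, x, stepX);
    float tMaxY = boundary(o.y, d.y, y, stepY);
    float tMaxZ = boundary(o.z, d.z, z, stepZ);
    float tDeltaX = d.x != 0 ? g.cell / std::fabs(d.x) : INFINITY;
    float tDeltaY = d.y != 0 ? g.cell / std::fabs(d.y) : INFINITY;
    float tDeltaZ = d.z != 0 ? g.cell / std::fabs(d.z) : INFINITY;

    for (int i = 0; i < g.nx + g.ny + g.nz; ++i) {          // bounded march
        if (g.occupied(x, y, z)) { hx = x; hy = y; hz = z; return true; }
        if (tMaxX < tMaxY && tMaxX < tMaxZ) { x += stepX; tMaxX += tDeltaX; }
        else if (tMaxY < tMaxZ)             { y += stepY; tMaxY += tDeltaY; }
        else                                { z += stepZ; tMaxZ += tDeltaZ; }
    }
    return false;
}
```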

Hardware vendors will want to improve performance in the games coming out from 2020 onwards at their "full" settings, which almost certainly means silicon allocation will shift towards RT over time, because there's much more to be gained in that sphere (i.e. with a little maturity it'll be cheaper, in both performance and development time, to make fancy graphics gains using the RT hardware than by throwing exponentially more traditional shader grunt at ever more complex "tricks"). The games that follow will then want to make better use of the latest hardware by leaning more on all that RT allocation, and this repeats until traditional shader support is a mostly legacy consideration. It's a slow feedback loop by nature but a pretty unstoppable one; it's essentially the same path as the switch from 2D hardware accelerators to 3D, which you could argue is the last time the graphics pipeline had a shift this large.
 
I'm not sure. Not all AAA games need RT. For example, does Total War or really any strategy game need it? I don't think so. I think the big-name titles probably will, but I don't think it's going to be a necessity for most games unless the hardware advances so much that RT has no more of a performance impact than TXAA or something like that.
 

Big, somewhat sparse open terrains are great for efficient RT if designed for it, tbf, and would allow for relatively cheap realistic water, clouds & shadows with even basic RT use. And if that hardware is already available (in, say, the consoles, for the strategy games running there), then leaning on it frees up more traditional shader power for other things too.
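
For a feel of how little "basic RT use" means for something like sun shadows, here's a toy sketch (the sphere "scene" and the names are purely illustrative, not any engine's API): one occlusion ray per shaded point, and it only needs an any-hit answer, so it can bail at the first blocker, which is exactly where sparse open scenes shine:

```cpp
// One shadow ray per shaded point against a toy scene; any-hit queries can
// stop at the first intersection, so few occluders means little work.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 centre; float radius; };

// Any-hit occlusion test: returns as soon as *anything* blocks the ray.
bool occluded(const std::vector<Sphere>& scene, Vec3 o, Vec3 d, float maxT) {
    for (const Sphere& s : scene) {
        Vec3 oc = o - s.centre;
        float b = dot(oc, d);
        float c = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;                    // assumes d is normalised
        if (disc < 0.0f) continue;
        float t = -b - std::sqrt(disc);
        if (t > 0.0f && t < maxT) return true;     // early out on first hit
    }
    return false;
}

// One shadow ray toward the sun: 0 = in shadow, 1 = lit.
float sunVisibility(const std::vector<Sphere>& scene, Vec3 point, Vec3 normal, Vec3 sunDir) {
    const float bias = 1e-3f;                      // avoid self-intersection
    return occluded(scene, point + normal * bias, sunDir, 1e6f) ? 0.0f : 1.0f;
}
```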

Like, I think it will become an effective-use-of-silicon thing on fixed-hardware devices, at least after the consoles have been around for a few years. Though only for those AAA console-port-ish games at that point.
 
Not sure about "requirement" but I can see a lot of devs using ray tracing "optionally" by that time.

This! I think 2023 is wishful thinking for when Ray tracing will be a requirement.

Since they use the term "requirement", which to me implies they think that to even so much as run an AAA game after 2023 your card will need to support ray tracing, rather than something like "it'll be a regular optional feature", I have to ask:
Do we really expect AMD, Nvidia, and Intel to have lower-end hardware with at least passable support for ray tracing?

Personally I think the answer to that is no, considering Nvidia's current mid-range offerings struggle with ray tracing, and that's with it only being used for lighting and/or shadows and only casting a few rays, where more would presumably create a more realistic picture (though you eventually hit a point of diminishing returns).
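
On the diminishing returns bit, that's just Monte Carlo behaviour: the noise falls roughly with the square root of the ray count, so going from 1 to 4 rays per pixel buys far more than going from 64 to 256 does. A contrived little standalone example (half-occluded area light, made-up numbers) shows the trend:

```cpp
// Toy demonstration that per-pixel ray traced effects converge as 1/sqrt(N):
// estimate the visibility of a light that an edge hides exactly half of,
// using varying numbers of random "shadow rays" per pixel.
#include <cmath>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<float> uni(-1.0f, 1.0f);

    const float trueVisibility = 0.5f;  // the occluder hides half the light
    const int trials = 1000;            // "pixels" to average the error over
    const int rayCounts[] = {1, 4, 16, 64, 256};

    for (int raysPerPixel : rayCounts) {
        double sumSqErr = 0.0;
        for (int t = 0; t < trials; ++t) {
            int visible = 0;
            for (int r = 0; r < raysPerPixel; ++r) {
                // Sample a random point across the light; the occluder blocks
                // everything on the x < 0 side.
                if (uni(rng) >= 0.0f) ++visible;
            }
            float estimate = float(visible) / raysPerPixel;
            sumSqErr += (estimate - trueVisibility) * (estimate - trueVisibility);
        }
        std::printf("%3d rays/pixel -> RMS shadow error %.4f\n",
                    raysPerPixel, std::sqrt(sumSqErr / trials));
    }
    return 0;
}
```

Quadrupling the ray count only halves the error each time, which is why "a few rays plus a denoiser" is the current sweet spot.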
 
Nah, they said "the first AAA games to require raytracing will release in 2023", so the statement implies it will be rare even then. AMD said they wouldn't bring out raytracing until it can go across the whole stack, and we know they're bringing out raytracing designs for multiple consoles next year at least. We also know Intel's Xe architecture has raytracing & DXR (gaming RT) hardware support, set for release next year. To be honest, given that even the GTX 1660 has sort-of compute-accelerated DXR fallback support, I think we can assume all of Nvidia's next-gen cards will have at least basic DXR support for compatibility reasons too.
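
Worth noting that from a game's point of view "supports DXR" is a single feature query. Something like the sketch below (error handling and device setup trimmed) is all it takes to pick between an RT path and a raster fallback, and it reports support on the driver-level compute fallback cards just the same as on dedicated RT hardware:

```cpp
// Query the D3D12 raytracing tier to decide whether a DXR code path can run.
#include <d3d12.h>

bool deviceSupportsDXR(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5)))) {
        return false;
    }
    // Tier 1.0 is reported both on dedicated RT hardware and on compute-based
    // driver fallbacks (e.g. the GTX 16 series), so this answers "can it run"
    // rather than "will it run fast".
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```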
 
I'm more curious to see how current games will perform in 10 years' time, when graphics architecture has been optimised for RT and no longer has current levels of shader performance.
 

10 years' time is a very long time at the rate things are developing. I can't even fathom what hardware will be like in 10 years, though Intel might still be on 10nm+++++ haha.

Seriously though, look at the way it's gone: 10 years ago 4GB was the norm and considered the ideal for a gaming rig, paired with a 260 (ish).
We'll be working in petabytes and petaflops by then, if not qubits, though I doubt quantum computing will be in homes in 10 years. Then again, who knows.
 
GPUs are a lot more generalised nowadays, so a lot of the units useful for traditional shading have found plenty of other uses too when they can be exposed properly; the bulk of the number crunching comes down to FMA vector FP units, for instance, so while they might not see huge growth they're not going to start going backwards either. Part of AMD's first-gen raytracing hardware optimisation is essentially repurposing the texture mapping units as raycasters, as these are quite similar tasks. Even once the "hybrid era" of rendering for AAA videogames ends, a very, very long time from now, a lot of the hardware grunt required to do shading reasonably will still be in there; it might just take a bit of software to emulate a couple of things on the CPUs of that era, kind of like how things went with 2D acceleration.
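
For context on the "quite similar tasks" point, the per-node work that first-gen RT hardware accelerates is tiny. A ray-vs-box slab test like the sketch below (types and names illustrative only) is just multiply-adds, min/max and compares per axis, the same flavour of arithmetic the texture filtering path already churns through:

```cpp
// Ray vs axis-aligned bounding box "slab" test, the core operation of BVH
// traversal that RT hardware evaluates at every node.
#include <algorithm>

struct Vec3 { float x, y, z; };

// invDir = 1 / ray direction, precomputed once per ray.
bool rayHitsBox(Vec3 origin, Vec3 invDir, Vec3 boxMin, Vec3 boxMax) {
    float t1 = (boxMin.x - origin.x) * invDir.x;
    float t2 = (boxMax.x - origin.x) * invDir.x;
    float tmin = std::min(t1, t2), tmax = std::max(t1, t2);

    t1 = (boxMin.y - origin.y) * invDir.y;
    t2 = (boxMax.y - origin.y) * invDir.y;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    t1 = (boxMin.z - origin.z) * invDir.z;
    t2 = (boxMax.z - origin.z) * invDir.z;
    tmin = std::max(tmin, std::min(t1, t2));
    tmax = std::min(tmax, std::max(t1, t2));

    return tmax >= std::max(tmin, 0.0f);   // hit if the intervals overlap ahead of the ray
}
```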

Similarly, quantum computing won't replace digital computing either. It's amazing at some things, but it's inherently slower at many of the tasks and algorithms we use day to day. If it did come to homes it'd be with as-yet-unknown practical applications (most likely encryption/security related, though) and as a co-processor or something integrated into existing digital systems. Of course there'd be huge technical challenges to reach that point, particularly around how we think we'd have to cool those things to keep them accurate & stable, and it's decades off.
 