  #1  
29-07-19, 03:40 PM
WYP
News Guru
 
Join Date: Dec 2010
Location: Northern Ireland
Posts: 15,994
Ray tracing will be required by AAA games in 2023

Will Nvidia's guess hold true?



Read more about Ray tracing becoming a requirement for AAA games in 2023.

__________________
Twitter - @WYP_PC
  #2  
29-07-19, 03:59 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,442
Yeah, I think that's reasonable. It'll be a supported feature in most new hardware, including mainstream devices, from 2020, and almost certainly in most of the AAA games coming out "Holidays 2020". Meanwhile, there are lots of optimisations that can bring RT performance closer to traditional shading performance if you're willing to give up rendering certain elements traditionally altogether (e.g. by switching to voxel modelling for all static geometry).
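To illustrate why voxelised static geometry is cheap to ray trace: a ray can be marched through a uniform voxel grid with a simple stepping loop (an Amanatides & Woo style 3D DDA), where each step is just a comparison and an addition rather than a full triangle intersection. A minimal sketch, with all names hypothetical and the grid represented as a set of solid cell coordinates:

```python
import math

def trace_voxel(grid, origin, direction, max_steps=256):
    """March a ray through a uniform boolean voxel grid (3D DDA,
    Amanatides & Woo style) and return the first solid cell, or None."""
    step, t_max, t_delta = [], [], []
    for axis in range(3):
        d = direction[axis]
        if d > 0:
            step.append(1)
            t_max.append((math.floor(origin[axis]) + 1 - origin[axis]) / d)
            t_delta.append(1 / d)
        elif d < 0:
            step.append(-1)
            t_max.append((origin[axis] - math.floor(origin[axis])) / -d)
            t_delta.append(1 / -d)
        else:
            step.append(0)
            t_max.append(math.inf)
            t_delta.append(math.inf)
    cell = [int(math.floor(c)) for c in origin]
    for _ in range(max_steps):
        if tuple(cell) in grid:          # solid voxel: we have a hit
            return tuple(cell)
        axis = t_max.index(min(t_max))   # step across the nearest cell boundary
        cell[axis] += step[axis]
        t_max[axis] += t_delta[axis]
    return None
```

A real renderer would use a sparse hierarchy rather than a flat grid, but the inner loop stays this simple.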

Hardware vendors will want to improve performance at "full" settings in the games coming out from 2020 onward, which almost certainly means silicon allocation will shift towards RT over time, since there's much more to be gained in that sphere (i.e. with a little maturity it'll be cheaper, in both performance and development time, to make fancy graphics gains with the RT hardware than by throwing exponentially more traditional shader grunt at ever more complex "tricks"). Then the following wave of games will want to make better use of the latest hardware by leaning more on all that RT allocation, and this repeats until traditional shader support is mostly a legacy consideration. It's a slow feedback loop by nature, but a pretty unstoppable one; it's essentially the same path as the switch from 2D hardware accelerators to 3D, which you could argue was the last time the graphics pipeline had a shift this large.
  #3  
29-07-19, 04:37 PM
NeverBackDown
AMD Enthusiast
 
Join Date: Dec 2012
Location: With the Asguardians of the Galaxy
Posts: 16,283
I'm not sure. Not all AAA games need RT. For example does Total War or really any strategy game need it? I don't think so. I think the big name titles probably will but I don't think it's going to be a necessity for most games unless the hardware advances so much that RT has as much of an impact as TXAA or something like that.
__________________
I am Iron Man.
  #4  
29-07-19, 04:39 PM
Bartacus
OC3D Elite
 
Join Date: Apr 2013
Location: Ottawa, ON, Canada
Posts: 2,316
Not sure about "requirement" but I can see a lot of devs using ray tracing "optionally" by that time.
__________________
Guts: Ryzen 3900X / GB Aorus Master / 16GB TridentZ 3600CL15 / Zotac GTX 1080Ti x2 / EVGA 1000W PSU / Case Labs M8
Storage: Corsair Force MP600 1TB NVME / 6TB RAID0 HDD array / 4TB RAID0 SSD array
Water Cooling: Watercool HeatKiller IV blocks on CPU & GPUs / 4 360 rads + 240 rad / Heatkiller reservoir / dual D5 pumps

  #5  
29-07-19, 04:57 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,442
Quote:
Originally Posted by NeverBackDown View Post
I'm not sure. Not all AAA games need RT. For example does Total War or really any strategy game need it? I don't think so. I think the big name titles probably will but I don't think it's going to be a necessity for most games unless the hardware advances so much that RT has as much of an impact as TXAA or something like that.
Big, somewhat sparse open terrains are great for efficient RT if designed for it, tbf, and would allow relatively cheap realistic water, clouds and shadows with even basic RT use. And if that hardware is already available (in, say, the consoles, for strategy games there), then leaning on it frees up more traditional shader power for other things too.
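For a sense of how cheap "basic RT use" shadows are: it's one occlusion ray per shade point toward the light, and any hit along that segment means the point is in shadow. A minimal sketch against analytic spheres (all names hypothetical; a real renderer would trace against a BVH of the scene instead, but the per-point cost stays a single ray):

```python
import math

def shadow_ray_blocked(point, light_pos, spheres, eps=1e-4):
    """One occlusion ray from a shade point toward the light: if any
    sphere intersects the segment in between, the point is shadowed."""
    d = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(c * c for c in d))
    d = [c / dist for c in d]                     # normalised ray direction
    o = [p + eps * c for p, c in zip(point, d)]   # offset to avoid self-hits
    for centre, radius in spheres:
        # Quadratic for |o + t*d - centre|^2 = radius^2 with unit d
        oc = [a - b for a, b in zip(o, centre)]
        b = sum(a * c for a, c in zip(oc, d))
        c = sum(a * a for a in oc) - radius * radius
        disc = b * b - c
        if disc < 0:
            continue
        t = -b - math.sqrt(disc)                  # nearest intersection
        if eps < t < dist:                        # hit between point and light
            return True
    return False
```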

I think it will come down to effective use of silicon on fixed-hardware devices, at least after the consoles have been around for a few years, though only for those AAA console-port-ish games at that time.
  #6  
29-07-19, 06:08 PM
Kourgath223
Member
 
Join Date: Aug 2016
Location: United States
Posts: 102
Quote:
Originally Posted by Bartacus View Post
Not sure about "requirement" but I can see a lot of devs using ray tracing "optionally" by that time.
This! I think 2023 is wishful thinking for when Ray tracing will be a requirement.

Since they use the term "requirement", which to me implies they think your card will need to support ray tracing to even run a AAA game after 2023, rather than something like "it'll be a regular optional feature", I have to ask:
Do we really expect AMD, Nvidia, and Intel to have lower end hardware that has at least passable support for Ray tracing?

Personally I think the answer is no, considering Nvidia's current mid-range offerings struggle with ray tracing even when it's only used for lighting and/or shadows with only a few rays, where more rays would presumably create a more realistic picture (though presumably you eventually hit a point of diminishing returns).
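On the diminishing-returns point: stochastic RT effects converge like Monte Carlo estimates, so image noise falls roughly as 1/sqrt(rays), meaning quadrupling the ray count only halves the noise. A toy demonstration with binary shadow rays (the visibility value and all names are made up for illustration):

```python
import random
import statistics

def noise_at(n_rays, true_visibility=0.5, trials=2000, seed=1):
    """Std-dev of a Monte Carlo visibility estimate that fires n_rays
    binary shadow rays per pixel (the true answer is true_visibility)."""
    rng = random.Random(seed)
    estimates = [
        sum(rng.random() < true_visibility for _ in range(n_rays)) / n_rays
        for _ in range(trials)
    ]
    return statistics.pstdev(estimates)

for n in (1, 4, 16, 64):
    print(n, round(noise_at(n), 3))   # noise shrinks roughly as 1 / sqrt(n)
```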
  #7  
29-07-19, 06:12 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,442
Nah, they said "the first AAA games to require raytracing will release in 2023", so the statement implies it will be rare even then. AMD said they wouldn't bring out raytracing until it can go across the whole stack, and we know they're bringing out raytracing designs for multiple consoles next year. We also know Intel's Xe architecture has raytracing & DXR (gaming RT) hardware support, set for release next year. To be honest, given that even the GTX 1660 has a sort-of compute-accelerated DXR fallback, I think we can assume all of Nvidia's next-gen cards will have at least basic DXR support for compatibility reasons too.
  #8  
30-07-19, 02:28 AM
Kleptobot
Member
 
Join Date: Dec 2012
Location: Melbourne Australia
Posts: 246
I'm more curious to see how current games will perform in 10 years' time, when graphics architectures have been optimised for RT and no longer have current levels of shader performance.
__________________
delidded 3570K (watercooled)
hd7970 (watercooled)
8GB Gskill XM
3*140 SR1 Rad
all stuffed in an FD arc midi
  #9  
30-07-19, 12:47 PM
demonking
OC3D Crew
 
Join Date: Sep 2011
Location: UK
Posts: 894
Quote:
Originally Posted by Kleptobot View Post
I'm more curious to see how current games will perform in 10 years' time, when graphics architectures have been optimised for RT and no longer have current levels of shader performance.
10 years is a very long time at the rate things are developing. I can't even fathom what hardware will be like in 10 years, though Intel might still be on 10nm+++++, haha.

Seriously though, look at the way it's gone: 10 years ago 4GB was the norm and considered ideal for a gaming rig, paired with a GTX 260 (ish).
We'll be working in petabytes and petaflops by then, if not qubits, though I doubt quantum computing will be in homes in 10 years. Then again, who knows.
  #10  
30-07-19, 01:39 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,442
GPUs are a lot more generalised nowadays, so a lot of the units useful for traditional shading have found plenty of other uses when they can be exposed properly; the bulk of the number crunching comes down to FMA vector FP units, for instance, so while they might not see huge growth, they're not going to start going backwards either. Part of AMD's first-gen raytracing hardware optimisation is essentially repurposing the texture mapping units as raycasters, as these are quite similar tasks. Even once the "hybrid era" of rendering for AAA videogames ends, a very long time from now, a lot of the hardware grunt required to do shading reasonably will still be in there; it might just take a bit of software to emulate a couple of things on the CPUs of that era, kinda like how things went with 2D acceleration.
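To make the texture-unit point concrete: AMD's described approach puts ray intersection logic next to the texture units, and the core operation that hardware accelerates is the classic ray-box "slab test" that a BVH traversal runs millions of times per frame. A sketch of that test, assuming the caller precomputes 1/direction per component once per ray (as the hardware effectively does):

```python
def ray_aabb(origin, inv_dir, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?
    This is the per-node check a BVH traversal (or a hardware ray
    accelerator) repeats over and over. inv_dir holds 1/direction
    per component, assumed non-zero here for simplicity."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        # Entry/exit distances for this pair of parallel slab planes
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far   # overlap of all three slab intervals
```

The arithmetic is just subtracts, multiplies, mins and maxes per axis, which is the kind of fixed-function interpolation-style work texture hardware is already built around.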

Similarly, quantum computing won't replace digital computing either. It's amazing at some things, but it is inherently slower at many of the tasks and algorithms we use day to day. If it did come to homes, it'd be with as-yet-unknown practical applications (most likely encryption/security related, though) and as a co-processor or something integrated into existing digital systems. Of course there'd be huge technical challenges to reach that point, particularly around how we'd have to cool those things to keep them accurate and stable, and it's decades off.