Nvidia reveals their Turing graphics architecture - Doubling Down on Ray Tracing

I doubt Turing will be used for the smaller designs. While they could get away with using binned versions at the top end of their gaming lineup, this massive die size clearly won't be economical for most gaming cards, and the layout clearly doesn't prioritise performance per unit of die area in any game currently released or expected this year. I assume RTX technology wouldn't make sense on many lower-end gaming cards this generation from a cost perspective. In theory they could release a "GTX" die around half the size while maintaining the same performance in almost every game currently out.
 

Hmm, maybe. Regardless, Nvidia will want to push Ray Tracing as hard as possible. If they can get a decent number of games to use it, AMD would be left in a terrible position. Nvidia has an incentive to get as many RTX compatible cards out there as they can.

RTX can be used through Microsoft's DXR (DirectX Raytracing) API, so they can even say they are using an industry standard, heading off the usual "proprietary standard" complaints. RTX isn't another PhysX: AMD could build their own ray tracing acceleration if they wanted to (who knows how long that would take, though).
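To make that concrete: checking for DXR support goes through the ordinary D3D12 feature query, so the same code path works on any vendor's driver that chooses to expose it. Here's a minimal sketch of mine (not taken from any official sample) of what that check looks like:

```cpp
// Minimal sketch: ask D3D12 whether the installed driver exposes DXR.
// The query itself is vendor-agnostic; any GPU/driver combination that
// implements the feature reports a raytracing tier here.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    const bool hasDxr =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &options5, sizeof(options5))) &&
        options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;

    std::printf(hasDxr ? "DXR is supported by this driver.\n"
                       : "No DXR support; stick to rasterised effects.\n");
    return 0;
}
```

Any GPU whose driver reports raytracing tier 1.0 or better gets picked up by exactly the same call, which is why the PhysX comparison doesn't really hold.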

Nvidia's gross margin is currently over 64.5%, which is insane for a company with no fabs. For perspective, Intel's gross margin is 61.2% (and they own their fabs) and AMD's is 37%. Nvidia can afford to waste a little die space.

I agree that the tech would be wasted on low-end cards, perhaps 2050/Ti-grade or lower, but I expect GTX/RTX (or whatever) 2060-grade cards and above to have both Tensor and RT cores. Nvidia needs to be able to say they offer more than just compute performance, and if devs use it, they win.

AMD hasn't revealed their alternative to RTX, if they even have one, making every game that uses it an automatic win for Nvidia and a reason for people to upgrade. We already know that the next Metro will feature Ray Tracing elements.
 
That's a good read, thanks! Impressive numbers; let's see how it holds up in the wild. Ray tracing for games is still in its infancy, I feel.
 
I think Nvidia has very little chance of getting ray tracing into games, as game devs will see it as extra work that won't run on AMD GPUs or consoles.

So basically we will see the same small performance increases in the next round of gaming GPUs.

I own a GPU that can do ray tracing; I got it nine months ago, and I don't think I will see any games that run it anytime soon.
 

The soonest game that I know of is Metro Exodus, which will release in February 2019.

https://www.youtube.com/watch?v=2NsdM1VS5u8
 

Sadly, I think ray tracing will go the same way as DX12 mGPU: it will be included if the hardware vendors give the game devs incentives to use it; otherwise it is just an added expense.
 
RT is just another tick on the marketing check list and the good little sheep will fall for it and buy them up.
 
Radeon Rays (formerly FireRays) already provides real-time ray tracing on GCN hardware via OpenCL/Vulkan for developer products, and Microsoft stated they worked closely with both AMD and Nvidia on the creation of DXR. AMD has also said they will support real-time ray tracing in games, so I don't think there's much doubt Navi will have DXR acceleration.

https://translate.google.com/transl...gs-im-grafikmenue-1803-133464.html&edit-text=

Vega is already fairly potent at ray tracing algorithms and at the denoising algorithms Nvidia's Tensor cores are used for here, so support would more likely come through further extending the instruction set of Vega's NCUs than through dedicated additional cores.
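To give an idea of what the denoising side actually does: the noisy 1-sample-per-pixel output from the ray tracer gets filtered by something edge-aware, so lighting is smoothed but geometric detail survives. Here's a deliberately crude CPU sketch of that idea; the real thing runs on the GPU (or, in Nvidia's case, as a trained network on the Tensor cores), and every name and parameter below is my own invention rather than anyone's actual implementation:

```cpp
// Crude illustration of an edge-aware denoise pass: smooth noisy per-pixel
// radiance, but weight neighbours down when their G-buffer normals disagree,
// so lighting noise is blurred away while geometric edges stay sharp.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// radiance: noisy 1-sample-per-pixel shading result, one float per pixel.
// normals:  per-pixel surface normals from the G-buffer, assumed normalised.
// Both buffers hold width * height entries, row-major.
std::vector<float> denoise(const std::vector<float>& radiance,
                           const std::vector<Vec3>& normals,
                           int width, int height, int radius = 3)
{
    std::vector<float> filtered(radiance.size(), 0.0f);
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            const int centre = y * width + x;
            float sum = 0.0f;
            float weightSum = 0.0f;
            for (int dy = -radius; dy <= radius; ++dy)
            {
                for (int dx = -radius; dx <= radius; ++dx)
                {
                    const int nx = x + dx;
                    const int ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= width || ny >= height)
                        continue;
                    const int idx = ny * width + nx;

                    // Gaussian falloff with distance from the centre pixel...
                    const float spatial = std::exp(
                        -float(dx * dx + dy * dy) / (2.0f * radius * radius));
                    // ...sharply attenuated where the normals disagree, so we
                    // do not drag lighting across geometric edges.
                    const float n =
                        std::fmax(0.0f, dot(normals[centre], normals[idx]));
                    const float w = spatial * n * n * n * n;

                    sum += radiance[idx] * w;
                    weightSum += w;
                }
            }
            filtered[centre] = weightSum > 0.0f ? sum / weightSum
                                                : radiance[centre];
        }
    }
    return filtered;
}
```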

But yeah, an RTX 2060/RX 670-tier card is the very lowest at which I'd expect to see this technology, and even then I think Nvidia would prioritise die space towards shader count much more than on the top-end Turing die.
 
RT is just another tick on the marketing check list and the good little sheep will fall for it and buy them up.

Yup, like this Dice guy who's already planning to buy a new GPU that isn't even out yet even though he has more than enough GPU power already. Sucker. :p
 
The soonest game that I know of is Metro Exodus, which will release in February 2019.

I wonder how much it'll be used there, and how significant it will be. I recall the first DX10 game, which only had that snow guy's collar rendered with a new effect; I've been wary ever since. It'll be a good number of years before we see something substantial, I think.
 

As far as I am aware it is used for a form of Global Illumination, though traditional methods of GI will be available for other graphics hardware and consoles.
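For anyone curious what ray traced GI boils down to: at each shading point you fire a handful of rays over the hemisphere and average the light they bring back, instead of relying on baked or screen-space approximations. Below is a bare-bones sketch of one-bounce diffuse gathering; the traceRadiance function is a made-up stand-in for the engine's actual ray dispatch (we haven't seen 4A's code), so take it purely as illustration:

```cpp
// Bare-bones sketch of one-bounce ray traced diffuse GI: sample directions
// over the hemisphere around the surface normal, ask the scene what radiance
// arrives from each, and average. traceRadiance is a made-up placeholder for
// the engine's real ray dispatch (in DXR terms, TraceRay plus hit shading).
#include <cmath>
#include <random>

struct Vec3
{
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

static Vec3 cross(const Vec3& a, const Vec3& b)
{
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

static Vec3 normalise(const Vec3& v)
{
    const float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Dummy stand-in scene so the sketch is self-contained: a uniform grey sky.
// A real engine would trace against its acceleration structure here.
static Vec3 traceRadiance(const Vec3& /*origin*/, const Vec3& /*direction*/)
{
    return {0.5f, 0.6f, 0.7f};
}

// Cosine-weighted hemisphere gathering at one shading point. With a Lambertian
// surface the cosine term and the 1/pi of the BRDF cancel against the sample
// pdf, so the estimate is simply the average returned radiance (times albedo,
// which a real shader would fold in).
Vec3 indirectDiffuse(const Vec3& position, const Vec3& normal, int sampleCount)
{
    std::mt19937 rng(12345);
    std::uniform_real_distribution<float> uniform(0.0f, 1.0f);

    // Orthonormal basis around the normal so hemisphere samples can be placed.
    const Vec3 helper = std::fabs(normal.x) > 0.9f ? Vec3{0, 1, 0} : Vec3{1, 0, 0};
    const Vec3 tangent = normalise(cross(helper, normal));
    const Vec3 bitangent = cross(normal, tangent);

    Vec3 sum{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < sampleCount; ++i)
    {
        // Cosine-weighted direction on the hemisphere (Malley's method).
        const float r1 = uniform(rng);
        const float r2 = uniform(rng);
        const float phi = 2.0f * 3.14159265f * r1;
        const float sinTheta = std::sqrt(r2);
        const float cosTheta = std::sqrt(1.0f - r2);

        const Vec3 dir = tangent * (std::cos(phi) * sinTheta)
                       + bitangent * (std::sin(phi) * sinTheta)
                       + normal * cosTheta;

        sum = sum + traceRadiance(position, dir);
    }
    return sum * (1.0f / static_cast<float>(sampleCount));
}
```

The expensive part is the sheer number of those trace calls per frame, which is exactly what the dedicated RT hardware is there to accelerate.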

4A Games loves to make games that are GPU cookers on PC, so it makes sense for them to adopt this. Their engine team are nuts.
 

But will we be able to clearly tell the difference, and does it warrant its usage and a new card purchase...?

Yeah, I remember the first Metro... mind-boggling.
 