Nvidia promises a brighter future for DLSS - New techniques will deliver more detail

The issue Nvidia have is that the advantage of deep learning is that the expensive part of the algorithm is computed elsewhere, offline and NOT in real time, ready to be dropped into much lighter hardware.

If DLSS was designed correctly, it should be the case that patches significantly improve performance on the same GPU; but if it's hardware constrained, then Nvidia have merely given the customer another sales reason as to why they need to upgrade their GPU every ~12 months.
 
These GPUs are those much lighter pieces of hardware; the models are trained on the DGX SaturnV supercomputer.
 
Would be cool if Nvidia released a behind-the-scenes video showing us how the deep learning is done on their supercomputers and then how it's ported over to consumer-grade hardware.
 

They wouldn't do that. Could give away too many industry secrets. Nvidia would be wise to not seed any of its methodologies to the competition.
 

Well I don't mean in-depth stuff, but a cool little walkthrough like -

"This is where the bulk of our deep learning is done... here is where it gets ported to consumer hardware"

Stuff like that. Obviously not showing us code and techniques, just a small tour of the facility :)
 
Dr Pound has some great videos on neural networks, particularly in relation to image manipulation, if you want a light overview of the tech in use here. Essentially, the bit on the supercomputer is called a training algorithm, and the bit on your GPU is an inference algorithm.

Training is the process of adjusting the weights to get your desired output value from a typical input value (I've thrown an image at the bottom; the weights are the "strength" of the links between the circles, and the algorithm adjusting the network here will be error back-propagation). Inference is just the process of putting an input into your adjusted network of weights to see what the output is. There's no algorithm at all per se; it's just a load of adding and multiplying weights in a simple manner across the network. The mix of weights (a number that dictates the output of a single path in the network) and how they link to each other dictates the functionality.

This is roughly how neurons work in your brain: fundamentally quite an abstract and analogue process, so still more the domain of electrical engineering than computer science at the moment. It's not too popular amongst traditional programmers, as the maths is closer to semiconductor physics, so a lack of coding experience is no real barrier:

Deep Learned Super-Sampling (DLSS): https://www.youtube.com/watch?v=_DPRt3AcUEY
Inside a Neural Network: https://www.youtube.com/watch?v=BFdMrDOx_CM
How Blurs & Filters Work: https://www.youtube.com/watch?v=C_zFhWdM4ic
And separately
Inside SATURNV – Insights from NVIDIA’s Deep Learning Supercomputer https://www.youtube.com/watch?v=7Apoj2o7lXA

[Image: neural network diagram - IJAREEIE-1293-g001.gif]
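To make the training/inference split concrete, here's a deliberately tiny sketch in Python. Everything here is made up for illustration (a one-weight "network", invented example data and learning rate; nothing like the actual DLSS model), but it shows the two halves: training adjusts the weights by gradient descent (a simple relative of the back-propagation mentioned above), and inference is then just arithmetic with the finished weights.

```python
# Toy "network" with a single weight: output = w * x.
# We want it to learn w = 2 from some example input/output pairs.
pairs = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0                      # start with an untrained weight
lr = 0.05                    # learning rate
for _ in range(200):         # the expensive part (supercomputer side)
    for x, target in pairs:
        error = w * x - target
        w -= lr * error * x  # nudge the weight to reduce the error

def infer(x):
    # The cheap part (GPU side): no learning, just multiply and add.
    return w * x

print(round(w, 3))   # trained weight, ~2.0
print(infer(10.0))   # ~20.0
```

The point is that the `for` loop is the only costly bit, and it runs once, offline; `infer` is trivially cheap and is all that needs to ship to the lighter hardware.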
 

A lot of information can be gleaned from even the simplest of glances.

I remember Microsoft showing a render of their Xbox One X SoC, and from that we knew the approximate die size, memory configuration and other notable details. There is a reason why Xbox hasn't given us a similar glance this time.

When Nvidia is in the lead, they have little reason to risk extra information getting out. I remember certain architectural enhancements of Maxwell not being spoken about by Nvidia until years afterwards, just so that their competition didn't get a hint at their secret sauce.
 