Nvidia showcases huge performance gains in The Lord of the Rings: Gollum with DLSS

Watched a video with a bit of gameplay and the graphics are mediocre at best, yet to get anything over 40 FPS at 4K you need a 4090. Granted, that's before DLSS 2 and 3, but graphically it looks pants yet needs a massive amount of horsepower.
 
Yeah, that's an example of a game that clearly just wasn't well optimised by the developer. I hope this sickness of relying on DLSS to get playable framerates on high-end GPUs doesn't spread too much, but certainly some devs are doing this.
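For anyone wondering why DLSS moves the needle so much in the benchmark slides: the game only shades a fraction of the output pixels and the upscaler reconstructs the rest. A minimal sketch, assuming the commonly cited per-axis render-scale factors for DLSS 2's quality modes (approximate published defaults, not numbers specific to Gollum):

```python
# Rough internal render resolutions for DLSS 2 at 4K output.
# Per-axis scale factors are the commonly cited defaults (an assumption here).
OUTPUT_W, OUTPUT_H = 3840, 2160

MODES = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

for name, scale in MODES.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    shaded = (w * h) / (OUTPUT_W * OUTPUT_H)
    print(f"{name:<17} renders {w}x{h} internally ({shaded:.0%} of native pixels)")
```

Performance mode shades only about a quarter of the native pixels per frame, and DLSS 3 frame generation then inserts interpolated frames on top of that, so headline "performance gains" figures say little about how the game runs natively.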
 
I'm beginning to sound like a conspiracy theorist, but I wouldn't be surprised if Nvidia was paying devs and publishers not to optimize their games in order to make DLSS look more appealing.
 
No, devs are paying them. Why do any optimisation at all when you can just get Nvidia to run it through their AI machine? Anyone on a non-Nvidia card can just suffer, which only makes Nvidia look better.

Since physical media died, storage space became your problem. Since GPU makers started putting more VRAM on their cards than some people have system RAM, having the resources to run a game (if you're on PC) is your problem too.

Gaming is no longer being overseen by gamers and artists; it's being run by accountants and lawyers, whose only concern is generating market cap. Therefore, those that can't afford to keep up? Never mind: get a mobile game with a billion in-app purchases and leave the real gaming to those who can afford it.
 
It never ceases to amaze me how many people have those theories when companies push gaming to a new level. Face it, the dev cycle is up, and it is what it is. No one complained when the original Crysis made their GPU take a poop and fall on its face.

You can't have it both ways: either games get to look better, or you stay on the older games and keep the performance. The fact is, Nvidia and AMD have not been pushing very hard at all on raster performance. I repeat: neither of them is stupid. The days of buying, for example, a 1080 Ti and having it last you for years are over. Why do you think Nvidia wanted to do ray tracing so badly? Because they knew it would be over a decade before any of their GPUs could run stuff like that properly. It was reinventing the wheel, setting us back years and years to create more sales for them.

These games are not poorly optimised. They are made using the tools devs have, and nothing more. Take the recent The Last of Us debacle, for example. They lovingly recreate a game and make it look absolutely incredible, then the little p1ss-ants all whine that their 3060 doesn't have enough VRAM and they can't run it at max settings. So what do they do to "optimise and fix" it? They compress all of the textures, which makes it look like anus, nowhere near as good as it looked before.
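Whatever they actually changed under the hood, the arithmetic of texture memory is easy to ballpark: uncompressed RGBA8 costs 4 bytes per texel, block-compressed formats like BC7 cost 1 byte per texel, and a full mip chain adds roughly a third on top. A minimal sketch with hypothetical numbers (the three 4096x4096 maps per material are an assumption for illustration):

```python
# Ballpark VRAM cost of one material's texture set,
# uncompressed RGBA8 vs BC7 block-compressed.

def texture_mib(width, height, bytes_per_texel, mips=True):
    size = width * height * bytes_per_texel
    if mips:
        size *= 4 / 3  # a full mip chain converges to ~1.33x the base level
    return size / 2**20

MAPS_PER_MATERIAL = 3   # hypothetical: albedo + normal + roughness/metallic
SIDE = 4096             # hypothetical 4096x4096 source textures

rgba8 = MAPS_PER_MATERIAL * texture_mib(SIDE, SIDE, 4)  # 4 bytes/texel
bc7   = MAPS_PER_MATERIAL * texture_mib(SIDE, SIDE, 1)  # 1 byte/texel

print(f"Uncompressed RGBA8: {rgba8:6.1f} MiB per material")
print(f"BC7 compressed:     {bc7:6.1f} MiB per material")
```

That built-in 4:1 ratio, plus another 4x from halving texture resolution, is why shrinking textures is the blunt instrument devs reach for when VRAM complaints pile up, and why it is so visible when the assets were authored for more memory.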

Don't blame people for pushing the envelope with games. Blame the real culprits. Nvidia could have made the 4090 WAY bigger and far more powerful; they just didn't want you to have it. Same with everything lower down the stack, too: they are all completely lame for the technology they are on. All of them should have been bigger and badder, with more VRAM and much bigger muscles. Yet Nvidia are just too greedy and squeaky to give you any of them.
 
Have you played it? I had a go round a mate's yesterday, and visually it shouldn't need a top-end GPU to get more than 30 FPS. If it were today's Crysis in the graphics department, then OK, sure, but it's not. It's far from that.
 
I have it. All I will say is: do not buy. It really is badly optimised for average graphics.

It's not even worth cranking up the details, because they aren't remotely spectacular in this day and age.
 