It never ceases to amaze me how many people come up with these theories whenever companies push gaming to a new level. Face it, the dev cycle has moved on, and it is what it is. No one complained when the original Crysis made their GPU take a poop and fall on its face.
You can't have it both ways. Either games get to look better, or you stay on the older games and keep the performance. The fact is, Nvidia and AMD have not been pushing hard at all on raster performance. I repeat: neither of them is stupid. The days of buying, say, a 1080 Ti and having it last you years are over. Why do you think Nvidia wanted ray tracing so badly? Because they knew it would be over a decade before any of their GPUs could run it properly. It was reinventing the wheel, setting us back years and years to create more sales for them.
These games are not poorly optimised. They are made using the tools devs have, and nothing more. Take the recent "The Last Of Us" debacle, for example. They lovingly recreate a game, make it look absolutely incredible, and then the little piss-ants all whine that their 3060 doesn't have enough VRAM and they can't run it at max settings. So what do the devs do to "optimise and fix" it? They compress all of the textures, which makes it look awful, and nowhere near as good as it looked before.
Don't blame devs for pushing the envelope with games. Blame the real culprits. Nvidia could have made the 4090 WAY bigger and far more powerful. They just didn't want you to have it. Same with everything lower down the stack, too: every card is underpowered for the technology it's built on. All of them should have been bigger and badder, with more VRAM and much bigger muscles. Yet Nvidia is just too greedy and stingy to give you any of that.