Yeah, I don't wanna be accusatory, but I wonder how many of the people saying how terrible it is have actually experienced it first-hand for a reasonable amount of time and come to their own conclusion. That said, more and more people who have experienced it first-hand are saying the same thing, so it likely really is bad. I just can't say for sure until I see it myself, and the likelihood is I won't.
I can't see DLSS taking off or going much further. Nvidia will improve the AI, reduce the blurriness or cut the overhead, but unless consoles adopt it (or some key aspect of it), or it becomes a genuine, all-round alternative to traditional anti-aliasing techniques, it doesn't strike me as a technology that will become mainstream or an integral part of a game and its engine. To me it looks more like a way to justify Turing's higher prices (by adding already-developed Tensor cores that are highly profitable in automated fields) and to reduce RTX overhead in defence of ray tracing's incredible demands and Turing's immaturity as an RT architecture.
It's new tech. As with everything, it will improve as it matures, and at least Nvidia have admitted it's not as good as it should be. But I don't see it going the way of the dodo. More like G-Sync: still available, but with the hype entirely gone.