I disagree from experience.
The Fury X was plenty fast enough for 4K at the time. I know because I had one (and then two, but let's not go there) and it had no problem dishing out then-new titles like Fallout 4, Dying Light and so on at 4K. It had plenty of horsepower to get you 40 FPS or more, which at the time was about the expectation for a single GPU at 4K. Times were very different.
What I do disagree with are the comments saying it had enough VRAM. It did not, and again I experienced this first hand quite a few times. The main culprit was BLOPS III. It would run out of VRAM, start crawling and then crash. Like, hard lock the rig. Activision "fixed" this by introducing a sniffer for the Fury X which disabled the "Extra" texture detail, which alleviated the issue on the Fury X if you were happy with subpar graphics next to the 980 Ti, which did have enough VRAM to handle it. You could hack the setting back into action quite easily, but the same problem persisted.
AMD did eventually "fix" this of course. By "fix" I mean they copied Nvidia's old trick of streaming the overflow into your paging file. This stopped the crashes but would net you single-digit FPS in those areas, sometimes as low as 4. It was not all areas of the game that suffered, just certain parts of certain levels.
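For the curious, this is roughly the kind of budget check a game or engine does itself to know when it's about to spill over. This is only a rough sketch, assuming Windows and DXGI 1.4, and the usage figures it reports are per-process, so it's something the game polls rather than an outside tool:

```cpp
// Minimal sketch, assuming Windows 10 + DXGI 1.4: poll the GPU's local memory
// budget vs. this process's current usage. Once usage blows past the local
// budget, the OS starts demoting allocations into system memory / the paging
// file, which is where the single-digit FPS comes from.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;  // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // "Local" = dedicated VRAM; "non-local" = shared system memory reached over PCIe.
    DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, shared = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &shared);

    printf("VRAM   budget %llu MB, in use %llu MB\n",
           local.Budget >> 20, local.CurrentUsage >> 20);
    printf("Shared budget %llu MB, in use %llu MB\n",
           shared.Budget >> 20, shared.CurrentUsage >> 20);

    // If VRAM usage sits pinned at the budget while shared usage keeps climbing,
    // textures are being streamed over the bus -- the slideshow scenario above.
    return 0;
}
```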
Now we move on to 8GB, which seems to be the mainstream amount for 1440p cards. It's not enough. This has already been demonstrated conclusively across several games, even at 1440p. The 3070 should stay on par with the 2080 Ti at all times but it does not, and that means the 2080 Ti is still a semi-capable 4K card. How long that will remain true? IDK.
Now, at the suggested prices for these 30 series cards it usually wouldn't be a huge deal. However, when people are selling the 3070 scalped for as much as a 2080 Ti would have cost early last year? Yeah, then we run into issues.
As for reducing settings? Sure, everyone can do that. However, even at the retail prices for these cards I would not want to, especially when I am only doing it because Nvidia were being tight-fisted. They still want top dollar for their cards, but they are *never* going to give you the performance you would expect for the timeframe you would expect. Usually, spending £600+, you would expect to get a whole gen out of your card (so skip the next one), which means four years. Can you honestly say that 8GB will last four years? With the new consoles out and next-gen graphics on the way?
It's very easy for someone to sit there and say it is not a problem when they have never experienced the problem themselves. If cancer does not run in your family you might think you have little reason to worry; that does not mean you won't get it yourself.
And yes, I totally agree that too much VRAM does nothing for anything. It doesn't even make your e-peen look bigger, because it does absolutely nothing for performance. However, when it is just not there and you need it, you are totally screwed.
TBH this should not even be a discussion. At the prices being charged for these GPUs, especially AMD's which lack features, this should never, ever be an issue. At least in their case they have made sure it won't be. Nvidia on the other hand? They are starting to play the game of trying to make you buy *every* single new refresh, which IMO sucks ass.
If, say, the 3070 soon reaches the point where 8GB is not enough and that becomes gospel among gamers? It won't be worth a toss. Which is exactly what Nvidia want.
For many years now their second-hand GPUs have held onto a reasonable resale value, making them not such a terrible proposition. However, go around now and look at what the "useless" cards are worth. GTX 780, 3GB 1060, etc. They ain't worth diddly.
And again, that is exactly what Nvidia want.