I don't agree with these arguments.
When GTA V came out, the main big boy of the time was the 980Ti. It was marketed, and used very happily, as a 1440p-capable GPU. 90+% of games ran very well at 1440p on a 980Ti, including GTA V. But in the settings menu of GTA V there was a sub-menu, and if you cranked those settings and drove to a grassy section of the map (and there were many of them), performance tanked to 30 FPS or lower. Does that mean the 980Ti was incapable of 1440p gaming? The same applies to AC Unity, Arkham Knight, and many more games.
That's one example of a card being perfectly capable of running 1440p admirably, yet struggling in certain extreme, and very rare, circumstances. We've seen this for years and years and yet no one claimed the 980Ti was "unacceptable" at 1440p, because within reason it was absolutely fine. That was 'the' rig to have at the time. It was the ASUS 1440p/144Hz monitor with an aftermarket 980Ti. Groundbreaking gaming for many people. People had an absolute blast.
At the price of the Ti and other Tis I expect to be able to crank all of the settings to the max, yes. I don't think that is unrealistic. I had a Titan XM and everything I played could be cranked to 1440p max settings and it handled it fine. Then again, you know me, and I am quite happy with a 30 FPS minimum so long as I am using adaptive sync (FreeSync or G-Sync), i.e. without the choppiness.
4k, to me at least, did not become a reality until the Titan XP and 1080Ti came out, but once again it was a fast-food snack: within six months new games came along and there went your cranked 4k settings.
Like I say, on a £1300 GPU I don't see the point.
The 3070 is potentially a 4k capable card at £400+ for 90+% of games. That's groundbreaking gaming for many people. People could have a blast.
Do I think everyone should buy a 3070 for 4k gaming? No.
Where or when have I given you the impression that I have an unrealistically enlarged view of Ampere? The prices surprised me in a good way; that's about it. I do see many people being blinded by the prices, but I also see a lot of scepticism and negativity. Benchmarks are leaking out showing the 3080 around 50% faster than a 2060 Super. The GTX 1080 at launch was 52% faster than a GTX 970, the Maxwell equivalent of Turing's 2060 Super.
I don't believe the 3070 to be an ideal 4k card for everyone out there. I believe it's 4k capable for a certain demographic. That's exciting to me, because I'm not a catastrophiser and I don't want to exaggerate. Neither do I want to dictate that everyone should have the absolute best or go home crying into their GTX 760 pillows. I want gamers with £450 to play certain games at 4k, and it looks like both Nvidia and AMD are going to be offering that. I'd recommend AMD over Nvidia, but I'm not blind to a very large reality: the 3070 might be a 4k-capable card for reasonable money... with a caveat. Nvidia might try to pull the wool over people's eyes, but we are each responsible for ourselves. All it takes is five minutes of Googling to see that a handful of games (big games) draw more than 8GB of VRAM. If you play those games or want to keep the 3070 for five years, don't buy it; it's not for you. But if you're in a different category, the 3070 might be perfectly fine.
Right, I will try to cover as much of that as possible.
Firstly, with regards to the 3070: we are now entering a "4k or GTFO" generation, IMO. If the consoles can do it, then you should expect a GPU, *any* GPU, that costs more than said console to do the same at the same settings. Only we know that flipped on its head with the advent of the XB1X: all of a sudden a PC costing twice as much was only doing the same thing as the console. THAT is why I was so very impressed with the XB1X, and why I bought two. Because I had been chasing that dragon on PC for years and spunked many thousands of pounds down the bog.
However, as I said, gamers are going to expect it now. And if a card does not fully meet the requirements and costs more than the console? Like I said, bad card. I don't care what people do with their money; that's their lookout, fella. I just don't like BS, and this whole 30 series release has been the back end of a male cow. As more days pass and more comes to light, the more we find out how "not anywhere near as good as it sounds" Ampere is. It's basically the hog it was predicted to be.
I didn't assume that you have an enlarged view of Ampere. After the announcement and the horror show since (sub-£500 2080Tis, anyone?), I just wanted to remind people that Jen has form for talking crap. That's all. He uses the typical "up to" BS.
As for me being a "catastrophiser"? No, I am a realist.
Not only am I a realist, but I have an awful, awful lot of experience of running 4k. That, given my total disbelief about how the 3070 is going to perform (on the same type of VRAM as the 2080Ti, and 3GB less of it!), is why I am telling you to be careful.
Today it has come to light that the 2080Ti is 24% slower in real-world situations than the 3080. That means the 3080 is roughly 32% faster while consuming nearly 30% more power.
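(Quick sanity check on that conversion, taking the 24% figure at face value: if the 2080Ti delivers 0.76x the 3080's frame rate, then the 3080 delivers 1 / 0.76 ≈ 1.32x the 2080Ti's, i.e. roughly 32% faster.)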
If you really thought the 3070 would match that? IDK. I know you are not that silly. However, even if it does match it, it doesn't have enough VRAM.
The "best" RT game so far apparently is Control. Imagine buying one, wanting to try out this supposed best RT game and then finding yourself running out of VRAM?
And the price sounds too good to be true because it almost certainly is. So it's "cheaper" than Turing, right? Yeah, it is. However, that doesn't make it cheap! It still costs more than the XBSeX, which apparently will run every single game coded for it at 4k.
And then of course AMD have not had their go yet, whilst everyone has discounted them. Like I said to Dice yesterday, if they have a blinder and Nvidia, as we know, have had a crap day at the office (320W TDP GPUs, ORLY?), then that could change fast.