Actually, NVIDIA is kind of shafting us here. They've shifted the naming scheme down a tier: the 2070 sits on the 106-class die that previously went into the x60 cards (the x70 used to get the 104), so it's really the successor to the 1060, yet the prices haven't moved down to match. Though to be fair, that's not entirely new with this generation. On top of that, the cards only ship with 14 Gb/s GDDR6 instead of the 16 Gb/s that all three memory makers already offer.
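For a rough sense of what 14 vs 16 Gb/s is actually worth, peak memory bandwidth is just the per-pin rate times the bus width. A minimal sketch, assuming a 256-bit bus like the 2070's (the exact bus width varies by card, so treat the numbers as illustrative):

```python
# Rough sketch of the 14 vs 16 Gb/s GDDR6 difference in peak bandwidth.
# Assumes a 256-bit memory bus (as on the RTX 2070); other cards differ.

def peak_bandwidth_gb_s(pin_rate_gbit_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s from per-pin rate (Gb/s) and bus width (bits)."""
    return pin_rate_gbit_s * bus_width_bits / 8

bus_width = 256  # bits, assumed
for rate in (14, 16):
    bw = peak_bandwidth_gb_s(rate, bus_width)
    print(f"{rate} Gb/s GDDR6 on a {bus_width}-bit bus -> {bw:.0f} GB/s")
# 14 Gb/s -> 448 GB/s, 16 Gb/s -> 512 GB/s, i.e. roughly 14% more bandwidth
```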
And it's not like we don't need more powerful GPUs. Unless you're still on 1080p, you're limited, severely. We're well past the 1080p era and 4K is here. Running games at 4K is genuinely difficult if you're aiming for high framerates; even if you find a high-refresh-rate 4K monitor, you can forget about actually driving it. A few AAA games even struggle to hold 60 FPS at 4K. And even if you drop down to 1440p there's still a lot to be desired: there are plenty of games you simply can't run at framerates high enough to make use of a high-refresh-rate monitor.
To me it's just a naming scheme. What definitive proof is there that by shifting the RTX 2070 to a 106-class die (or whatever it's called), Nvidia is inherently pocketing money while reducing performance? That's the impression I get from a lot of people: that they feel Nvidia is forever finding new ways to screw consumers over. I feel the same way about the argument that the GTX XX80 (or previously X80) still being a flagship card but with the die size of a midrange card is inherently anti-consumer, that buyers are being shafted with high prices for a small chip. To me, that's highly speculative yet damning. Yes, if you view things pessimistically and theoretically, it's possible Nvidia has pocketed massive savings while consumers suffered. Yet here we are, with almost all the naysayers happily gaming on their Pascal GPUs. And no one can actually prove that Nvidia has pocketed huge extra margins, because we don't know the costs of researching, developing, manufacturing, advertising, and shipping Pascal or anything else Nvidia does. All we have is a few numbers that we feel should match up to our expectations, and a whole lot of bitterness.
Is 16 Gb/s memory even available en masse at an affordable price, though?
I disagree that 4K is here. On consoles it's emerging, and in PC gaming it's ticking along at the same pace it has for years, but it's still relatively elusive. Many people game at 4K, but only at 60 FPS, and that's a very manageable target to hit consistently with a 1080 Ti. If you insist on every setting at the highest it will go in every game (which, incidentally, I don't believe many developers expect of you), then yes, a 1080 Ti won't cut it. But in my opinion we have enough horsepower for current games at everything except 4K/120Hz, which is as elusive today as 4K/60Hz was five years ago. I'd also argue that 4K is growing faster on consoles than on PC.
1440p/144Hz is very achievable with a 1080 Ti, or even a Vega 64 or GTX 1080: turn down a few of the more superfluous settings and you're there. I recognise that not all gamers are willing to do that, but I won't mistake that for the hardware 'not being powerful enough'. Because what exactly is 'not powerful enough'? Millions of gamers are quite happy to play some games at 960p/30 FPS on console, and they experience those stories with huge smiles on their faces. I've regularly and contentedly reduced settings to hit the sweet spot of 100 FPS or more with my GTX 1080 at 1440p. In a fast-paced game, the fluidity of 100 FPS is worth more than higher-resolution dust particles or the highest anti-aliasing. I understand that PC gamers are a particular bunch who like things a certain way, and I respect that, but there comes a point when it starts to look like a spoilt teenager saying he won't play football for his team unless all the boots match.