If you think you are going to get RTX 2070 performance (currently about $480) for $330, you're a dumbass. Literally.
There are only so many times the GPU market can keep knocking out the same sort of performance at the same price. Or rather, the same performance for more money. Eventually everything goes stale and the whole thing collapses. It happened in the early 1980s and it will likely happen again: the early home computers came along, stopped getting any better, people got sick of the same systems, and the market just folded.
I also think Vega is a lot better than most people give it credit for.
Saw somewhere in this thread that the 2060 is faster than the Vega 56. That depends on your testing methodology. This was a problem with the FX CPUs too: if you benchmarked them with highly threaded apps, doing what they were designed to do, you basically had an i7 950's IPC without HT, but with eight cores. If you could actually use those cores they were good CPUs, and they had instructions that cheaper Intels lacked, i.e. the stuff you needed to run VMs and so on.
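To make the methodology point concrete, here's a toy sketch. The scores and chip names are made-up placeholders, not real benchmark data; the point is that the same two chips swap places depending on whether your test suite leans single-threaded or multi-threaded:

```python
# Hypothetical scores, chosen only to illustrate the methodology effect.
# "st" = single-threaded score, "mt" = multi-threaded score.
chips = {
    "FX-8350":  {"st": 100, "mt": 640},  # weaker per core, eight cores
    "i5-3570K": {"st": 150, "mt": 480},  # stronger per core, four cores
}

def rank(weights):
    """Weighted average of scores. The weights ARE the methodology."""
    scored = {
        name: sum(scores[k] * w for k, w in weights.items())
        for name, scores in chips.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

# A suite of mostly lightly threaded games: the i5 comes out on top.
print(rank({"st": 0.8, "mt": 0.2}))
# A suite of heavily threaded apps (VMs, encoding): the FX comes out on top.
print(rank({"st": 0.2, "mt": 0.8}))
```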
In good, "proper" DX12 games* the 56 is quite a chunk faster than the 2060. In fact, it nips at the heels of the 2070 more often than not. Most people put this down to the games being optimised for AMD, but I've noticed a large hole in that theory: roughly half of these supposedly AMD-favouring games are actually Nvidia-sponsored titles (BF5, SOTTR and others).
I've also noticed that all of the games that "favour" AMD are also genuinely good in DX12, i.e. you get more performance running them in DX12 than in DX11. And to top it off, the games that are better on Nvidia actually run worse in DX12 than in DX11.
What does that mean to me? That those games were not coded specifically for DX12: it was either patched in, or the engine was not designed solely with DX12 in mind. And the games that do perform well on AMD? Well, they're all from large software houses with shorter dev cycles than others.
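That "does it gain or lose under DX12" test is easy to express as a quick check. Here's a minimal sketch with hypothetical FPS numbers (the game names and figures are placeholders, not measurements):

```python
# Toy version of the heuristic above: compare a game's DX11 and DX12 FPS
# and guess whether DX12 was built in or bolted on. All numbers invented.
games = {
    # name: (dx11_fps, dx12_fps)
    "GameA": (70, 84),  # gains under DX12
    "GameB": (90, 82),  # loses under DX12
}

for name, (dx11, dx12) in games.items():
    uplift = (dx12 - dx11) / dx11 * 100
    verdict = "engine built around DX12" if uplift > 0 else "DX12 likely patched in"
    print(f"{name}: {uplift:+.1f}% in DX12 -> {verdict}")
```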
*As explained above, I don't think there's any such sorcery as "optimised well for Vega". I think Vega is simply better at DX12 than Nvidia's older Pascal and lower-end Turing cards, because after all, Vega is a DX12 architecture that was designed as such ages ago. That's also why it runs hot and uses more power: it's got the balls for DX12.
Which, if you look back at the FX series, makes sense, because that was very forward-thinking tech as well. Sadly AMD like to forward-think too much, and it hasn't worked out well for them yet. I mean crap, even Nvidia said the 7970 was AMD's Fermi, and they've stuck to their guns ever since.
This is also where the "fine wine" saying comes from, though personally I don't think it's a case of the card or the drivers improving; I think it's just that at some point AMD's bet on future tech pays off.
If I weren't fussed about RT as it stands, and I was thinking ahead, I would rather have the 56 than the 2060. Quite simply because when we eventually DO live in a world where everything is DX12 and the DX11 dev cycle has ended and been put to bed, it's the better card for DX12.
You need to remember that Turing happened for a reason, and it was not just RT. Basically, Pascal cards were beginning to show their weakness and Nvidia knew that in two gens' time AMD would be leading. The GTX 280 was a massive hot pig, and so was the 480; after Fermi, Nvidia made entirely different GPUs. They cut the core size down, then realised just how high a smaller core could clock on a die shrink.
Since then? They've been spoon-feeding us tiny little dies, because DX12 didn't matter yet. AMD, on the other hand, chose the other path: keep making massive, hot GPUs and hope that DX12 would come along and justify it.