AngryGoldfish
Old N Gold
That doesn't really say much though. They could have limited both systems to 100 FPS.
A rig equipped with an RX 580 and another equipped with a 1080 Ti can both do 100 FPS in Doom no problem at 1440p. The real difference is going to come down to the games with very shiny, taxing effects, and the FPS difference there.
I'm not sure how much it's supposed to say, to be honest. It is what it is. They had planned other games but ran out of time, so I hear. Take from that what you will.
And an RX 580 will not hit 100 FPS at 3440x1440 at max settings. I can't find what settings they used in the test, but it makes sense to use the maximum, for parity's sake. That's why benchmarkers use max settings: so people don't get confused. Anyway, an RX 580 would struggle to hit 60 FPS at 3440x1440; it'll be closer to 50 FPS. An overclocked 1080 Ti will hit around 100 FPS at that resolution. So if folks feel there is no difference between Vega and the 1080 Ti, that suggests to me that whatever performance difference there is, people can't spot it. Again, take what you want from it.
I like the concept AMD are focusing on here. They're getting away from the 'numbers don't lie' principle and are instead focusing on giving gamers smooth gameplay, which is the most important thing. That's partially flawed logic, obviously, because numbers don't lie, but it's contentious in the right way. The principle has some merit. They're basically saying it's about how it feels, not that the 1080 Ti hits 102 FPS while Vega hits 90 FPS.
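For what it's worth, a quick back-of-the-envelope conversion (frame time = 1000 ms / FPS) shows why a 90 vs 102 FPS gap is hard to feel; the 90 and 102 figures here are just the hypothetical numbers from above, not measured results:

```python
# Rough frame-time comparison for the hypothetical FPS figures above.
def frame_time_ms(fps: float) -> float:
    """Convert frames per second to milliseconds spent on each frame."""
    return 1000.0 / fps

vega_fps, ti_fps = 90.0, 102.0  # assumed example values, not benchmark data
print(f"Vega:    {frame_time_ms(vega_fps):.1f} ms/frame")   # ~11.1 ms
print(f"1080 Ti: {frame_time_ms(ti_fps):.1f} ms/frame")     # ~9.8 ms
print(f"Gap:     {frame_time_ms(vega_fps) - frame_time_ms(ti_fps):.1f} ms")  # ~1.3 ms
```

A difference of roughly 1.3 ms per frame is the kind of thing a counter shows clearly but most people can't perceive in blind play.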