AMD showcases impressive Multi-GPU scaling in Deus Ex: Mankind Divided

Nvidia really need to sort out their DX12 performance before people start going red, as these numbers are impressive. Vega could be the Pascal killer
 
I have to say, this is very impressive. Good job, folks at AMD. nVidia really needs to step up their game.
 
Oh please let this somehow translate into Nvidia having competition again (eventually). Pretty please!
With two RX 480s costing £500 I doubt it, and games using DX12 are thin on the ground. The only thing NVIDIA need to do is optimize their drivers with proper DX12 coverage and work on their scaling again; that will be enough to put AMD firmly behind again.
 
With two RX 480s costing £500 I doubt it, and games using DX12 are thin on the ground. The only thing NVIDIA need to do is optimize their drivers with proper DX12 coverage and work on their scaling again; that will be enough to put AMD firmly behind again.

But DX12 performance and multi-GPU scaling have been an issue for Nvidia since DX12's release, and they have yet to release a driver that fixes it, so I don't think it's that easy. I reckon it's a hardware limitation
 
First of all, AMD have partnered with Microsoft, so this should be a "no brainer" to you boffins :D
 
But DX12 performance and multi-GPU scaling have been an issue for Nvidia since DX12's release, and they have yet to release a driver that fixes it, so I don't think it's that easy. I reckon it's a hardware limitation

It is a hardware limitation. Nvidia has hardware workarounds to help negate the performance loss, but in reality they only let Nvidia maintain performance parity with DX11 and, in a few (very few) instances, gain a little performance. AMD, and I believe a Hitman dev sometime last year, both said it cannot be fixed by a driver (that is, asynchronous compute cannot be emulated), so Nvidia's lower compute power is going to bring it down.
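For anyone unfamiliar with the term, the core idea of asynchronous compute is letting graphics and compute work run concurrently instead of back-to-back, which is why a driver can't simply fake it if the hardware serializes the queues. A very rough CPU-side analogy in Python (not the actual D3D12 API; the work durations are made up):

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Rough CPU-side analogy only -- real async compute happens on GPU hardware
# queues via APIs like D3D12, not Python threads. Durations are invented.

def graphics_work():
    time.sleep(0.05)  # stand-in for a rasterization pass
    return "shadows"

def compute_work():
    time.sleep(0.05)  # stand-in for a compute pass (e.g. lighting)
    return "lighting"

# Serialized: compute waits for graphics to finish (no async compute).
t0 = time.perf_counter()
results_serial = [graphics_work(), compute_work()]
serial_s = time.perf_counter() - t0

# Overlapped: both kinds of work in flight at once (the async-compute idea).
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(graphics_work), pool.submit(compute_work)]
    results_overlap = [f.result() for f in futures]
overlap_s = time.perf_counter() - t0

print(f"serialized: {serial_s:.3f}s, overlapped: {overlap_s:.3f}s")
```

The overlapped version finishes in roughly half the time here, but only because the two jobs genuinely run concurrently; if the "hardware" forced them to run one after the other, no amount of clever scheduling above would recover that time, which is the point being made about drivers.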

But Nvidia is still capable; they just need more time to get it implemented. Based on what they did with Pascal and Maxwell, though, it seems like they are only making workarounds instead of tackling the problem. And the "$2 billion R&D budget" they constantly bring up for Pascal should say a lot about how they will address it in the future: that much money and barely any advancements, on top of the fact that Pascal is not much different from Maxwell.

Honestly, the most impressive thing about this scaling is that two "budget" GPUs like the 480 are pretty cheap, and this performance gain is incredible value. Just imagine what Vega could do; two Furys (or whatever it's called) would easily be the fastest combo around. Just remember it's only one game, but if this becomes commonplace soon, it's a huge benefit for AMD.
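For reference, "scaling" in articles like this is usually quoted as the dual-card frame rate divided by the single-card frame rate, expressed against a perfect 2x speedup. A quick sketch (the FPS numbers below are hypothetical placeholders, not the article's benchmarks):

```python
def scaling_efficiency(single_fps, dual_fps):
    """Fraction of a perfect 2x speedup actually achieved."""
    return (dual_fps / single_fps) / 2.0

# Hypothetical numbers for illustration only:
single, dual = 44.0, 82.0
print(f"speedup: {dual / single:.2f}x, "
      f"efficiency: {scaling_efficiency(single, dual):.0%}")
```

Anything above roughly 90% efficiency is considered excellent for multi-GPU, which is why near-ideal numbers in a shipping game draw attention.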
 
Honestly, the most impressive thing about this scaling is that two "budget" GPUs like the 480 are pretty cheap, and this performance gain is incredible value. Just imagine what Vega could do; two Furys (or whatever it's called) would easily be the fastest combo around. Just remember it's only one game, but if this becomes commonplace soon, it's a huge benefit for AMD.
Yes and no. Scaling is great and all, but not all devs use engines that support it, and implementing dual-card scaling for every game is counterproductive and very much an old PC niche (cough, consoles). NVIDIA know this, and that's why we have the whole SLI thing up in the air. As gamers we know that a powerful single card is all you need, and AMD are yet to bring something to the table; the graphs above are a joke when it comes to this. £500 for two RX 480s! Sod that, I'll take a GTX 1070 and be happy knowing it can do it on its own.

Come on AMD, show us some of that power you keep teasing.
 
Just imagine what Vega could do; two Furys (or whatever it's called) would easily be the fastest combo around. Just remember it's only one game, but if this becomes commonplace soon, it's a huge benefit for AMD.
That's kinda what I was referring to. I know this doesn't help AMD much *now*, but it might be an indicator that they could return to competition in a hurry by doing stuff like this with a next-gen GPU.
 
Yes and no. Scaling is great and all, but not all devs use engines that support it, and implementing dual-card scaling for every game is counterproductive and very much an old PC niche (cough, consoles). NVIDIA know this, and that's why we have the whole SLI thing up in the air. As gamers we know that a powerful single card is all you need, and AMD are yet to bring something to the table; the graphs above are a joke when it comes to this. £500 for two RX 480s! Sod that, I'll take a GTX 1070 and be happy knowing it can do it on its own.

Come on AMD, show us some of that power you keep teasing.

I know this, which is why I said it's just one game, and we will have to wait for it to become commonplace.
Multi-GPU is still relevant. Most people can't afford one big purchase all at once, so by getting a cheaper card now and adding another later, when prices fall or second-hand units become more available, you get a huge upgrade. That's the benefit of it. And as I sort of implied earlier, higher-end cards make more sense in multi-GPU: already at the top of the food chain, adding another one makes it that much better. But that's enthusiast level, which defeats the purpose of the RX 480 comparison. It's still a great card by itself for up to 1440p, so on its own it's perfect. Later down the line, two is just that much better, and a cheap way to do it.

Rumour is AMD will announce Vega 10 around the end of the year, with the launch coming later, after Zen. Really, I just want them to hurry up. Everyone is dying for Vega to come along. We need it badly.
 
First of all, AMD have partnered with Microsoft, so this should be a "no brainer" to you boffins :D

Actually, it really doesn't matter now. nVidia's poor DX12 performance is simply down to them being lazy and milking what they have for as much as they can.

To the majority of gamers, DX12 isn't going to be the biggest concern until next year at the earliest, given that most games run just fine on DX11.
 
Nvidia really need to sort out their DX12 performance before people start going red, as these numbers are impressive. Vega could be the Pascal killer

This doesn't really mean anything in the greater scheme of things if it doesn't become the norm with all new releases, and that's not gonna happen.

With two RX 480s costing £500 I doubt it, and games using DX12 are thin on the ground. The only thing NVIDIA need to do is optimize their drivers with proper DX12 coverage and work on their scaling again; that will be enough to put AMD firmly behind again.

AMD haven't really attempted a comeback yet; they haven't tried to compete overall. Their current 14nm cards are targeted at the low and mid levels, because they haven't got anything available to respond to the high-end Pascal cards. They haven't even got a response to the -70 level card, and that's not really a high-end card. Once we get all the Vega releases on the table and find out how they do, and how far away an Nvidia response is, then we can look at whether AMD are back in the game. As it stands right now, Nvidia are miles ahead, to the detriment of us all.
 
This doesn't really mean anything in the greater scheme of things if it doesn't become the norm with all new releases, and that's not gonna happen.

Sooo you're saying DX12 is never going to become the norm? Just as DX11 isn't the norm? Or just as DX10 wasn't the norm before DX11? Why did they develop it then? For fun and giggles? :rolleyes:
 
Sooo you're saying DX12 is never going to become the norm? Just as DX11 isn't the norm? Or just as DX10 wasn't the norm before DX11? Why did they develop it then? For fun and giggles? :rolleyes:
It has always been a slow transition between APIs. What it all boils down to now is game developers implementing DX12; we know the hardware supports it, but as was discussed at GDC, it is now in the hands of the devs, not the drivers.

As it stands, AMD are getting their feet firmly in the doors of most devs, and with the consoles sporting AMD hardware there is a reason for all the "DX12 = AMD wins" chatter, which, to be honest, is what NVIDIA need to be doing too.
 
SLI and Crossfire systems are more common than is alluded to. It's not just watercooling enthusiasts who are running 1080 SLI configurations. I've seen so many 980 Ti SLI rigs, and that was not an ideal setup considering the high temperatures the 980 Ti could produce and the sad state of multi-GPU support. It was also a very expensive investment.

I agree with NBD when he says that people might buy an RX 480 now and then add another one six months later for a pretty beastly setup that's more than capable of 1440p 144Hz gaming with some tweaking. If games supported it as well as Deus Ex then it would be a stellar way to go, especially with the efficiency gains AMD has managed to milk from Polaris. Temperatures and power consumption should be very manageable in Crossfire with the new advancements in production.
 
It has always been a slow transition between APIs. What it all boils down to now is game developers implementing DX12; we know the hardware supports it, but as was discussed at GDC, it is now in the hands of the devs, not the drivers.

As it stands, AMD are getting their feet firmly in the doors of most devs, and with the consoles sporting AMD hardware there is a reason for all the "DX12 = AMD wins" chatter, which, to be honest, is what NVIDIA need to be doing too.

DX12 and consoles don't translate to PC very well; Sony can't use DX12, though Xbox can.
Most DX12 titles are PC-only anyway, and in those titles AMD is the clear winner. Their architecture is just better suited for it: it is naturally a more compute/parallel-oriented architecture, and has been since GCN was introduced back in 2011. For example, even way back then, a 7970 was able to absolutely destroy the original Titan in any OpenCL/parallel-based software.
Nvidia can improve on this, but even with their big budget, they keep failing to do so.
 
DX12 and consoles don't translate to PC very well; Sony can't use DX12, though Xbox can.
Most DX12 titles are PC-only anyway, and in those titles AMD is the clear winner. Their architecture is just better suited for it: it is naturally a more compute/parallel-oriented architecture, and has been since GCN was introduced back in 2011. For example, even way back then, a 7970 was able to absolutely destroy the original Titan in any OpenCL/parallel-based software.
Nvidia can improve on this, but even with their big budget, they keep failing to do so.

They haven't needed to, and they still don't. When Vega launches in a few months' time, all nVidia has to do is release the 1080 Ti, reduce the price of their current GPUs, and in late 2017 release their next big architecture. Isn't it said that Volta will be using a wholly new architecture better suited for DX12? WCCFTech has rumoured it will be released in 2017.

In my eyes, nVidia is ahead of the curve and has made smarter choices. The RX 480 is slightly ahead of the 1060 in DX12 titles for less money, but the majority of gamers are still playing DX11 games. Future-proofing as a concept has been proven a folly before; sometimes it makes sense, sometimes it doesn't. Those who bought 1070s will no doubt buy an 1170 when it is released. If that is indeed 2017, AMD will be left with only a small window of success, unless of course Vega is ridiculously powerful (or Navi replaces it in early 2018). If AMD can release a GPU as powerful as dual 1080s at a digestible price, they'll remain competitive. Otherwise nVidia will be back on top within six months to a year.

That doesn't bother me personally. As long as AMD release excellent GPUs, I'm happy. Polaris was initially a slight disappointment; now I like the range. Fury was a disappointment at first, then it grew on me, just like Polaris, then it fell off completely as soon as the 1080 was released. It was a decent GPU for about a year, then it was outclassed in every way. Vega needs to be a very good GPU. Even if it's outclassed within six months, a good GPU is a good GPU. The Fury line was decent but nothing special; the 290X was better, in my opinion.
 