Deus Ex: Mankind Divided will not support DirectX 12 at launch

This is getting a bit silly now. Are we ever going to see this much-hyped DX12 come to fruition? I thought Deus Ex was going to be the first thoroughbred DX12 game that fully realises its potential. I found out I was wrong about that last week, when someone here clarified that the game would support both APIs. But the absence of DX12 at launch is just embarrassing for both AMD and Microsoft. AMD banked on a very slow-moving API. Anyone who bought Fiji is still waiting for their card to fully shine. With Microsoft butchering games on Windows, the future doesn't look as bright as the initial leaks surrounding the DX12 specification suggested. By the time DX12 games actually come out and are patched to a point of acceptability and playability, Volta will be out with full async compute support. Everyone has been saying that AMD are playing it smart, but it's becoming more and more apparent to me that nVidia is the one that knows exactly what is going on.

I could be wrong.
 
My concern is the amount of data that new games use; it's insane, and SSD prices are still ludicrous compared with HDDs.
 

DX12 has been around for one year. That doesn't magically mean every game will use it on release. API adoption is slow for many reasons. In this case, DX11 is a very mature, well-known API: many devs are familiar with it and know the tricks to get more out of it, and the drivers are very mature too. Don't forget it takes time to build engines that support this, and only then can you develop the game on top. It's just a long process. The only game that uses it well is AotS, because it was built DX12-first. But it's still an early API, and it hasn't reached its full potential.
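To put that in perspective for anyone who hasn't touched the API, here's a minimal, illustrative C++ sketch of the setup D3D12 demands before a single triangle is drawn. It's not from any shipping engine, and error handling and object lifetimes are trimmed; under DX11 the driver did most of this for you behind a single device/context pair.

```cpp
// A rough idea of the explicit setup D3D12 demands up front, work the
// DX11 driver used to do behind the scenes. Illustrative only.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    // DX11: one D3D11CreateDevice call handed you a device plus an
    // immediate context that managed submission, hazards and memory.
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // DX12: the engine now owns the queues, allocators and command lists.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> directQueue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&directQueue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> cmdList;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&cmdList));
    cmdList->Close();  // lists are created open, ready for recording

    ID3D12CommandList* lists[] = { cmdList.Get() };
    directQueue->ExecuteCommandLists(1, lists);

    // CPU/GPU synchronisation is also the app's job now: fence the queue
    // before reusing the allocator or tearing anything down.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    directQueue->Signal(fence.Get(), 1);
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);
    return 0;
}
```

And that's just the skeleton. Add resource barriers, pipeline state objects and manual memory management on top, and it's easy to see why a proper DX12 renderer isn't something you patch in after the fact.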
 

That's all true, and therein lies the point.

The Fury X was touted as the next generation of hardware for the next generation of APIs. That was a folly, clearly, as Pascal now beats it in every respect, which is exactly what even nVidia fanboys predicted would happen. The Fury line has completely dropped off, even with the introduction of its intended API. The 290X was also better suited to Mantle/DX12, yet it remained relevant for over three years despite heavy competition. I'm not suggesting that nVidia are inherently more future-proof, because the 780 Ti can easily be beaten by a 290 nowadays, due in part to the 290's additional memory and up-to-date driver support. But the Fury X was a disappointment. The limited memory, the ridiculous pricing scheme that was unfortunately necessary, the comparatively inefficient design, the AIO cooler, the scarce aftermarket cards (Gigabyte didn't even advertise their G1 Gaming Fury; it just came out quietly). It all added up to a tepid release that promised so much more.

Buying a Fury X in the hope of seeing it shine under DX12 has ultimately proven a failed idea. If what you say is true, and it obviously is, then banking on a slowly maturing API in a silicon war is a senseless decision. You're buying fresh ingredients to pair with a wine that has yet to mature; by the time the wine has matured, your cheese has grown mould. It makes no sense. That's my opinion. Anyone who still owns a Fury X has a solid card, but that's about the extent of it.
 
Pascal beats it simply because it's faster. Packing 980 SLI performance into one card will beat a 980 Ti or Fury X any day of the week. So it's not that the Fury X is crap; it simply doesn't have the horsepower to be in the same category. On top of that, the Fury X gains hugely under DX12, but it's still up against too much raw power. Single cards are another story: there the Fury X fights off everything from the previous generation. A 1070 would be its closest competitor, but even then the 1070 is simply more powerful; it's basically a Titan X.

Also, to point out: I think you're judging the Fury line too harshly, too quickly. It's still very fast for today's games, and as more games move to DX12 and people hold onto those cards, they'll just stay relevant for longer.

I don't think anyone bought a Fury X purely in the hope of DX12. If they did, that's on them. You never buy a GPU because of an API. I bought a Fury X, yes, but my intent was getting the most I could for my money, and I did. DX12 was the icing on the cake for me, as I think it was for most people.
 
Pascal doesn't fully support DX12 either, so it's hardly better in every respect. If I remember correctly, the Fury also does better in compute? I didn't purchase one because they were never in stock until later...
 

Yes, it does better. GCN is a compute-based architecture, which is why those cards perform better in compute-oriented APIs like DX12/Vulkan/OpenCL, although Nvidia has improved a lot in OpenCL with Pascal. I just remember when the 7970 would absolutely dominate the original Titan in OpenCL; that's the kind of difference it used to be. Not as much anymore, but still very, very strong.
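To give a concrete picture of what "compute-based" buys GCN under the new APIs: DX12 lets an engine feed a dedicated compute queue alongside the graphics queue, and GCN's hardware schedulers (the ACEs) can service both at once. A hedged sketch below: SubmitAsyncCompute is a hypothetical helper, the device and direct queue are assumed to exist as in the earlier snippet, and whether the work actually overlaps is down to the hardware and driver.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: submits compute work on its own queue so the GPU
// may overlap it with graphics. 'device' and 'directQueue' are assumed to
// be the objects created in the earlier snippet; lifetimes trimmed.
void SubmitAsyncCompute(ID3D12Device* device, ID3D12CommandQueue* directQueue)
{
    // A compute-only queue, separate from the direct (graphics) queue.
    // On GCN the ACEs can schedule both streams concurrently.
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    ComPtr<ID3D12CommandAllocator> alloc;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_COMPUTE,
                                   IID_PPV_ARGS(&alloc));
    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_COMPUTE,
                              alloc.Get(), nullptr, IID_PPV_ARGS(&list));
    // ... record Dispatch() calls for the compute workload here ...
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    computeQueue->ExecuteCommandLists(1, lists);

    // If graphics consumes the results, synchronise queue-to-queue with a
    // fence: the GPU waits, the CPU never stalls.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);
    directQueue->Wait(fence.Get(), 1);
}
```

That second queue is precisely the async compute being talked about above, and it's where the Fury's DX12 gains largely come from.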
 

The Fury X isn't a crap card, but it was outclassed.

I don't think I'm judging the Fury line too harshly or too quickly. My Fury is still doing exactly what it was doing when I first bought it; no game I've played thus far has been a challenge for it. But I wouldn't buy a second one, and I wouldn't recommend anyone buy one unless they could find it for £200, which is unlikely.

The point I was making was how relevant the 290/290X remained even after the 970/980 were released two years ago. You could have bought a 295X2 for £500 at one point, and a nice second-hand Sapphire 290X for £180 would have been a great competitor to the 970, or even the 980. Not so with the Fury line. That's the point I'm trying to make: there is virtually no reason to buy a Fury X over a 1070.

Whether anyone bought a Fury for DX12 support or not, it was discussed at great length by a lot of gamers and PC enthusiasts, and it was one of the card's main selling points. I think it's very fair to assume that many bought the Fury X on the premise that DX12 would be its greatest strength and would make its drawbacks worthwhile.
 