XCOM 2 PC Performance Review - AMD VS Nvidia

WYP

News Guru
What kind of hardware is needed to push back the oppressive Aliens in XCOM 2? Are our Earthly GPUs up for the task?



Read our XCOM 2 GPU Performance review.
 
Performance on a Titan X maxed out at 2160p is much better than on a GTX 980 Ti.

I think the reason why is shown below.

Check out the memory usage.

[Screenshot: VRAM usage at 2160p]
 
These new games really are complete VRAM pigs. I kinda knew this would happen once the consoles had loads of available VRAM, but I never expected it to get this bad.

With BLOPS 3, the newest update removes the extra settings on my PC (Fury X, 4GB). If I hack it and enable the extra settings, it either black-screens on load or, very rarely, does load up. However, within a few seconds it turns into a slideshow.

I was having the same issue with ROTTR with everything set as high as it would go. It was fine for a couple of minutes, then it turned into a slideshow, and on a couple of occasions it actually ground to a halt and took about two minutes before continuing. Now that I have lowered a few settings to High and left the main detail setting on Very High, it flies along nicely.

The problem, of course, is that these new consoles have up to 6GB of VRAM available to them at 1080p. So instead of optimising their textures and so on, the devs are simply making them use up that entire 6GB of texture memory at 1080p.

All of a sudden Titan X sounds far less stupid than it did when it launched.

I don't think it will be long before the 6GB the 980 Ti has becomes a minimum requirement for max settings, and it could even be quite soon before that 6GB is out of date.

So much for AMD and their "It doesn't matter because it's HBM and it doesn't work like GDDR". Yeah, right.

That said, though, I do not blame AMD. As I said above, bloat is what causes these issues. I've got a whole ton of games that look fantastic at 4K and don't use anywhere near the same VRAM as these newest games.
 
I don't think it will be long before the 6GB the 980 Ti has becomes a minimum requirement for max settings, and it could even be quite soon before that 6GB is out of date.

Wait till VR takes off then dude.
 
Wow, this game is incredibly demanding. That's another one added to the roster of surprisingly heavy new titles. I hope this trend slows down a little.
 
Just noticed this in the conclusion.

especially when you consider the performance and visual fidelity of other modern titles like Rise of The Witcher 3

Is this a new crossover game ? :p

On a side note, I just played a bit of XCOM 2. I cranked the settings up to max, minus AA which was off, and was getting around 60FPS at 1440p with an overclocked 980 Ti.

For the graphics this game has, I can't understand why it demands such high-end hardware. It's not exactly mind-blowing, nor, as WYP said, any better than TW3 or Battlefront, which generally have decent performance.

Bad optimization maybe ?

[Screenshot: 1440p benchmark results]



*EDIT*

At 1080p maxed out minus AA, which visually makes very little difference, the performance is actually pretty good, and that's with no overclock on the GPU at all :)

[Screenshot: 1080p benchmark results]
 
You think it's a case of a memory leak on the GPU? I can't think of a different reason a game like this would use 10GB of VRAM.

I don't think it is.

At 1080p with everything on minus AA it sticks at 2.50GB and only fluctuates a little, so 10GB at 4K makes sense :)
 
I don't think it is.

At 1080p with everything on minus AA it sticks at 2.50GB and only fluctuates a little, so 10GB at 4K makes sense :)

The textures are the same; 4K is just adding more pixels. It doesn't scale linearly, where you multiply by 4 and at 4K suddenly need 10GB of VRAM. That's why most other games don't use much more than 4GB even at 4K (one reason the Fury X is still capable of running 4K with only 4GB of HBM). A game like GTA V would use far more than 10GB of VRAM if that were the case :)
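To put some rough numbers on that (these are my own illustrative figures, not anything measured from XCOM 2), the pixel-dependent part of VRAM is tiny compared to texture data, so going from 1080p to 4K can't by itself explain gigabytes of extra usage:

```python
# Back-of-envelope VRAM estimate (illustrative numbers, not measured
# from XCOM 2): render targets scale with pixel count, textures don't.

def render_target_mb(width, height, bytes_per_pixel=4, num_targets=6):
    """Rough size of colour/depth/G-buffer targets at a resolution."""
    return width * height * bytes_per_pixel * num_targets / 1024**2

fb_1080p = render_target_mb(1920, 1080)   # ~47 MB
fb_4k    = render_target_mb(3840, 2160)   # ~190 MB, exactly 4x 1080p

# Even with a generous 6 render targets, the 4x jump in resolution
# only adds about 140 MB, nowhere near the gap between the 2.5 GB
# seen at 1080p and the 10 GB reported at 4K.
print(f"1080p targets: {fb_1080p:.0f} MB, 4K targets: {fb_4k:.0f} MB")
```

So whatever is eating the extra memory at 4K, it is mostly resolution-independent data (textures, caches, pools) rather than the bigger framebuffer itself.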
 
The textures are the same; 4K is just adding more pixels. It doesn't scale linearly, where you multiply by 4 and at 4K suddenly need 10GB of VRAM. That's why most other games don't use much more than 4GB even at 4K (one reason the Fury X is still capable of running 4K with only 4GB of HBM). A game like GTA V would use far more than 10GB of VRAM if that were the case :)

I know that.

But as the game has access to more memory on the TX, it makes sense that at 4 times the resolution it CAN, if it needs to, use 4 times the memory, as it's sitting there ready to use.
 
The textures are the same; 4K is just adding more pixels. It doesn't scale linearly, where you multiply by 4 and at 4K suddenly need 10GB of VRAM. That's why most other games don't use much more than 4GB even at 4K (one reason the Fury X is still capable of running 4K with only 4GB of HBM). A game like GTA V would use far more than 10GB of VRAM if that were the case :)

There are a growing number of games that use a lot more than 4GB.

4GB of HBM is nowhere near enough, but it only becomes obvious if you use CF with 3 or more cards at 2160p.

The other thing to remember is that XCOM 2 is not a console port. We are going to see more and more PC games with these sorts of memory demands, and people need to accept it rather than bury their heads in the sand.

The Titan X may come with 12GB of VRAM, but it is already a year old, so it is not unreasonable for game devs to write titles that can use it.

Something that does worry me is that the next gen of cards will likely be mid-range Polaris and Pascal, which means they are likely to have only 8GB of memory.
 
There are a growing number of games that use a lot more than 4GB.

4GB of HBM is nowhere near enough, but it only becomes obvious if you use CF with 3 or more cards at 2160p.

The other thing to remember is that XCOM 2 is not a console port. We are going to see more and more PC games with these sorts of memory demands, and people need to accept it rather than bury their heads in the sand.

The Titan X may come with 12GB of VRAM, but it is already a year old, so it is not unreasonable for game devs to write titles that can use it.

Something that does worry me is that the next gen of cards will likely be mid-range Polaris and Pascal, which means they are likely to have only 8GB of memory.

8GB is fine. Gotta remember that 4K users are not just a minority but an extreme minority, and as games are generally, from what I've seen, using around the 4GB mark at 1440p and sometimes below, I can't see memory capacity being an issue unless you are a 4K user.

For the 4K crowd we'll probably get a 16GB HBM2 monster :)
 
There are a growing number of games that use a lot more than 4GB.

4GB of HBM is nowhere near enough, but it only becomes obvious if you use CF with 3 or more cards at 2160p.

The other thing to remember is that XCOM 2 is not a console port. We are going to see more and more PC games with these sorts of memory demands, and people need to accept it rather than bury their heads in the sand.

The Titan X may come with 12GB of VRAM, but it is already a year old, so it is not unreasonable for game devs to write titles that can use it.

Something that does worry me is that the next gen of cards will likely be mid-range Polaris and Pascal, which means they are likely to have only 8GB of memory.

I know there are; I didn't say there weren't. 4GB is enough atm. Hardly anyone uses 3 cards in CrossFire for gaming, and it could probably be helped with drivers, but again it's not worth the time.

It doesn't matter whether it's a console port or not; it just depends on optimization, really. And tbh it doesn't seem like this game has as much of that as it should. Also, many games will use more VRAM if it's available, simply because it's faster to keep something in memory than to reprocess it. Bench the FX and you'll see it'll be fine on FPS numbers. The TX will behave the same way, just consuming more memory. It's not as big a deal as you make it. Battlefield is a very good example of doing this.
 
I know there are; I didn't say there weren't. 4GB is enough atm. Hardly anyone uses 3 cards in CrossFire for gaming, and it could probably be helped with drivers, but again it's not worth the time.

It doesn't matter whether it's a console port or not; it just depends on optimization, really. And tbh it doesn't seem like this game has as much of that as it should. Also, many games will use more VRAM if it's available, simply because it's faster to keep something in memory than to reprocess it. Bench the FX and you'll see it'll be fine on FPS numbers. The TX will behave the same way, just consuming more memory. It's not as big a deal as you make it. Battlefield is a very good example of doing this.

I do bench the FX, and 4GB is not up to it, as demonstrated by XCOM 2.

With a single Titan X maxed out at 2160p the game is very playable, since it is turn-based. Better still, when SLI support arrives I will be able to get around 70fps.

If we see CF support it won't make much difference maxed out at 2160p, as the fps will still be a slideshow due to 4GB not being enough.

People have got to accept that upcoming games are going to need very large amounts of memory to run.

The FX is a very good card, but like all GPUs it cannot be right for every situation. The same goes for the Titan X: at the price they are, you can nearly buy 2 Fury Xs, making them far better bang for buck.
 
You think it's a case of a memory leak on the GPU? I can't think of a different reason a game like this would use 10GB of VRAM.

Doubt it. It's pretty hard to leak VRAM and not know about it. They could just be allocating 70% of VRAM up front for texture pooling, similar to UE4. A large texture pool means less streaming, so a smoother experience.
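If that is what's happening, a toy sketch of that kind of up-front pooling looks something like this (the 70% figure and everything else here are illustrative assumptions, not the game's actual code):

```python
# Toy model of up-front texture-pool allocation (hypothetical; not
# XCOM 2's actual code). An engine that grabs a fixed fraction of
# VRAM at startup will show high "usage" in monitoring tools even
# if most of the pool is streaming cache, not data the frame needs.

def texture_pool_bytes(total_vram_gb, pool_fraction=0.7):
    """Reserve a fraction of total VRAM for the streaming texture pool."""
    return int(total_vram_gb * 1024**3 * pool_fraction)

# A 12 GB Titan X would show roughly 8.4 GB reserved before a single
# frame renders; add render targets and geometry on top and a 10 GB
# "in use" reading is unsurprising.
pool = texture_pool_bytes(12)
print(f"Pool: {pool / 1024**3:.1f} GB")
```

On that model, the big reading at 4K would reflect how much the game was allowed to reserve, not how much it strictly needs.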
 
I do bench the FX, and 4GB is not up to it, as demonstrated by XCOM 2.

With a single Titan X maxed out at 2160p the game is very playable, since it is turn-based. Better still, when SLI support arrives I will be able to get around 70fps.

If we see CF support it won't make much difference maxed out at 2160p, as the fps will still be a slideshow due to 4GB not being enough.

People have got to accept that upcoming games are going to need very large amounts of memory to run.

The FX is a very good card, but like all GPUs it cannot be right for every situation. The same goes for the Titan X: at the price they are, you can nearly buy 2 Fury Xs, making them far better bang for buck.

When DP 1.3 arrives you should jump on the 4K 120+Hz G-Sync/FreeSync bandwagon. You'll love it; it's a whole other world of smoothness :)
 