Nvidia Recommends GPUs for Rise of the Tomb Raider 60 FPS High-Settings Gameplay

It's not their responsibility, it's their job. Devs are a part in this; it's not all on Nvidia or even AMD. They need to build their engines to take advantage of multiple GPUs. If they don't, there's no point in AMD or Nvidia releasing a profile for it, as it won't make a difference. Look at Company of Heroes. It's been the textbook example since it came out: no matter what Nvidia or AMD do, the engine simply doesn't give 2 ***** and will either give you no performance gain, or actually decrease performance. So don't go whining about it being Nvidia's fault. Nvidia/AMD/Devs all work together; it's everyone's job to get it working, and it starts with the devs and their engines.

You having a laugh? Fallout 4's engine is as old as God's dog. Nvidia have handed cash over to Bethesda to get their stamp on the game. Nvidia have performance-profiled the game as a GameWorks title and created GFE performance profiles for it.

So how is it Bethesda's responsibility? Bethesda don't profit from the sales of additional GPUs in SLI systems. In fact, they probably lose out, because one of their games gets bundled as a free activation code that the SLI purchaser sells on eBay because they already own it - so that's one game sale they lose. Nvidia and AMD are the ones who profit from selling additional GPUs for people to use for SLI and CrossFire.

Explain to me how the responsibility lies with devs and publishers to make a niche technology work properly, please. Why would they spend time and money making something that less than 5% of their customer base is going to use? I would agree that they have a responsibility to ensure that things they do don't break it, but beyond that it is the companies in charge of the drivers (i.e. Nvidia and AMD) who have to ensure it works. The devs use an API - most commonly DirectX - to make the game render. Unless their work is pioneering or exotic in some way, i.e. it operates well beyond the API guidelines, I don't see how it's their responsibility.
 
Nvidia/AMD/Devs all work together

Sometimes.

FYI: When you extend an engine to use the latest graphical techniques you may break SLI support due to the architecture of the renderer. However, it's not always viable to rewrite the entire renderer and integration just for AFR. One of the largest engines to date, UE4, doesn't even support SLI/CF.
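To make that concrete, here's a toy sketch (plain C++, no graphics API, all names made up) of the kind of frame-to-frame dependency that causes the problem: once a pass reads the previous frame's output, as temporal AA or reprojection-based effects do, frame N on one GPU has to wait for frame N-1 from the other GPU, which is exactly the serialisation AFR is supposed to avoid.

```cpp
// Toy illustration only - not engine code. AFR assigns even frames to GPU0
// and odd frames to GPU1 so they can render in parallel. A temporal pass
// that blends with the previous frame's result couples consecutive frames,
// so each frame now depends on data produced on the *other* GPU.
#include <cstdio>

struct FrameData { float history; };  // stand-in for a temporal history buffer

// Hypothetical temporal resolve: the new result depends on the previous frame.
float shade_temporal(int frame, const FrameData& prev)
{
    return 0.9f * prev.history + 0.1f * static_cast<float>(frame);
}

int main()
{
    FrameData prev{0.0f};
    for (int frame = 1; frame <= 6; ++frame)
    {
        int gpu = frame % 2;  // AFR: odd frames -> GPU1, even frames -> GPU0
        float result = shade_temporal(frame, prev);  // needs last frame's output
        std::printf("frame %d on GPU%d waits for history produced on GPU%d\n",
                    frame, gpu, (frame - 1) % 2);
        prev.history = result;  // this frame's output is next frame's dependency
    }
    return 0;
}
```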
 
Sometimes.

FYI: When you extend an engine to use the latest graphical techniques you may break SLI support due to the architecture of the renderer. However, it's not always viable to rewrite the entire renderer and integration just for AFR. One of the largest engines to date, UE4, doesn't even support SLI/CF.

Well yeah, not always :p
And thanks for the in-depth analysis, but as many of us know, not all games work 100% with AFR etc. The point I was making is that it's not always Nvidia/AMD's fault like the other guy said; the devs also need to at least support it. And the stuff you're pointing out validates what I said :)
I'm not entirely blaming the devs, but it starts with them :)
 
Not really sure what your response was saying - whether it was agreeing with my comment or disagreeing, I don't know.

In response to yours: so you're happy with it, but presumably you've not tried to play any of the games I've cited. If you had, your £1000+ GPU investment would provide you with practically no benefit compared to someone who paid half what you have.

What surely does not make sense, even to you - someone who seems quite happy with their SLI - is that Nvidia aren't even optimising SLI profiles for Nvidia GameWorks games. So even games that sport "The way it's meant to be played" slogans don't get optimised for SLI. So essentially Nvidia take the money you spent on one or more additional GPUs and, instead of spending resources, time and money on making sure those titles run superbly on multi-GPU systems, they spend it on a marketing deal with the developer and publisher to sell more GPUs.

I completely disagree that nVidia should kill off SLI and that it's only good for e-peen. No single-card solution can effectively run 4K, and until the 980Ti came along the same could be said for high-FPS 1440p. I wouldn't attempt to dismiss your statement that SLI doesn't work efficiently in all titles, but in my experience it nearly always does. The performance yields of 2 or 3 GPUs are exceptional in benchmarks and in quite a lot of games. It's definitely not impossible for developers to get it right. Correct me if I'm wrong, but Fallout 4's devs had all of the same opportunities to optimize (or implement) SLI support as Rockstar did when developing GTA V.

Are Titan X customers annoyed when those who bought 980Tis for nearly half the price get equal or higher performance? Maybe. You do the best research you can at the time of purchase; if SLI doesn't benefit your application then it would be an illogical choice. Sadly nobody can predict the future and what further developments will come to hardware, games or APIs.

What nVidia do with their money is up to them; I'm sure they don't treat the profits from a second, third or eleventh GPU any differently from the first, and it's not like SLI bridges cost hundreds of pounds. Until their competitors put more resources into multiple-card support and game optimization they will remain competitive. I don't think SLI is the best thing in the world and that everyone should have it; however, I do believe it justifies its cost and troubles in multiple applications.

To an extent nVidia have moved away from SLI. When I built my first rig with two 570s, it was necessary for running three Full HD monitors in Surround - something a single card can easily handle now.

JR
 
Yeah SLI and Crossfire have been made to sound pretty bad lately. Most of it though is just BS.

I've only found about five games that don't work out of my whole library so far. Tomb Raider should come with full support, and UBI are touting multi-GPU in a couple of their upcoming games.

Hopefully at some point soon support won't even be something to worry about, as DX12 exposes multi-GPU at a low level in the API (explicit multi-adapter), so a game can address every GPU in the system directly instead of waiting for a per-game SLI/CrossFire profile in the driver.
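For anyone curious what "low level" means here, a minimal sketch follows (just an illustration built on the standard DXGI/D3D12 headers, not code from any game or driver) showing how a DX12 application can enumerate every GPU the system exposes and decide for itself how to use them:

```cpp
// Minimal sketch: enumerate every GPU the OS exposes through DXGI.
// Under DX12 explicit multi-adapter the application itself decides what to do
// with each device - the driver no longer hides them behind an AFR profile.
// Build with: cl /EHsc enum_adapters.cpp dxgi.lib d3d12.lib
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;  // skip WARP / software rasterisers

        // Check the adapter can actually create a D3D12 device (nullptr = test only).
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        __uuidof(ID3D12Device), nullptr)))
        {
            wprintf(L"Adapter %u: %s (%zu MB VRAM)\n",
                    i, desc.Description, desc.DedicatedVideoMemory >> 20);
        }
    }
    return 0;
}
```

Whether a game then does AFR across those adapters, splits each frame, or hands post-processing to the second card is entirely up to the renderer, so the devs still have to care.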

However, I have been thinking and wanted to say the following.

If I were in charge of a software company and my name was on the line I would want to make sure my game worked. It's BS to go around blaming it on anything but your own laziness.

Fallout 4 IMO wasn't nearly ready enough, and if the rumors are true that it had been 'finished' (if you could call it that) for ages, then shame on Bethesda. The game is OK (I've completely stopped playing it now) but it runs like utter turd and looks it too. I had to run it at 1080p for a while and god, it was awful. I mean, it was bad enough at 4K, but holy crap it looks awful at less than 1440p.

And that's down to the company who made it.

TBH the success of the first Tomb Raider (well, the first of the reboots) was down to the fact that it looked stunning and supported everything. People could just load it up and (after a couple of driver updates and patches) see their hardware doing what they had paid for it to do. The game was great (though not spectacular) but it was helped along by the fact that you could max it out and it looked stunning.

The same goes for Dirt 2 IMO. It looked stunning and it just worked. Crossfire, SLI, you name it.

And that goes a long way for me.
 