It's Finished - Nvidia has practically killed off Multi-GPU with its RTX 30 series

With Ampere, SLI is only for those who can afford two RTX 3090 graphics cards, giving multi-GPU a $3000+ price tag after you include the price of an SLI/NVLink bridge. For all but the wealthiest of gamers, multi-GPU gaming is dead, and it is unlikely to rise again in the near future.

Makes me think of Apple's old slogan "does more, costs less" and how SLI became the complete opposite of that philosophy. I used to be an advocate for SLI right up until the 5xx series. I had 6700s, 8800s, 9600s, 260s, 280s, 295s (though that one was not as good as I thought it'd be) and 560 Tis. Up until the 560s I got really good scaling, then it just started going downhill quickly.
SLI used to be a way of getting that bit extra for your money: the whole idea was to buy one card and, later down the line if you could afford it, add a second to increase performance. Now it's just spending a lot more money, nothing else.
RIP SLI
 
This was to be expected, really. SLI was fun back in the 8800 GTX days; I had three SLI'd systems and loved how it looked, but after the GTX 570s I stopped doing that and went single-card from there on out. It was good fun to tinker with SLI at times.
 
With higher refresh rate monitors and the like I've become much more critical of added input lag and fluctuating frame times, so this tech has been dead for me for quite a while.

As long as it's AFR (alternate frame rendering), it's going to be subpar. And let's be honest, the alternatives won't get adopted universally due to complexity.
 
The real problem for SLI with the 3090 is finding a modern motherboard that can cope with two triple-slot cards; there is just not enough space between the slots these days.
 
The real problem for SLI with the 3090 is finding a modern motherboard that can cope with two triple-slot cards; there is just not enough space between the slots these days.

That's a good point, but when you are already spending $3000+ on GPUs, a lot of SLI users could just opt to use water cooling.
 
Moving forward.

I for one have been waiting to walk away from SLI. Never have I been more disappointed with NVIDIA than after the release of the 2080 Ti cards and NVLink. I have always used two cards, back before SLI was even a thing; back then it was "daughter cards", or so I seem to recall. I am currently using two 2080 Ti cards and enjoy having them. Even today they still make me happy. However, SLI support has declined steadily for years, and yet I refused to face that reality. Not anymore. I feel no need to move up to the 3000 series cards at this point. All the games I play run just fine, and I'll know when it is time to upgrade my aging PC. When that time arrives, I shall be putting only one graphics card in my system. Time to move on. R.I.P. SLI
 
That's a good point, but when you are already spending $3000+ on GPUs, a lot of SLI users could just opt to use water cooling.
True, but not all SLI users are comfortable with water cooling. I have never had a desire to go with water cooling. Just call me old school, or anal, or both. But I agree with the comment about the space needed on a motherboard for these new, wider cards. I have the Rampage V Extreme, and my sound card is a tight fit. My next build will only have one graphics card, so no more snug fits.
 
That's a good point, but when you are already spending $3000+ on GPUs, a lot of SLI users could just opt to use water cooling.

My current SLI build is a couple of RTX Titans and I prefer to keep them on their stock air coolers.

If they were a lot cheaper then I would water cool them.

Another problem with modern SLI is the bridge. Nvidia only makes 3-slot and 4-slot ones; on modern motherboards the 4-slot one is useless, as the slots are 3 and 5 apart, and guess what, no one makes a 5-slot NVLink bridge.
 
My current SLI build is a couple of RTX Titans and I prefer to keep them on their stock air coolers.

If they were a lot cheaper then I would water cool them.

Another problem with modern SLI is the bridge. Nvidia only makes 3-slot and 4-slot ones; on modern motherboards the 4-slot one is useless, as the slots are 3 and 5 apart, and guess what, no one makes a 5-slot NVLink bridge.
Excellent point. I have noticed the exact same thing. It was so much easier to set up SLI before NVLink was introduced. When I was running tri-SLI, all I had to use were long, flexible SLI bridges. Now we are locked into choosing a particular slot layout, determined by what bridge we are using, on what particular motherboard. Too many unnecessary restrictions. Just my opinion, of course.
 
I noticed someone complaining that Nvidia is forcing SLI out in order to profit the most. He showed a 'curve' where each generation saw a class of GPU being removed from SLI support. He argued that a 3070 in SLI would slay a 3090 for significantly less money, and he blamed software companies for being lazy and not supporting it better.

But SLI isn't just about software support. It requires more power, more cooling, more space, more cables and more VRAM (unless developers actually used the memory pooling available through explicit multi-GPU in DX12/Vulkan), and it introduces worse 0.1% and 1% low frame rates, as well as higher rates of system failure and lower overclocks.
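As an aside, here is roughly what that DX12/Vulkan route looks like from the application side. This is a minimal Vulkan sketch of my own, not code from any game: with explicit multi-GPU, linked cards show up as a "device group", and it is then up to the developer rather than the driver to decide how memory and rendering work are split across them.

```cpp
// Minimal sketch (my own illustration): enumerate Vulkan 1.1 device groups.
// Linked GPUs (e.g. two cards joined by an SLI/NVLink bridge) appear as one
// group with multiple physical devices; the application can then create a
// single logical device spanning them and choose per-allocation which GPU's
// VRAM to use, instead of the driver mirroring everything as classic AFR does.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are a Vulkan 1.1 feature

    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan 1.1 instance available\n");
        return 1;
    }

    // First call gets the number of groups, second call fills them in.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        groupCount,
        VkPhysicalDeviceGroupProperties{
            VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES});
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i) {
        std::printf("device group %u: %u GPU(s)%s\n", i,
                    groups[i].physicalDeviceCount,
                    groups[i].physicalDeviceCount > 1
                        ? "  <- explicit multi-GPU / pooled allocations possible"
                        : "");
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

The point being: the capability exists in the API, but all the work the SLI driver profiles used to do automatically now lands on the game developer, which is exactly why so few titles bother.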

Did software developers choose to stop supporting SLI/Crossfire because of these drawbacks, or was it Nvidia wanting to earn more money from single GPUs and forcing it out?
 
I noticed someone complaining that Nvidia is forcing SLI out in order to profit the most. He showed a 'curve' where each generation saw a class of GPU being removed from SLI support. He argued that a 3070 in SLI would slay a 3090 for significantly less money, and he blamed software companies for being lazy and not supporting it better.

But SLI isn't just about software support. It requires more power, more cooling, more space, more cables and more VRAM (unless developers actually used the memory pooling available through explicit multi-GPU in DX12/Vulkan), and it introduces worse 0.1% and 1% low frame rates, as well as higher rates of system failure and lower overclocks.

Did software developers choose to stop supporting SLI/Crossfire because of these drawbacks, or was it Nvidia wanting to earn more money from single GPUs and forcing it out?
Nvidia doesn't care whether it sells one big card or two small ones. If they could make SLI work, they would. If they sell two 3070s instead of one 3080, they have still sold two cards instead of one. More units sold, happier investors.

Margins for all cards excluding the 3090 are similar, so more cards sold equals more profit.
 
Did software developers choose to stop supporting SLI/Crossfire because of these drawbacks, or was it Nvidia wanting to earn more money from single GPUs and forcing it out?

Yes. I think the developer cost of providing SLI support was simply not worth it for them. It was a time when we saw so many titles pushed out prematurely.

AC Unity, Batman: Arkham Knight, almost anything from EA, etc.

Interestingly, Battlefront II and Battlefield 5 supported SLI perfectly. However, it gave cheaters in multiplayer a massive advantage over others, and EA was not able to fix that, so they blocked SLI at game startup. You had to set some Nvidia files as read-only to prevent EA games from overriding your settings and disabling SLI.
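Just to illustrate that last trick in code (a generic sketch only; the exact Nvidia file names and locations aren't given above, so the path below is a made-up placeholder):

```cpp
// Illustration only: strip the write permission from a driver profile file so
// a game launcher cannot rewrite it. The path is HYPOTHETICAL, not the real
// location of the Nvidia profile files referred to in the post.
#include <filesystem>
#include <iostream>

int main() {
    namespace fs = std::filesystem;
    const fs::path profile = "C:/ProgramData/ExampleDriver/profiles.bin";  // placeholder

    std::error_code ec;
    // Removing all write bits is equivalent to ticking "Read-only"
    // in the file's properties dialog in Windows Explorer.
    fs::permissions(profile,
                    fs::perms::owner_write | fs::perms::group_write |
                        fs::perms::others_write,
                    fs::perm_options::remove, ec);

    if (ec) {
        std::cerr << "Could not mark " << profile << " read-only: "
                  << ec.message() << '\n';
        return 1;
    }
    std::cout << profile << " is now read-only\n";
    return 0;
}
```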

The population running SLI was just too small for them to justify it. I also put it down to lack of experience and laziness. If CDPR can push out titles like The Witcher 3 with such amazing SLI scaling while others struggle even after claiming their title supports it, then some developers are simply not as good as others.
 
My current SLI build is a couple of RTX Titans and I prefer to keep them on their stock air coolers.

If they were a lot cheaper then I would water cool them.

Another problem with modern SLI is the bridge. Nvidia only makes 3-slot and 4-slot ones; on modern motherboards the 4-slot one is useless, as the slots are 3 and 5 apart, and guess what, no one makes a 5-slot NVLink bridge.


My uncle has SLI'd RTX Titans and said it's really not worth it due to devs abandoning mGPU, which is why he's only getting one 3090.
 
Besides all of that, the current focus on input latency makes SLI far less valuable to people. It might give a visual FPS increase, but it does not bring any frame latency gains over a single card; people aren't buying 144 Hz screens to get the input latency of 72 fps, and that's before you get into microstuttering issues or whatever. It's a technology that was useful in its time but hasn't aged well at all; the drawbacks are far too obvious now, and we can make huge GPUs compared to pre-2013.
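To put rough numbers on that (back-of-the-envelope only, assuming ideal AFR scaling and ignoring the rest of the input chain):

```latex
% A single GPU genuinely delivering 144 fps renders each frame in
% 1/144 s, about 6.9 ms. Two GPUs in AFR also showing "144 fps" each
% render only every other frame, i.e. at 72 fps, so each displayed
% frame still took about 13.9 ms to produce:
t_{\mathrm{single}} = \frac{1}{144\,\mathrm{Hz}} \approx 6.9\,\mathrm{ms},
\qquad
t_{\mathrm{AFR}} = \frac{1}{(144/2)\,\mathrm{Hz}} = \frac{1}{72\,\mathrm{Hz}} \approx 13.9\,\mathrm{ms}
```

In other words, the counter may read 144 fps, but each frame you see still carries roughly the render latency of a single card running at 72 fps.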
 