OC3D Forums > [OC3D] Graphics & Displays > Graphics - Other
 
  #1  
09-04-19, 10:28 PM
ScottytooHotty
Newbie
 
Join Date: Dec 2012
Posts: 68
Is SLI still worth it?

I want to upgrade so I can get more FPS in games. I know it's better to have one top-of-the-line card over two lesser cards, but I can't afford a 2080 Ti. I can probably afford another 1080, though, so I just want to know if there are any real big issues doing it?

__________________
Current Setup: Phanteks Enthoo Pro M Window Edition - i7 8700K 4.7GHz - ASUS ROG Strix Z-370 H Gaming motherboard - NZXT Kraken x72 AIO - 16GB Corsair Vengeance RGB 3000MHz - Asus Strix GTX1080 - 500GB 970 EVO M.2 - 250GB Sandisk M.2 - 250GB 850 EVO - 120GB Samsung 840 - 60GB OCZ Vertex - 1TB Seagate Barracuda - Corsair RM850x - Asus PG279 27" 165Hz - Iiyama 27" 120Hz - Corsair Harpoon mouse - Corsair K70 RGB keyboard - Logitech Z-5500 speakers - Blue Snowball mic.
  #2  
09-04-19, 10:44 PM
Warchild
OC3D Elite
 
Join Date: Feb 2013
Location: Norway, Oslo
Posts: 5,320
Depends on the game. The Frostbite engine still supports it, so the likes of BF5 do allow SLI (recently disabled, but I know a workaround).

Witcher 3 still supports SLI, Tomb Raider, etc. etc.
Ubisoft games don't support it, and Anthem will in the near future.

I have 2x 1080 Ti with SLI enabled. When it works it's great; when it doesn't, I lose an fps or two versus a single card running with the second idle.
  #3  
09-04-19, 10:45 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,240
Personally I'd recommend going to a 1080 Ti instead if that's the best single-card choice, as SLI doesn't work with many DX12 or Vulkan titles, and even some modern DX11 titles have little to no support, while single-card solutions are generally a lot more efficient and, depending on case/cooling, quieter.
But even when it does work, with an SLI setup you only get the frame/input latency of a single card running at half the frame rate, and in the FPS range where SLI setups are most stable and least prone to microstutter, you're usually looking at much longer frame times.

I'm not sure if this is useful, but I made a little picture a while ago to help explain the difference in latency between, say, 40fps on a single card and on two cards.
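The arithmetic behind that picture is easy enough to sketch. Under alternate-frame rendering (AFR, the mode SLI normally uses), frames come out at the combined rate, but each individual frame still takes one GPU roughly twice the displayed frame interval to render. A rough sketch with illustrative numbers, not measurements:

```python
# Rough sketch of frame/input latency: single GPU vs two GPUs using
# alternate-frame rendering (AFR), the mode SLI typically uses.
# Numbers are illustrative, not benchmarks.

def frame_time_ms(fps):
    """Time between displayed frames at a given frame rate."""
    return 1000.0 / fps

def afr_render_latency_ms(fps, gpus):
    """In AFR each GPU renders every Nth frame, so one frame still takes
    roughly gpus * frame_time to render, even though frames are
    *displayed* at the combined rate."""
    return gpus * frame_time_ms(fps)

fps = 40
single = frame_time_ms(fps)           # 25 ms between frames, ~25 ms to render each
dual = afr_render_latency_ms(fps, 2)  # still 25 ms between frames, but ~50 ms to render each

print(f"Single card @ {fps} fps: new frame every {single:.0f} ms, rendered in ~{single:.0f} ms")
print(f"Two cards (AFR) @ {fps} fps: new frame every {single:.0f} ms, rendered in ~{dual:.0f} ms")
```

So 40fps from two cards feels closer, input-wise, to 20fps on one card, even though the frame counter reads the same.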
  #4  
09-04-19, 11:15 PM
NeverBackDown
AMD Enthusiast
 
Join Date: Dec 2012
Location: With the Asguardians of the Galaxy
Posts: 16,006
With SLI you will never get perfect scaling in 99.99999999% of games. The ones that do... well, I can't think I've ever seen one, tbh. You're likely to get anywhere from 70-90% efficiency in the few games that use it well.

So you pay double the price for not double the performance. You pay double the power consumption for not double the performance. You get twice the noise for not double the performance. And you add more cables to manage and decrease overall airflow (not all cases will be affected, depending on size).

So really all those cons for basically two pros:
1) looks cool
2) benchmarking
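To put rough numbers on that 70-90% point (illustrative figures, not benchmarks):

```python
# Back-of-envelope: what 70-90% SLI scaling means for price/performance.
# All numbers are illustrative assumptions, not measured results.

def sli_speedup(scaling_efficiency):
    """Two cards: the second card only contributes
    scaling_efficiency * 1x of extra performance."""
    return 1.0 + scaling_efficiency

for eff in (0.7, 0.8, 0.9):
    perf = sli_speedup(eff)
    # You pay ~2x the price (and roughly 2x the power) for this speedup:
    perf_per_cost = perf / 2.0
    print(f"{eff:.0%} scaling -> {perf:.2f}x performance at 2x cost "
          f"({perf_per_cost:.2f}x perf per unit cost vs one card)")
```

Even at a generous 90% scaling you're getting 1.9x the performance for 2x the money and power, and that's only in the games where SLI works at all.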

I'd just rather have one better card. You won't deal with any of the problems, and you'll have a quieter PC alongside better performance per watt.

I'd say you'd be better off selling your 1080 and putting that money towards a 2080 if you cannot afford a 2080 Ti.
That said, I wouldn't even bother to upgrade unless you are disappointed with your current performance. If all the games you play support SLI well and have few issues, then why not, I suppose.
__________________
I am Iron Man.
  #5  
10-04-19, 07:05 AM
Warchild
OC3D Elite
 
Join Date: Feb 2013
Location: Norway, Oslo
Posts: 5,320
Quote:
Originally Posted by tgrech
Personally I'd recommend going to a 1080 Ti instead if that's the best single-card choice, as SLI doesn't work with many DX12 or Vulkan titles, and even some modern DX11 titles have little to no support, while single-card solutions are generally a lot more efficient and, depending on case/cooling, quieter.
But even when it does work, with an SLI setup you only get the frame/input latency of a single card running at half the frame rate, and in the FPS range where SLI setups are most stable and least prone to microstutter, you're usually looking at much longer frame times.

I'm not sure if this is useful, but I made a little picture a while ago to help explain the difference in latency between, say, 40fps on a single card and on two cards.
Lol, looks like a copy of the one Nvidia made a year ago.

Does make me wonder, though, if Nvidia has shot themselves in the foot a little. With the fading support from developers for decent SLI performance, we are opting for single-card solutions, which means a fall in double-purchase sales. On top of that, I would have thought single-card sales would also fall, as users will feel that a 30% bump over the previous gen is not worth upgrading for, and so skip the current gen to wait for the next.

In the past, users would buy one or two cards. Those buying one card would buy another later on, knowing that the two-card solution would outperform the next gen at the same level, e.g. 980 SLI vs 1080. It feels like Nvidia knowingly saw that SLI was no longer favourable and intentionally pushed up the price of their RTX cards to squeeze as much as they can from those of us going with single-card setups. Had SLI/NVLink been the go-to recommended setup, we would most likely have seen a lower-cost generation of cards.

I know RTX involved millions in R&D, but I am no longer buying that this is the reason for the high cost.
  #6  
10-04-19, 08:41 AM
looz
OC3D Elite
 
Join Date: Feb 2013
Location: Finland
Posts: 1,432
I'd sell the 1080 and buy a 2080, though that's based on local prices; it might be totally different for you. 1080 Tis are scarce here and more expensive than the 2080, which is IMO the better card of the two.
__________________
i7 8700k - 16GB - 2060 FE - 512GB 970 & 850 EVO - AKG K702
  #7  
10-04-19, 08:56 AM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,240
I think it's actually better for Nvidia now. With the 1000 series, if someone wanted to spend £1k+ or so on GPUs they'd probably have to go SLI, but now Nvidia have single-card products going way up to £1.1k or so, which must have much better margins than two cards because of the manufacturing costs. As long as GPU dies keep scaling upwards aggressively it should be OK. Yeah, a 2080 is a good choice too, or a Radeon VII, depending on what you're going for and local prices.

Also, I know I've covered this a lot, but the RTX 2080 Ti die is way, way larger than a 1080 Ti die; they genuinely moved the product stack into a category that consumer gaming cards had never reached before (700mm²+ dies). The only similarly sized card before it was the £3000 Titan V. SLI dying isn't the reason the 2080 Ti costs so much; it's that it's at the absolute limits of silicon manufacturing (which is a first for a gaming card, really). Maybe the death of SLI is partly what motivated them to push for top-class single-die silicon in consumer markets, though.
  #8  
10-04-19, 09:49 AM
Kaapstad
OC3D Elite
 
Join Date: Jul 2013
Location: Skaro visiting family
Posts: 1,984
The £2800 Titan V's die packs quite a few more transistors than Turing's.

GV100 = 21.1 billion transistors, die size 815mm²

TU102 = 18.6 billion transistors, die size 754mm²

Volta needs the extra due to its DP cores.
__________________
OC3D Overclockers Club Member
#041

GTX 960 owner and proud of it.
  #9  
10-04-19, 09:55 AM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 1,240
The 1080 Ti had a 471mm² die and 12bn transistors, so the GV100-to-TU102 difference (~8% in die size) is quite small in the grand scheme of things (especially when we're talking about a ~60% jump from the 1080 Ti to the 2080 Ti).

You've got to remember we're talking about two dimensions here too: a 775mm² die vs an 825mm² die is the difference between, say, a 20mm x 38.75mm die and a 20mm x 41.25mm die.

(Volta's cores are loosely the same size as Turing's [due to the RT cores being comparable in size to the set of DP cores]; it just has more CUDA cores (~11% more cores for ~13% more transistors).)

The huge size of those RT cores (1 per SM) is partly why the RTX 2080 (545mm²) is 16% larger than a 1080 Ti while packing almost 20% fewer CUDA cores. Turing cores are absolutely giant compared to Pascal, and must be within a hair (literally, physically) of Volta.
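A quick sanity check of those figures, using the die sizes and transistor counts quoted above:

```python
# Quick check of the die-size figures discussed above
# (GV100, TU102, GP102 numbers as quoted in the thread).

dies = {
    "GV100 (Titan V)": {"area_mm2": 815, "transistors_bn": 21.1},
    "TU102 (2080 Ti)": {"area_mm2": 754, "transistors_bn": 18.6},
    "GP102 (1080 Ti)": {"area_mm2": 471, "transistors_bn": 12.0},
}

gv100 = dies["GV100 (Titan V)"]
tu102 = dies["TU102 (2080 Ti)"]
gp102 = dies["GP102 (1080 Ti)"]

# GV100 vs TU102: the gap is small in relative terms...
area_gap = (gv100["area_mm2"] - tu102["area_mm2"]) / tu102["area_mm2"]
# ...compared with the jump from GP102 to TU102:
pascal_jump = (tu102["area_mm2"] - gp102["area_mm2"]) / gp102["area_mm2"]
print(f"GV100 is {area_gap:.0%} larger than TU102")    # ~8%
print(f"TU102 is {pascal_jump:.0%} larger than GP102") # ~60%

# The two-dimensional point: at a fixed 20 mm width,
# 825 mm^2 vs 775 mm^2 is only 41.25 mm vs 38.75 mm of length.
print(825 / 20, 775 / 20)
```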