Nvidia GeForce GTX 1180/2080 PCB leaks - No SLI fingers?

This is interesting if they're going NVLink for future cards. That provides a heck of a lot more bandwidth. Perhaps enough to make two GPUs appear as one virtual GPU for a universal multi-card graphics solution? (I can dream).
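For a rough sense of the bandwidth gap, here's a back-of-envelope sketch. The per-link figures are assumptions (roughly an SLI HB bridge, PCIe 3.0 x16, and a single NVLink 2.0 link); the only point is that shuffling a full 4K framebuffer between cards every frame is a rounding error over NVLink but eats most of an SLI bridge:

```cpp
// Back-of-envelope: can a link move one 4K framebuffer per rendered frame?
// Bandwidth numbers below are rough, assumed figures for illustration only.
#include <cstdio>

int main() {
    const double frame_bytes = 3840.0 * 2160 * 4;       // one 32-bit 4K framebuffer (~31.6 MiB)
    const double fps = 60.0;
    const double needed_gbps = frame_bytes * fps / 1e9;  // GB/s required just to copy frames

    struct Link { const char* name; double gbps; };       // per-direction bandwidth (assumed)
    const Link links[] = {
        {"SLI HB bridge (approx.)",         2.0},
        {"PCIe 3.0 x16 (approx.)",         15.8},
        {"One NVLink 2.0 link (approx.)",  25.0},
    };

    printf("Copying one 4K frame per frame at %.0f fps needs ~%.2f GB/s\n\n", fps, needed_gbps);
    for (const Link& l : links)
        printf("%-32s %5.1f GB/s -> %5.1f%% of link used\n",
               l.name, l.gbps, 100.0 * needed_gbps / l.gbps);
    return 0;
}
```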
 
I've said it so many times now I should have it tattooed on my forehead. SLi as we knew it (and Crossfire) are dead. I am glad Nvidia are finally putting it to bed. Might stop people wasting money.
 

You gotta admit though that SLI did look pretty :), especially with a waterblock linking the two.
 

Poseidons look pretty in SLI, but that's an awful expense just to look pretty :p
 

I agree completely and I've also been saying it for ages.

I had SLI'd 7800 GTs, 260s, 480s, 780s, 970s, etc. But with the 1080 Ti I thought, what's the point? Most of the games I played didn't work with SLI.

It's a shame, because I did like the look of SLI in a system; it made the build look more filled out. But as you say, a waste of money. I think the golden age for SLI was around 2013 (GTX 780 era). From there it was downhill.
 
Could these be professional cards? No way consumers get NVLink.

If you look at it, they have altered the arrangement of the connector. The smaller piece that used to be on the right is now on the left.

I believe NVIDIA has done this because it makes the bridges incompatible with each other. The NVLink connector on consumer cards will use a cheap ~$40 bridge, while Quadros and Teslas will continue to use the $400 bridge (that really is the current price of an NVLink bridge, which is an insane markup for what amounts to a PCB with some traces on it).

So by changing the connector arrangement NVIDIA can reuse their NVLink development for consumers but not hurt their NVLink bridge sales for the professional market. It's smart and shady, thus believable for NVIDIA :D
 

Exactly.
 
I'm fairly happy with SLI. The few games I play make use of it. Going back to a single card would be too weird; I've been using dual GPUs forever.
 

You get used to it pretty quickly.

One of the things I noticed going from two cards to one is the smoothness and the frame times. I never noticed it when I used SLI all the time; I guess I'd adjusted, or it was because I'd used SLI for so long (almost 10 years).

But going to a single card I did notice less frame stuttering, I guess? It's hard to explain, but you can tell the difference, or at least I could, pretty quickly after playing games I knew inside and out with only one card instead of two in SLI mode.

I don't wanna dump on SLI too heavily; I do think it served a purpose. But I do feel NVIDIA needs to up their game with regards to SLI. It feels very left by the wayside, and when you're paying upwards of £1250 for two cards in SLI it really sucks to have one sitting idle or barely helping in a lot of great games.
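For anyone wondering what that stutter actually is: with alternate-frame rendering the two GPUs can finish frames unevenly, so the average FPS looks fine while the gaps between frames alternate short and long. A minimal sketch with made-up frame times (not measurements):

```cpp
// Illustration of AFR micro-stutter: higher average FPS, but worse pacing.
// All timings are invented for illustration, not measurements.
#include <cstdio>
#include <vector>

static void report(const char* label, const std::vector<double>& gaps_ms) {
    double sum = 0, worst = 0;
    for (double g : gaps_ms) { sum += g; if (g > worst) worst = g; }
    double avg = sum / gaps_ms.size();
    printf("%-24s avg frame time %.1f ms (%.0f fps), worst gap %.1f ms\n",
           label, avg, 1000.0 / avg, worst);
}

int main() {
    // Single GPU: frames arrive evenly every 16.7 ms (~60 fps).
    std::vector<double> single(12, 16.7);

    // Two GPUs in AFR: frames alternate 4 ms / 20 ms apart. The average frame
    // time is lower (more fps on the counter), but the recurring 20 ms gaps
    // are longer than the single card's steady 16.7 ms - that's the stutter.
    std::vector<double> afr;
    for (int i = 0; i < 12; ++i) afr.push_back(i % 2 ? 20.0 : 4.0);

    report("Single GPU", single);
    report("Two GPUs (uneven AFR)", afr);
    return 0;
}
```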
 

I didn't get used to it. I never play games at less than max details. I tried, but I just couldn't get used to it. I know gameplay > visuals, but I've always had SLI and I think it's always gonna be that way for me. I bought another 1080 Ti and I'm quite happy with the results.
FF15 with all the Nvidia GameWorks features enabled is a nice-looking game, even maintaining 65 fps on a widescreen monitor.

I thought Witcher 3 was beautiful on one card. Using the second just blew me away. Doing a complete new playthrough yet again.

But while SLI is not very popular now, it still has many optimised games utilising it. I don't think it's quite dead yet. 3-way and 4-way, yes (support has even been removed), but 2-way will revert back to a niche crowd for sure, even more than before.

The biggest surprise, and something that makes no sense to me, was World of Warcraft. Considering it's a CPU-intensive game, with one 1080 Ti I was getting 70 fps in the main city with all the players gathered around at peak time. With the second card I had 100.

Leaving the main city and just playing through the game I'm getting 150 fps, and that's on the max setting value of "10". Disabling SLI, it drops to 80 fps.
 

A 1080 Ti can easily max out Witcher 3. How on earth does adding another 1080 Ti make it more beautiful?
 

SLi peaked at the 400-600 series IMO, not the GTX 780 era. I would imagine that's because of the 590 and 690; Nvidia had to go to a lot of trouble to make sure those cards would actually work.

Back then everything seemed to work with SLi. Gosh, I remember when you could get a pair of 460s for something silly like £200 and they would cakewalk a 480 (which cost £450 or something daft like that). But yeah, once the 700 series hit, it started going downhill fast. By the time my Titan Blacks were a year old it was turning into a nightmare.

It does look amazing though (as others have said), I gotta agree. I mean, it was mostly a waste of money (the real fun was stuff like 460 SLi, but Nvidia soon stopped that fun by removing the fingers on cheap cards :( ), but yeah, there's nothing more imposing than two (or three, I had three Titan Blacks at one point!) GPUs in your rig.
 

Eh, I don't really agree that the second card is worth it, to be honest. There's no game I've played on one overclocked 1080 Ti that I can't run at max settings above 60 FPS, and I'll take the smoothness and consistent frame times over SLI microstutter. I mean, even GTA V maxed out I'm now getting 109 FPS (I have a 165 Hz monitor, so that kind of frame rate is useful).

Also, about half the games I play don't work with SLI: Minecraft (which, with shaders, could use the horsepower), ARK, Batman: Arkham Knight. I even had problems getting it to work with Ashes of the Singularity, which is a game I play frequently. I completed many triple-A titles before SLI profiles were available from NVIDIA (Far Cry 3 and 4 both had late profiles that came after I'd already finished them, which was vexing).

Also, SLI doesn't work with DX12, so going forward the games that work with it will become sparse. For DX12 multi-GPU support you're relying on the game developer to do the extra work to enable it for you: they have to add code to the game to explicitly support multiple graphics cards at the DX12 API level, which is separate from the SLI path that NVIDIA controls.

That means not many games will use both cards. It's a niche that isn't worth developers' time to implement; they don't see any of the money NVIDIA makes by selling us two cards, just the extra development cost of multi-GPU support.

To me it has become just a giant waste of money, but I wish it wasn't so.
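To make the "developers have to do the extra work themselves" point concrete, here is a minimal sketch of what explicit multi-adapter under DX12 means: the application, not the driver, enumerates every GPU and creates its own D3D12 device per adapter, and splitting the workload between them is entirely the game's job. Windows-only, error handling omitted, purely illustrative:

```cpp
// Minimal sketch of DX12 "explicit multi-adapter": the application enumerates
// every GPU itself and owns one D3D12 device per adapter. There is no
// driver-side SLI doing this for you; dividing the work is up to the game.
// Link with d3d12.lib and dxgi.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            printf("GPU %u: %ls\n", i, desc.Description);
            devices.push_back(device);
        }
    }
    // From here the application must decide how to split frames or passes
    // across devices.size() GPUs and copy any shared resources between them.
    return 0;
}
```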
 
I have no microstutter at all with my SLI setup. But I don't play tons of games either. Fallout 4 never stuttered at 4K, Elite Dangerous is buttery smooth at well over 200fps at 1440p, etc. A single card is still not enough for 4K res, I don't care what anyone says. I have a 4K panel and I've seen it in person. SLI when *properly implemented* works great.
 
I never noticed the microstutter either, until I got rid of SLI and noticed how much smoother everything was. Your brain adjusts.
 

I totally, 100% disagree that a single card is not enough for 4K, dude. Well, unless somehow running DSR at 4K is different from actual 4K (which I doubt).

The last part of what you said has always been the problem: "when properly implemented", which has mostly been never. What I mean is, pretty much no games were ever designed to work with AFR from the ground up, so it was always bolted on after the fact. If a game were designed for SLi it would scale perfectly; you could literally double performance.
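The "designed for AFR from the ground up" point is really about inter-frame dependencies: if frame N+1 reads something produced while rendering frame N (temporal AA, last frame's reflections, and so on), the other GPU has to wait for a copy and the near-2x scaling collapses. A toy model with invented numbers:

```cpp
// Toy model of AFR (alternate-frame rendering) scaling.
// Each frame needs `render_ms` of GPU time. All numbers are invented.
#include <cstdio>

int main() {
    const double render_ms = 20.0; // one GPU: 50 fps
    const double sync_ms   = 6.0;  // assumed cost of copying last frame's data between GPUs

    // One GPU: a frame completes every render_ms.
    double single = 1000.0 / render_ms;

    // Ideal AFR: the GPUs work on alternating, fully independent frames, so a
    // frame completes every render_ms / 2 on average - the "perfect scaling" case.
    double afr_ideal = 1000.0 / (render_ms / 2.0);

    // AFR with a hard inter-frame dependency: frame N+1 cannot start until
    // frame N's result has been copied from the other GPU, so the two cards
    // end up taking turns and the second card buys you nothing (or worse).
    double afr_dependent = 1000.0 / (render_ms + sync_ms);

    printf("Single GPU:                     %.0f fps\n", single);
    printf("AFR, independent frames:        %.0f fps\n", afr_ideal);
    printf("AFR, previous-frame dependency: %.0f fps\n", afr_dependent);
    return 0;
}
```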

Sadly, SLi was dumped ages ago, but silently. Unlike AMD, who basically came out and said "the Pro Duo is not for gaming, DO NOT use it for gaming" (i.e. we are pulling the plug on Crossfire), Nvidia just haven't said anything.

But it's like I said: as soon as Nvidia stopped making dual-GPU cards (and I don't count that Titan thing, as pretty much no one bought it) they stopped pushing devs to implement support. And from there it was pretty much over.

SLi/Crossfire do not work in DX12, period. That's because DX12 has its own method(s), which IIRC no one has supported at all. Mostly because DX12 is poo. Well, not poo, just that no one can be arsed using the features it has because they don't apply to a console.
 