980ti SLI problem

remember300

Active member
Right been a while since I have posted on here. Quite a while to be honest.

So I have got a few bits and pieces for my PC to bring it up to a reasonable spec.
Now I am running:
Intel i7 3770K (at stock until I solve my problems) with an H100i
2x 8GB Corsair Dominator Platinum at 1866
ASUS Maximus V Extreme
Corsair AX860i PSU
plus a few HDDs and a Samsung 850 Pro SSD for the OS, Windows 10 64-bit

Now the bit that's giving me a problem: I have two EVGA 980 Ti Classifieds, which on their own give pretty decent scores. One card gave a pretty poor ASIC score and was getting rather hot whether it was on top or bottom. I took off the cooler and the thermal paste was dry... like really dry. It has now been replaced with Arctic MX-4, and with a low OC it gives a Fire Strike GPU score of just under 20k: http://www.3dmark.com/fs/11419425

However, when I put in the second card I literally only get a 5-8k increase on the GPU score: http://www.3dmark.com/fs/11419669

Looking online, I'm seeing GPU scores from 30k to 40k. I understand being on air puts me at a disadvantage, but why am I losing so much performance? The BIOS is up to date and I am using the ASUS SLI bridge I got with the motherboard.

I've racked my brain but can't work this out :(
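To put rough numbers on the gap described above: here is what a second card "should" add at a few assumed scaling efficiencies versus the 5-8k gain actually seen. (A quick sketch; the 60-90% range is an illustrative assumption, not a figure from this thread.)

```python
# Single-card GPU score quoted above is just under 20k; SLI adds only 5-8k.
SINGLE_GPU_SCORE = 20_000
OBSERVED_GAIN = (5_000, 8_000)

# Predicted SLI GPU score at a few assumed scaling efficiencies.
for efficiency in (0.6, 0.8, 0.9):
    predicted = SINGLE_GPU_SCORE * (1 + efficiency)
    print(f"{efficiency:.0%} scaling -> GPU score ~{predicted:,.0f}")

# What was actually observed.
low, high = (SINGLE_GPU_SCORE + g for g in OBSERVED_GAIN)
print(f"observed: ~{low:,.0f}-{high:,.0f} "
      f"({OBSERVED_GAIN[0] / SINGLE_GPU_SCORE:.0%}-"
      f"{OBSERVED_GAIN[1] / SINGLE_GPU_SCORE:.0%} gain)")
```

Even the pessimistic 60% case predicts ~32k, so a 25-40% gain is well below what a healthy SLI setup should manage.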
 
I would not compare results from the launch of the Ti in 3DMark with the results you get now. When a new card is launched, the drivers are totally optimised for the things that count, like 3DMark.

When I had my Titan Blacks at launch I scored around 19,000 IIRC. In later drivers I was lucky to get 15,000.

I would use something known like Metro 2033 (i.e. known to kick ass in SLI) and use that as your guide. Or Heaven, and look for the correct gains.

Also, the 3770K is a pretty lame CPU at stock. It may well be holding you back, so concentrate on the Graphics score rather than the overall one. Remember, when these cards are run in 3DMark they are usually coupled with a highly threaded CPU, which affects the overall score drastically.

I just had a look at your scores and one of them is invalid for the same sort of reasons I get. It could be something to do with the time on your PC.
 
The CPU is only at stock because I reseated it just to try everything. What doesn't make sense to me is why the gains in SLI are so small, along with the fact that my two cards barely beat a 1080. Overclocked and with everything else sorted, I should be getting maybe 30k on graphics. I'm not expecting the world, but those gains are tiny, more so when you look at the investment made. I got the same GPU scores even with that CPU at 4.6GHz.
 
To make you buy the newer cards.
 
I tested mine to compare - Same CPU, different mobo and RAM, and only running 980's but I got a similar disparity.

[screenshot CTW5nq0.png: the two Fire Strike results compared]


2 is a Single 980 running PhysX on the GPU (they were both in but not running SLI)

1 is running SLI with PhysX on GPU 2. Just the basic Firestrike out of the box with no changes to settings.

So they scale about the same as yours does. Would be interesting to see if any 1070 or 1080 users have better luck.
 
To make you buy the newer cards.

^ That, sadly. Nvidia do not want you keeping GPUs for years and years; they want you buying again.

SLI has taken a severe nosedive lately in pretty much everything. Game devs are not supporting it, so Nvidia are slacking on it themselves. What's the point in working on something that isn't going to work no matter what you do for it?

They've already pulled the plug on 3-way and 4-way; it wouldn't surprise me at all if they got rid of it completely once Volta arrives. The fewer people using it, the more of a waste of money supporting it becomes, especially if you are losing profit because of it overall.

But yeah, as I said just let your games be your guide :)

Oh, and that invalid result may have been from overclocking the CPU. I thought about it last night, and as soon as I went back to stock it stopped happening. It's something to do with the RTC (real-time clock) in your computer: sometimes when you overclock, it gets affected and 3DMark assumes you are cheating.
 
I'll run some tests myself and see what I get. I think I'm on an older driver; it would be interesting to see what I have.

So with a single card I had 14,200.
In SLI I had 21,000.

Scaling between the two isn't that bad in my opinion, since you never get 100% scaling. What I did notice, though: in the section with the spider bots on screen, I had exactly the same FPS with a single card and in SLI, approx 40fps in both. I was a little surprised by that.
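The scaling from those two numbers works out as follows (a quick sketch using the 14,200 and 21,000 scores quoted above):

```python
def sli_gain(single_score: float, sli_score: float) -> float:
    """Extra performance the second card adds, as a fraction of one card."""
    return (sli_score - single_score) / single_score

# Scores from the run above: 14,200 single card, 21,000 in SLI.
gain = sli_gain(14_200, 21_000)
print(f"Second card adds {gain:.0%}")  # roughly 48% of a second card's worth
```

For what it's worth, identical FPS in the same scene with one card and two would fit that section being limited by the CPU or physics work rather than GPU throughput.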
 
I'm the same: I get almost the same FPS in the spider-bot section on Normal, Extreme and Ultra lol.
And profit-wise, if they don't support SLI, surely that's going to hurt their margins... they can only sell one card, not two or more.
 
The amount of people who actually buy more than one card is preposterously low. Something like 3%.

Nvidia have to hire programmers to support SLi and help out people who are trying to make it work. That all costs money.

In business you have to decide whether something is worth it and whether you will make a profit on it.

Recent game support says it all IMO.
 
It's the same for me: I've thought about CrossFire and SLI but always stuck to one card, because with just one card you don't have to worry about the potential problems of SLI.
 
I used to have two cards all of the time, dude. From '09 until last year I had multiple GPUs. It started out slow but gathered momentum. By the time the 670 released, support was amazing and you could easily outdo a Titan for half the price. Sadly, though, when we went to Windows 8 it started to fall by the wayside.

Now we are on Windows 10 and pretty much nobody supports it any more. I wouldn't mind, but DX12 has been a severe let-down. I wish they had stuck with DX11 and continued with multi-GPU support. However, you have to accept that games are made for consoles first, and the easier it is to get that code running on a PC, the less time people will spend adding support for things like SLI.

The last multi GPU set up I had was two Fury X and it wasn't that long ago. However, I would never do it again. Not ever.

The heat, the noise, the power use and the problems just made it no fun any more.
 
I think you hit the nail on the head with the way games are developed these days. It's all about consoles: 30fps at 720p, and anything else is a bonus.

I do love the look of dual cards. Maybe next gen I'll get one real one and 3D print a second one just for looks. It's such a shame we're being let down with the supported drivers etc.

Whatever happened to the combined memory pool they were talking about? So 2x 4GB = 8GB of memory?
 
Whatever happened to all of it dude? DX12 promised so much and has delivered naff all. Games sure don't look any better and won't and performance has tanked. It's a joke tbh.
 
I'm not holding my breath. DX12 will work properly, of course - about a year after 13 is released.
 
Well, I'm surprised about games; I didn't know they were designed for consoles first and then PCs lol. Now I understand why SLI and CrossFire haven't been and aren't very optimised.
And why reviewers these days recommend getting "the highest-end single card you can afford" and not two.
 