390 crossfire?

For most people it's overkill for 1080p; however, I do see your point about hitting high framerates on high refresh rate monitors, and if you have the money for a pair of 980 Tis then by all means go for it and have fun! You'll also future-proof yourself if you later upgrade to a 1440p high refresh rate monitor.

My personal opinion is that the Fury X is a better choice than the 980 Ti, but then I've always been Team Red since the days of the ATI Rage 128 Pro. I also know that FreeSync monitors are cheaper than G-Sync monitors, which again makes the red cards more attractive to me. But if you're a fully fledged member of the Green Team, I know you'll be just as happy with a pair of 980 Tis.
 
I already said I'd upgrade the PSU if I needed to. I also said in my last post that I'm just going to wait and upgrade my whole PC. I never really wanted to put AMD cards in my PC; I was just looking at it as an option for now.

Also, I don't understand what your issue is with two top-end cards for 1080p: 1) it will guarantee 120+ FPS at ultra settings, 2) it will mean longer before I have to upgrade again.

No, it won't guarantee 120 FPS, dude.

Take a look at this post from yesterday by a guy running two 980 Tis at 1440p.

I've tried my new 980 Tis and I can't seem to get above 50% usage on each, sometimes as low as 30%. What gives, has Nvidia dropped its SLI support? I remember it being rather good when I had 670s, 90%+.

I won't link you to the site as Tom doesn't like it, but you can see that he is only hitting 50% usage on each card.

The reason for this is that at lower resolutions the GPUs are never the bottleneck. The problem is the CPU: it simply can't keep up. No CPU in the world could properly keep up with two 390s, let alone two 980 Tis. It's not so bad at 4K, because at 4K it becomes all about the graphics cards, but at lower resolutions the CPU has to keep pace (whereas at 4K it does the same job whilst the GPUs do several times their normal work).
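That CPU-vs-GPU bottleneck can be sketched with a toy model. Everything below is a hypothetical illustration with made-up numbers (`cpu_fps`, `gpu_fps_single`, and the 0.75 scaling factor are invented), not benchmarks:

```python
# Toy model: effective FPS is capped by whichever side is slower.
# All numbers are invented for illustration, not measured benchmarks.

def effective_fps(cpu_fps, gpu_fps_single, num_gpus=1, scaling=0.75):
    """CPU throughput is fixed; GPU throughput scales imperfectly with extra cards."""
    gpu_fps = gpu_fps_single * (1 + (num_gpus - 1) * scaling)
    return min(cpu_fps, gpu_fps)

# At 1080p the GPU is already fast, so the CPU cap dominates:
print(effective_fps(cpu_fps=140, gpu_fps_single=120, num_gpus=1))  # 120.0
print(effective_fps(cpu_fps=140, gpu_fps_single=120, num_gpus=2))  # 140 (CPU-bound)

# At 4K each frame is far more GPU work, so a second card actually helps:
print(effective_fps(cpu_fps=140, gpu_fps_single=40, num_gpus=1))   # 40.0
print(effective_fps(cpu_fps=140, gpu_fps_single=40, num_gpus=2))   # 70.0
```

Real scaling varies per game and driver profile, which is the whole complaint in this thread; the 0.75 figure is just a stand-in.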

But even that aside, no game out there is coded well enough for you to get a 120 FPS minimum no matter what GPU setup you have. You are still going to hit frame-rate minimums, and adding a second card usually makes them worse. Minimums on SLI/CFX setups are usually pretty bad.

This is why each manufacturer makes cards for 1080p, 1440p and now sort of 4k.

If you want the best possible experience then simply upgrade to a single 980 Ti (avoiding the Fury line at 1080p for the reasons already explained) and just use that. Even one is massive overkill for 1080p.

Running any sort of multi-GPU system is always a headache. Certain games do not work at all with CFX/SLI, and others can end up giving you less FPS than one card. It's been pretty bad of late, with Nvidia and AMD really sitting back and not bothering much with their multi-GPU tech.

I ran multiple GPUs between 2008 and earlier this year. However, toward the end it was so bad (SLI issues, SLI + G-Sync issues, and so on) that I threw in the towel.

And that's why I can also tell you that running 'open fan' cards in pairs is a fail. I had two GTX 670s (known for running cucumber cool) in my rig before, and one sat at 68°C while the other was hitting 103°C and shutting the rig down.

Only by moving them as far apart as I physically could (dropping one to a x4 slot and having to hack it back into action) and taking the side off my case was I able to get it to behave. Even then I would see high 80s, which is really not good for mid-range Kepler.
 
Multi-GPU configs aren't quite as bad as the post above makes them appear. Either way, I would say get a decent variant of the 980 Ti and overclock it, OR wait for the next-gen GPUs, which will undoubtedly offer a sizeable increase in performance and better support for DX12 (which will help with the CPU).
 

They're the worst they have ever been. Maybe things will improve once AMD and Nvidia have a reason to bother with profiles, but as of right now they completely stink.
 
Well, I don't know, maybe things have improved recently. I have several systems with both SLI and tri-SLI configs as well as Crossfire and TriFire configs (only XDMA variants, though) and they are all performing fine. There are a few driver problems every so often, but generally when anyone comes over and uses one of my PCs we have a decent time, unless of course it's a very recent release.
 

Last August I bought two Titan Blacks. I had always, always had a much better time running SLI than Crossfire, and I depended on them to power my 4K display. Stupid of me.

First up, Nvidia did the "Kepler Derp". No, it's not a stupid American dance craze (even though it sounds like one :D ): Witcher 3 was running at around 20 FPS. Then they fixed SLI but broke SLI + G-Sync. I was getting the most awful motion flicker I have ever experienced, so much that it made me feel ill. In four months of waiting it was not fixed, and there were numerous other issues with SLI + G-Sync.

Wolfenstein: TNO doesn't support SLI at all, and that's pretty much that. There were also a few other games I came across with no support whatsoever.

So I decided enough was enough, and for the first time since my GTX 470 in 2010 I ditched SLI.

Looking around the forums at OCUK (where things move about ten million times faster than here) it's pretty obvious that Fallout 4 isn't working properly with SLI. There have been reported issues with other games too, most notably MGSV: The Phantom Pain. So basically, new games are not supported at launch.

The laughable thing is that Fallout 4 is a Gameworks title. What, so Nvidia couldn't even be bothered to make sure SLI was running before the game launched, even though it was 'their' game? I checked yesterday and apparently it still doesn't work properly...

And Crossfire? Take all of those problems and multiply them by ten. Two people running Fury X Crossfire (and two is a big chunk of the Fury X owners there) have both thrown in the towel and put one of their GPUs up for sale.

This is the absolute worst I have ever seen it, tbh. I don't know if it's because neither AMD nor Nvidia has a dual-GPU single card out at the moment? Maybe that's why they're not bothered? Or maybe it's because the user base is so bloody low that it's not worth bothering with any more?

I saw a study last year that claimed there were about 300k multi-GPU users worldwide. Whilst it was never proven beyond doubt, what if it's true?

If you were AMD and Nvidia would you bother to spend money developing for such a tiny user base?

I've kind of taken a risk right now by buying a very rare Asus Mars 760 card. I got it cheaper than either a 780 or a 780 Ti (£130-£150 and £170-£200 respectively) at £120 delivered, fully boxed and mint. I'm praying that Fallout 4 support is added, as that's what I bought it for.

Apparently you can enable it yourself using Nvidia Inspector, but it's still not right, as the shadows make SLI performance tank.

Time will tell, but I will never go back to relying on multi-GPU technology until either DX12 handles it at a low level or AMD and Nvidia can be bothered to support their own technology.
 

When was the last time a game actually released with SLI/Crossfire support out of the box? I can't remember the last time that happened.

Bethesda are partially responsible for the lack of SLI support, and when was the last time a Gameworks title actually wasn't crap performance-wise? The very nature of Gameworks means that games using it are usually laggy pieces of junk.

Also, I'd not say OCUK is the best sample to take from, as it still has a chunk of customer support threads mixed in.
 

What deepens the plot even more is that Skyrim worked out of the box, and that wasn't anything to do with Nvidia. I mean, crap dude, even Fallout 3 worked out of the box with Crossfire AFAICR. I'm pretty certain I used to get a locked 60 FPS out of FO3 with 5770s in CFX running ultra and 8x AA.

But for Nvidia to let a Gameworks title release without even having SLI support? That's pretty poor.

I don't know, dude, I can't quite put my finger on it, but I could swear that maybe Nvidia have finally seen sense and are putting a stop to all of the pointless tacked-on technologies they came up with over the past few years...

Even EVGA have totally stopped doing their SLI enhancement patches. So if EVGA aren't bothering (and they are literally at the point of sale with the customer) then it doesn't sound too good.

What annoys me even more is that AMD have two reps (Matt and Joe) who post on OCUK, but when asked about Crossfire profiles they pucker up and go silent.

Crossfire right now is seriously, seriously bad. Maybe with the release of the Fury X2 (if it ever sees daylight) they might actually do something about the situation, but yeah, it's pretty dire right now. I was thinking of going with a Fury Nano/Fury/another X, but all of the stuff I have seen on OCUK has put me right off.
 
You keep saying Crossfire is bad all over the forum, yet in their drivers they keep posting new support for it, especially with FreeSync in the Crimson driver. Nvidia still support SLI too.
 
Maybe your expectations are set a tad too high? I can understand the frustration; however, it is something you should have accounted for when deciding on the multi-GPU route. I did mention in my post that new releases didn't always work properly.

Of late I've seen a tad more issues with Nvidia multi-GPU configs than AMD ones (at least from the more experienced/knowledgeable users).
 

5770 CF is ancient technology these days; 390s (the new AMD cards) don't even need CF bridges.
 
Crossfire and SLI is a cool notion, but that's all it is to me. It's impractical, non-essential except at 4K (which I think is non-essential and over-hyped itself), expensive, and isn't supported well. Even at 1440p, a 980 or Fury can comfortably reach decent FPS with settings at very high, or very good FPS at slightly lower settings. I do feel the 980/Fury is not quite there yet, but Pascal/Arctic Islands will fix that, and there is the 980 Ti if you want stellar performance at the sacrifice of cost and temperatures. Once the next generation of GPUs is out, there should be no need for SLI/Crossfire. At 1080p, 2560x1080, 1440p, or 3440x1440, a single card will be enough.

As for 4K, it's overrated. It looks stunning, but so does 1440p. In fact, I'd prefer 90 FPS at 1440p with 4x AA over 4K at 60 FPS with no anti-aliasing. You still need textures up very high, anti-aliasing, lighting effects, shadows, etc. The idea that at 4K you can turn everything down and not tell the difference is a folly in my opinion. Some games will work well without AA at 4K, but it entirely depends on the coding of the game and how the objects and polygons are rendered. At 1440p, for instance, the cars in GTA V look abysmal without AA. At 4K it's only slightly better. I have a Fury at 1440p and I'm stuck with FXAA because the performance just isn't quite there yet. Out and about FXAA is fine, but in your garage admiring your cars, everything looks like a miniaturized, glorified stair set in a Barbie doll house.

I'm eagerly awaiting Arctic Islands, because I don't want to pay a premium for a Fury X or run Crossfire. I also now own a FreeSync monitor and do not want to pay for a 980 Ti and G-Sync panel.
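For scale, the raw pixel counts behind the resolutions mentioned above can be worked out quickly (only standard resolution dimensions are used, nothing else assumed):

```python
# Pixels per frame for the resolutions discussed, relative to 1080p.
resolutions = {
    "1080p":     (1920, 1080),
    "2560x1080": (2560, 1080),
    "1440p":     (2560, 1440),
    "3440x1440": (3440, 1440),
    "4K":        (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name:>9}: {px / 1e6:.2f} MP ({px / base:.2f}x 1080p)")
# 4K is 4.00x the pixels of 1080p and 2.25x the pixels of 1440p,
# which is why a card that's overkill at 1080p can still struggle at 4K.
```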
 

Pretty much everyone apart from the people actually using SLI and Crossfire agrees with you, which is why lately they've both been rather pants.

I tried SLI for 4K, and it's during moments like that that you feel completely at the mercy of the software devs and Nvidia, who both really couldn't care less lately. Fallout 4 is a Gameworks title and, at the time of writing, does not work properly with SLI, with many reporting serious issues with shadows and other anomalies causing massive FPS dips.

Hilariously, Skyrim, which was not an Nvidia Gameworks title, worked perfectly fine for me.

Others have mentioned "release day support" in this thread and I completely agree: nearly every single game released over the past year has not worked with SLI and Crossfire on day one. That means you have to wait around hoping that support comes. Far Cry 4's engine hates SLI, so it took almost forever for support to be added. Then add on the games that simply won't have any of it, and you're left sitting around waiting whilst everyone else is having tons of fun.

Between 2008 and recently, both Nvidia and AMD got on board with the game devs and made sure that day-one support was good: AMD with Gaming Evolved and Nvidia with Gameworks. However, AMD seem to have done a bunk and really couldn't care any more (see the Project Cars debacle, where the main author of the game sent AMD a Steam code about a week after launch) and so on.

Now, when I said I used to be a massive supporter of both technologies, I meant it. From 2008 right up until around four months ago I used either SLI or Crossfire. I went away from Crossfire (knowing it was crap) until AMD were caught out and sorted it out, but for many years they knowingly sold people two GPUs which they knew would not work properly together. I bought a new 7990 and I guess it was OK, but nothing mind-blowing.

Hardly a glowing endorsement, is it?

I used to depend upon SLI, especially as Nvidia were the ones who sort of let the cat out of the bag about Crossfire. However, toward the end support was really falling behind and I was sick of waiting around to play a new game.

5770 CF is ancient technology these days; 390s (the new AMD cards) don't even need CF bridges.

I am fully aware of that. I used Crossfire until very recently, when I had a 7990. Between the 5770s and the 7990 I stopped using Crossfire because, unlike the brainwashed many, I knew it wasn't right. I only bought the 7990 once it had flopped, dropped in price, and the drivers were fixed so that it actually worked properly without dropping frames and spitting out runt frames.

The only people endorsing either in this thread are the ones using it. That's hardly making me want to buy another card right about now.
 
I agree. The condoning of SLI/Crossfire generally comes from people who are in love with it through thick and thin, or those who absolutely need it. Unless you are dead set on moar pixelz, I don't think SLI/Crossfire is needed any more. Before, we needed 480 SLI to run Crysis at 1080p 60 FPS. Now we only need to overclock a single 980 Ti to run Crysis 3 at 1440p 60 FPS with a couple of settings turned down. And that's as demanding as it gets.

If all games were coded for Crossfire and SLI as well as Tomb Raider was, or if they scaled like Fire Strike does, then I would 100% recommend it, but they're not. As you said, they're getting worse. Dual-GPU cards are more expensive on release as well. The worst of it was the Titan Z, which was an extra $1000 over what it should have been. The 295X2 was $1500, an extra $500. Compare that to the 690 being twice as expensive as the 680, which makes sense; the same goes for the 7990. The 590 and 6990 were actually cheaper than two of their respective GPUs. Everything has just risen in price. Performance is excellent, but only in single-GPU setups, as you probably know. I don't understand where AMD and Nvidia are going with things. The 295X2 was an excellent GPU conceptually, but performance has been inconsistent, as you say.
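The dual-card premiums above can be sanity-checked with a quick calculation. The USD figures below are approximate launch MSRPs from memory, so treat the exact numbers as rough rather than authoritative:

```python
# Dual-GPU card price vs. two of the roughly equivalent single card.
# Launch MSRPs in USD, approximate; the post quotes round numbers.
pairs = [
    ("Titan Z",  2999, "Titan Black", 999),
    ("R9 295X2", 1499, "R9 290X",     549),
    ("GTX 690",   999, "GTX 680",     499),
]
for dual, dual_price, single, single_price in pairs:
    premium = dual_price - 2 * single_price
    print(f"{dual}: ${dual_price} vs 2x {single} (${2 * single_price}) "
          f"-> premium ${premium:+d}")
```

On these figures the 690 carries essentially no premium over two 680s, which matches the post's point that its pricing made sense, while the Titan Z's roughly $1000 premium is the outlier.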
 