Well, I don't know, maybe things have improved recently. I have several systems with both SLI and tri-SLI configs as well as Crossfire and trifire configs (only XDMA variants though), and they all perform fine. There are a few problems with drivers every so often, but generally when anyone is over and uses one of my PCs we're able to have a decent time. Unless of course it's a very recent release, etc.
Last August I bought two Titan Blacks. I had always, always had a much better time running SLI than Crossfire, and I depended on them to power my 4K display. Stupid of me.
First up, Nvidia did the "Kepler Derp". No, it's not a stupid American dance craze (even though it sounds like one), and Witcher 3 was running at around 20 FPS. So then they fixed SLI but broke SLI + G-Sync. I was getting the most awful motion flicker I have ever experienced; so much that it made me feel ill. In four months of waiting it was not fixed, and there were numerous other issues with SLI + G-Sync.
Wolfenstein: TNO doesn't even support SLI, and that's pretty much that. There were also a few other games I came across that had no support whatsoever.
So I decided enough was enough and, for the first time since my GTX 470 in 2010, ditched SLI.
Looking around the forums at OCUK (where things move about ten million times more quickly than here), it's pretty obvious that Fallout 4 isn't working properly with SLI. There have been reported issues with other games too, most notably MGSV: The Phantom Pain. So basically, new games are not supported at launch.
The laughable thing is that Fallout 4 is a Gameworks title. What? So Nvidia couldn't even be bothered to make sure SLI was running before the game launched, even though it was 'their' game? I checked yesterday and apparently it still doesn't work properly...
And Crossfire? Take all of those problems and multiply them by ten. Two people running Fury X Crossfire (and that's a lot, given how few Fury X owners there are) have both thrown in the towel and put one of their GPUs up for sale.
This is the absolute worst I have ever seen it, tbh. I don't know if it's because neither AMD nor Nvidia has a dual-GPU single card out? Maybe that's why they're not bothered? Or maybe it's because the user base is actually so bloody low that it's not even worth bothering with any more?
I saw a study last year that said there were about 300k multi-GPU users worldwide. Whilst it was never proven beyond doubt, what if it's true?
If you were AMD or Nvidia, would you bother to spend money developing for such a tiny user base?
I've taken a bit of a risk right now by buying a very rare Asus 760 Mars card. I got it cheaper than either a 780 or a 780 Ti (£130-£150 and £170-£200 respectively) at £120 delivered, fully boxed and mint. I'm praying that support is added for Fallout 4, as that's what I bought it for.
Apparently you can enable it yourself using Nvidia Inspector (or whatever it's called), but it's still not right, as the shadows make SLI performance tank.
Time will tell, but I will never go back to relying on multi-GPU technology until either DX12 handles it at a low level or AMD and Nvidia can be bothered to support their own technology.