Frame Rating and AMD Crossfire issue

Proof that it is not the cards' fault. Sadly it is the driver causing these issues.

Hopefully they will get a solution for that, as Ryan stated:
"AMD is aware of this issue, we have talked with them several times about it and they promised us that a solution is coming, although probably not for 60, 90 to 120 days, to actually get a global kind of fix for this type of thing."
 
Proof that it is not the cards' fault. Sadly it is the driver causing these issues.

Hopefully they will get a solution for that, as Ryan stated:
"AMD is aware of this issue, we have talked with them several times about it and they promised us that a solution is coming, although probably not for 60, 90 to 120 days, to actually get a global kind of fix for this type of thing."

What shocked me was just how much worse it is with mid-range cards!!

Even the Dirt games suffer on the mid-range cards, which is truly odd.

It's quite amazing to get an insight into why my 5770s were so utterly shocking in Crossfire. I'm hoping Ryan will try some older cards; he has hinted at it in this latest article.
 
I'm hoping Ryan will try some older cards; he has hinted at it in this latest article.

I don't know, but I hope so. He has put so much time, money and effort into this whole thing already that I would understand if he does not.

I think we would see similar results, because it is a driver issue.
 
This has been a long-known issue. It's only when some major review sites started paying attention to it that everyone suddenly complains. Not that they don't have the right to; by all means, please do spread the word that FPS scores are absolutely not sufficient to rate a graphics card.

Nvidia cards are suffering from these frame-time issues too. AMD's cards are suffering a little more. But they're working on it, step by step, as we speak, with every new driver update.

Some personal experience: my 5770s in Crossfire did absolutely great, never had a problem. Now I run a single GTX 660 Ti, not a problem either.
 
Drivers cannot be 100% perfect; there are too many different things a PC can do that you would have to optimize for. But AMD and Nvidia have no excuse for sending out false frames to achieve "better FPS". Only time will tell.
 
Sure, but the fact that multi-GPU setups work better for Nvidia is not just that the drivers are "better". The Kepler architecture has multi-GPU support built into its hardware: it uses hardware frame metering to improve the smoothness of SLI. AMD does not, and it will maybe take another year or two until we see that in AMD GPUs.

Sure, they might be able to improve the Crossfire technology to a point where we can say it is "good", but if it is done in hardware it is always going to be better, faster and smoother, because it does not have to go through the OS and the driver stack.
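
To make the hardware-vs-software point a bit more concrete, here is a minimal, purely illustrative Python sketch of what a driver-side frame-pacing fix conceptually does: instead of presenting each frame the instant a GPU finishes it, it holds frames back so the gaps between presents stay roughly even. The timestamps and the pacing rule are made up and are not how AMD's or Nvidia's drivers actually work.

Code:
# Illustrative only: even out the gaps between presented frames by delaying
# frames that arrive too soon after the previous one (the classic AFR
# "two frames almost on top of each other" pattern).

def pace_presents(finish_times):
    """finish_times: seconds at which the alternating GPUs finished rendering."""
    n = len(finish_times)
    # Average frame interval over the trace. A real driver would have to
    # estimate this on the fly; here we cheat and use the whole trace.
    target = (finish_times[-1] - finish_times[0]) / (n - 1)
    presents = [finish_times[0]]
    for t in finish_times[1:]:
        # Never show a frame before it is ready, but hold it back if it would
        # otherwise appear much sooner than one average interval after the last.
        presents.append(max(t, presents[-1] + target))
    return presents

# Hypothetical AFR timing: the second GPU's frames land right after the first's.
raw = [0.000, 0.002, 0.033, 0.035, 0.066, 0.068, 0.099, 0.101]
print([round(b - a, 3) for a, b in zip(raw, raw[1:])])      # uneven gaps
paced = pace_presents(raw)
print([round(b - a, 3) for a, b in zip(paced, paced[1:])])  # much more even

Doing the same thing in hardware, as argued above, means the delay can be applied right before the frame goes out to the display rather than several software layers further up.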
 
This has been a long-known issue. It's only when some major review sites started paying attention to it that everyone suddenly complains. Not that they don't have the right to; by all means, please do spread the word that FPS scores are absolutely not sufficient to rate a graphics card.

This has become an issue because no one knew about it and everyone thought their Crossfire systems were kicking ass. No one knew about runt frames, and many didn't realise that so many of the FPS they were seeing were dropped either.
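
For anyone who hasn't read the articles yet, here is a rough Python sketch of why the raw FPS counter lies once runts and drops are involved. It assumes you already have, per rendered frame, how many scanlines actually made it to the screen (the kind of data the frame-capture method extracts); the 21-scanline runt threshold and the sample numbers are placeholders I made up.

Code:
# Rough sketch: recompute FPS after throwing away dropped and runt frames.
# scanlines_per_frame holds how many scanlines of each rendered frame were
# actually visible on screen during capture. All numbers are made up.

RUNT_THRESHOLD = 21  # scanlines; anything smaller counts as a runt here

def effective_fps(scanlines_per_frame, capture_seconds):
    rendered = len(scanlines_per_frame)
    dropped = sum(1 for s in scanlines_per_frame if s == 0)
    runts = sum(1 for s in scanlines_per_frame if 0 < s < RUNT_THRESHOLD)
    useful = rendered - dropped - runts
    return {
        "raw_fps": rendered / capture_seconds,     # what an FPS counter reports
        "observed_fps": useful / capture_seconds,  # frames you can actually see
        "dropped": dropped,
        "runts": runts,
    }

# Hypothetical one-second capture: 80 frames rendered, but half of them are
# runts or drops, so the "80 FPS" figure is roughly double what you see.
sample = [540, 8, 520, 0, 515, 12, 530, 5] * 10
print(effective_fps(sample, capture_seconds=1.0))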

Nvidia cards are suffering from these frame-time issues too. AMD's cards are suffering a little more. But they're working on it, step by step, as we speak, with every new driver update.

A little bit more? AMD Crossfire isn't even working properly 50% of the time. SLI, however, seems to have no such issues.

Some personal experience: my 5770s in Crossfire did absolutely great, never had a problem. Now I run a single GTX 660 Ti, not a problem either.

Until we see this put to the test, you only think you never had a problem. I too ran 5770 CF and yes, my FPS counts were high. However, what I got in reality was a stuttering mess, so I'm dying to find out if they suffered from the same issue, as it would explain an awful lot.
 
It all boils down to money. Both sides put out their cards every year or every other year, and we run out to buy them, just for bragging rights... By the way: 3770K @ 4.5 GHz at 1.2 V, with Sapphire 7970 Toxic 3 GB cards in Crossfire at 1200/1600. (PS: the GTX 680 is chillin' in the closet.)
 
It all boils down to money. Both sides put out their cards every year or every other year, and we run out to buy them, just for bragging rights... By the way: 3770K @ 4.5 GHz at 1.2 V, with Sapphire 7970 Toxic 3 GB cards in Crossfire at 1200/1600. (PS: the GTX 680 is chillin' in the closet.)

I see what you did there! ;)
 
Ryan is doing a great job of spreading the information, but it isn't something new.

Tech Report already published an article about it in 2011:
http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking

And a follow-up about Nvidia's frame capture tools last month:
http://techreport.com/review/24553/inside-the-second-with-nvidia-frame-capture-tools

The issue I have with the other websites is that they are only highlighting frame times. Don't get me wrong, I'm not discounting their efforts, but it's only recently (with Ryan at the forefront) that anyone has pointed out runt frames and dropped frames, which is the major revelation from this technology.

I.e. most of the data being put out by websites is very technical and quite confusing. Ryan has managed to explain it well and has done a lot of work exposing the issues with Crossfire, which is really important.
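
For comparison, here is a small Python sketch of the kind of frame-time metrics the Tech Report articles focus on (99th-percentile frame time and time spent beyond a threshold), as opposed to the runt/dropped-frame counting above. The frame times are invented; the point is just that two runs with the same average FPS can look completely different once you look at the distribution.

Code:
# Invented data: two runs with identical average FPS but different smoothness.

def frame_time_metrics(frame_times_ms, threshold_ms=50.0):
    ordered = sorted(frame_times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]       # 99th-percentile frame time
    beyond = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
    return {
        "avg_fps": round(1000.0 * len(frame_times_ms) / sum(frame_times_ms), 1),
        "99th_percentile_ms": p99,
        "ms_beyond_threshold": round(beyond, 1),
    }

smooth = [16.7] * 100                 # steady ~60 FPS
spiky = [10.0] * 90 + [77.0] * 10     # same total time, but with big hitches
print(frame_time_metrics(smooth))     # ~59.9 FPS, p99 16.7 ms, 0 ms beyond 50 ms
print(frame_time_metrics(spiky))      # ~59.9 FPS, p99 77.0 ms, 270 ms beyond 50 ms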
 
I have found it really interesting reading this thread.
I'm wondering how much of a noticeable difference the incorrect frame rates have been making for competitive multiplayer gamers?
While I'm not a competitive MP gamer, I've seen lots of talk about keeping the frame rates up really high to keep the competitive advantage. Are we saying that you'd be at a disadvantage with a Crossfired setup?
I haven't actually read the report about runt and dropped frames, but I guess it also matters what sort of mark/space there is between good and rogue frames.
i.e. Good(card1)-Good(card2)-Runt(card1)-Runt(card2)-Good(card1) etc. or
Good(card1)-Good(card2)-Good(card1)-Runt(card2)-Good(card1)
Meaning, occasional duff frames are unlikely to give you an MP disadvantage?
*Lights the touch-paper and retires* :)
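
On the mark/space question above, here is a toy Python sketch of why clustering matters more than the occasional runt. It treats every slot as one 60 Hz refresh and a runt as contributing nothing visible, which is a big simplification; the two sequences are the ones from the post, and the numbers are illustrative only.

Code:
FRAME_INTERVAL_MS = 16.7  # pretend every slot is one 60 Hz refresh

def worst_visible_gap(sequence):
    """Longest stretch (ms) between frames that actually show new content."""
    gap, worst = 0.0, 0.0
    for frame in sequence:
        if frame == "Good":
            worst = max(worst, gap)
            gap = FRAME_INTERVAL_MS
        else:                       # Runt: takes up a slot, shows almost nothing
            gap += FRAME_INTERVAL_MS
    return max(worst, gap)

clustered = ["Good", "Good", "Runt", "Runt", "Good"]   # runts back to back
scattered = ["Good", "Good", "Good", "Runt", "Good"]   # the occasional runt

for name, seq in (("clustered", clustered), ("scattered", scattered)):
    useful = seq.count("Good") / len(seq)
    print(f"{name}: {useful:.0%} useful frames, worst gap {worst_visible_gap(seq):.1f} ms")

So the occasional duff frame is basically a non-issue; it's when the bad frames cluster that the worst-case gap gets long enough to matter.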
 
If you consider having a higher frame rate to be an advantage then yes, theoretically you're losing that advantage.

However I've never been a huge believer in the hype that having a higher frame count gives you a tactical advantage tbh.
 
However I've never been a huge believer in the hype that having a higher frame count gives you a tactical advantage tbh.

It is the same with all the stuff people buy to be "better".
Low-latency mice with 9001 DPI and so on. The funniest thing I have read lately is the new "Killer" network chip on the MSI Gaming mainboards. They claim to have lower latency on the connection and so on. I bet that nobody will ever be able to notice a 1 ms delay, EVER, no matter how MLG pro 1337 skilled they are.

To give an example: 2 or 3 years ago I had a standard PC which gave me about 80 FPS in CSS, an MX518 mouse, a €20 standard keyboard and a far, far too slow internet connection, and I still managed to play competitive ESL CSS in the EPS. We won tournaments and shit. There is no need to spend the extra money on stuff to get the "bonus". Just get the skill and you will destroy everyone.
 
TBH "Killer" cards have been pointless for ages now. They used to work quite well and gain a few FPS here and there but since Windows 7 they've been pretty pointless. I recall reading about why and remember the 'stack' being mentioned?

So I guess their idea now is to work with motherboard OEMs and just put it on there. Obviously they're not making the money they used to, because if we compare, say, the MSI G45 to the MSI G45 'gamer' or whatever it's called, you only pay a £35 premium for both the Killer NIC chip and the premium sound card.

But yeah, it's amazing what kinds of snake oil products have fooled people over the years.
 