I haven't understood the whole minimum FPS thing, and especially the 0.1% lows; to be honest it might as well be written in Chinese or Russian. I have no idea what I'm actually looking at, so I've mostly just looked at the average FPS, even though that's probably the least interesting number of the lot.
The average is the part you should mostly ignore. It's just one number smoothed over the whole run, and it hides the thing that actually matters: the difference between your maximum and minimum FPS, and how often you swing between them. That swing is what defines a smooth gaming experience; a 50 FPS drop is what you will actually see. Steady wins the race, so don't put much weight on the average results.
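Since you asked what the 1% / 0.1% lows actually are: they're basically "how bad are your worst frames". Roughly, tools sort all the recorded frame times, take the slowest 1% (or 0.1%) of them, and report that as an FPS figure. Here's a rough sketch of the idea in Python; exact definitions vary between benchmarking tools, so treat this as an illustration rather than any specific tool's method:

```python
# Illustrative sketch of how "1% low" / "0.1% low" style numbers are commonly
# derived from a frame-time log. Exact definitions differ between tools, so
# this is the general idea, not any particular tool's implementation.

def fps_stats(frame_times_ms):
    """frame_times_ms: list of per-frame render times in milliseconds."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)        # average FPS over the run

    # Sort slowest-first: the worst frames sit at the front of this list.
    slowest_first = sorted(frame_times_ms, reverse=True)

    def low_metric(fraction):
        # Average FPS of the slowest `fraction` of all frames
        # (one common way the "1% low" / "0.1% low" figures are built).
        count = max(1, int(n * fraction))
        worst = slowest_first[:count]
        return 1000.0 * count / sum(worst)

    return {
        "avg_fps": avg_fps,
        "1%_low": low_metric(0.01),
        "0.1%_low": low_metric(0.001),
        "min_fps": 1000.0 / max(frame_times_ms),
        "max_fps": 1000.0 / min(frame_times_ms),
    }
```

So a low 0.1% figure means your very worst frames are arriving much later than the rest, which is exactly the stutter you feel even when the average looks great.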
For example, let's say I have a max FPS of 230 in PUBG and my minimum is, say, 90. Believe me, if it swings between the two you are going to see it, no matter how good adaptive sync technology is. The average just flattens all of that into a single number, so on its own it tells you almost nothing about how the game actually feels.
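If it helps, here's a toy comparison showing how the average hides that swing. The numbers are made up purely to mirror the 90/230 example above:

```python
# Toy illustration: two runs with similar average FPS but very different swings.

steady   = [7.0] * 1000                      # ~143 FPS every single frame
swinging = ([1000 / 230] * 500 +             # half the frames at ~230 FPS...
            [1000 / 90] * 500)               # ...half at ~90 FPS

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

print(f"steady   avg: {avg_fps(steady):6.1f} FPS, worst frame: {1000/max(steady):.0f} FPS")
print(f"swinging avg: {avg_fps(swinging):6.1f} FPS, worst frame: {1000/max(swinging):.0f} FPS")

# The averages land in the same ballpark (~143 vs ~129 FPS), but one run never
# drops below ~143 and the other spends half its frames at 90. Only the lows
# and minimums show that difference.
```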
Oh, and 1% and 0.1% lows have only been a thing for about ten years. That is how AMD managed to rip so many people off! When you ran Crossfire you could only generate so much information with the tools you had, which at that time meant FRAPS. The problem? It could read the minimum FPS (so could I, if I had a counter and stared at it) and the maximum, which again you could see for yourself. The issue was what it was hiding: frame time, i.e. the time it takes to deliver each frame, and whether that frame was a whole frame, a partial frame (with some of it dropped), or a runt frame (which got counted even though you never actually saw it).
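To make "frame time" concrete: it's just the inverse of the instantaneous FPS, so 230 FPS works out to about 4.3 ms per frame and 90 FPS to about 11 ms. Quick sketch below; the "stutter" check at the end uses an arbitrary threshold of my own for illustration, it is not how FCAT actually classifies runt or dropped frames (FCAT does that by analysing the captured video output):

```python
# FPS <-> frame time conversions, plus a naive spike check for illustration.

def frame_time_ms(fps):
    return 1000.0 / fps

print(frame_time_ms(230))   # ~4.3 ms per frame
print(frame_time_ms(90))    # ~11.1 ms per frame

def likely_stutters(frame_times_ms, factor=2.0):
    """Flag frames that took far longer than the typical frame in the run.
    Purely a sketch with an arbitrary 2x-median threshold."""
    typical = sorted(frame_times_ms)[len(frame_times_ms) // 2]   # median
    return [i for i, t in enumerate(frame_times_ms) if t > factor * typical]

print(likely_stutters([4.3, 4.5, 4.4, 25.0, 4.6]))   # -> [3]
```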
It was only when Ryan at PCPer (who now works in Intel's GPU department) used a tool called FCAT that AMD basically got outed. At that point AMD were forced to, you know, make Crossfire actually work on your actual computer without being total chit. And they did, but the real minimums and the new frame-time data showed it in a much poorer light than the maximum FPS ever had. Those "fake" runt frames it was delivering had been fooling people, and whenever it delivered a partial frame (or hit a bad 1% low) you would see a stutter, absolutely every single time. Yet people refused to believe it at first, and there was a bit of a witch hunt.
I had used SLi for some time at that stage, then went over to Crossfire, and I absolutely hated it. That was back when people could still use the excuse of "No stutters here!" and "It must just be you" and so on, because there was no scientific data to back it up. And usually multi-GPU systems were being run by posers who, TBH, were probably running benchmarks all day and never even noticed what a god-awful mess it was. On enthusiast forums there were far more of them than people who actually had a rotten gaming experience.
It never EVER happened in SLi, because Nvidia actually spent a lot of money on that.
Oh, and BTW dude, the minimum FPS is the most critical part, along with the gap between it and the maximum. The higher you can keep the minimum, the smaller that gap, and the less you will feel it when the FPS suddenly drops. Like, imagine standing all the way up straight and then dropping your body by 50 percent, compared to just ducking down a tiny bit: the bigger the drop, the more you are going to feel it.
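Rough numbers for that analogy, in frame-time terms (my own toy figures, keeping the 230 FPS maximum from the example above):

```python
# How big the "drop" feels depends on how high the minimum is, because what you
# perceive is the jump in frame delivery time, not the FPS number itself.

def frame_time_ms(fps):
    return 1000.0 / fps

for min_fps in (90, 140, 190):
    jump = frame_time_ms(min_fps) - frame_time_ms(230)
    print(f"230 FPS -> {min_fps:3d} FPS: frame time jumps by {jump:4.1f} ms")

# 230 -> 90 is roughly a +6.8 ms jump per frame; 230 -> 190 is under +1 ms.
# Keep the minimum high and the jump stays small, which is why it feels smoother.
```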