I wasn't looking at the scores; I was more interested in the clock drop and the higher power consumption.
I still don't understand exactly why a card with 1710MHz on the box is boosting 250MHz higher, though. Unless someone somewhere was trying to inflate early review scores?
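If anyone wants to see this on their own card, here's a rough sketch (mine, not from any review or article) of how you could log the actual graphics clock and power draw against the rated boost using nvidia-smi. The 1710MHz constant, the function name, and the sample count are just placeholders for illustration.

```python
import subprocess
import time

RATED_BOOST_MHZ = 1710  # the boost clock printed on the box


def log_clocks(samples=10, interval=1.0):
    """Poll nvidia-smi and show how far the graphics clock sits above the rated boost."""
    for _ in range(samples):
        out = subprocess.check_output(
            ["nvidia-smi", "-i", "0",  # first GPU only
             "--query-gpu=clocks.gr,power.draw",
             "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        clock_str, power_str = [field.strip() for field in out.split(",")]
        clock = int(clock_str)            # graphics clock in MHz
        power = float(power_str)          # board power draw in W
        over = clock - RATED_BOOST_MHZ    # how far above the box spec it is
        print(f"{clock} MHz ({over:+d} MHz vs rated), {power:.1f} W")
        time.sleep(interval)


if __name__ == "__main__":
    log_clocks()
```

Run it while a game or benchmark is going and you'll see both numbers people have been talking about: the clock sitting well above the box figure, and the power draw that goes with it.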
Either way, the fix is what I expected. If these cards are going to boost, they are going to have 5600 XT issues, i.e. not all of them will reach the full boost clock.
I'm kinda on the fence about this whole automatic overclocking thing anyway. I can see the good points from both sides: without it, those who haven't got a clue what they are doing leave performance on the table, while others complain they would rather do it themselves. I think both sides have valid points.
However, it's not a good idea to launch a card that you then have to dial back afterwards, even by 1%, to make it work properly. In that case they should all boost to 1710MHz, all be stable, look a little worse in benchmarks, and just be released for what they are. Instead they are altering the TGP of the card to make it stable at a clock it apparently shouldn't really be at.
It all reminds me of Vega. A lot. People had to manually undervolt their cards to stop them cooking themselves. I was an opponent of that too, and agreed with the sentiment that the card should have been released for what it was (i.e. a slower card that didn't behave terribly) and then people could overclock it themselves.
https://wccftech.com/nvidia-rtx-30-...e-curious-case-of-missing-gaming-performance/
A very interesting article, which could explain why Ampere seems so poor for what it is. It may also explain the terrible lack of performance scaling between the 3080 and 3090.
Hopefully it will improve.