Quick News

Interestingly, Tom put the bad overclocking down to early drivers. Now it seems maybe it wasn't only down to that.

Possibly a tinfoil-hat theory, but maybe they knew and didn't want reviewers pushing the cards until they crashed.

I can tell you that at least one AIB told Tom that any instability was down to drivers.

TBH, it will take a while for the full truth of this situation to come to light.
 
I don't think there is much more info needed. The cards do what the box states perfectly. People are installing tools to overclock the cards and then whining because they crash.

People can blame power filtering and so on all they like, but TBH if it were me I would not allow a return. Even though Jay was putting the blame in the wrong place, his conclusion was correct: if your card can boost to 1700-odd, or whatever the box states, then you have the card you paid for.

So in this instance? I will defend Nvidia totally. The bottom line is that underclocking the card fixes the issue, and so long as the card meets its stated spec there is nothing you can do about it.

I say this as an "unlucky lotto user", i.e. I have had some real turds in the past. I bought a brand new Palit GTX 670 Jetstream and it would not do even 50MHz over the stated clocks. I didn't get mad; I just removed the overclock.

But many will abuse this situation and send cards back for a "free bin" (a free re-roll of the silicon lottery), which IMO is total bull; i.e. they will abuse the DSRs (Distance Selling Regulations) in the UK to get a better-clocking card.

Once again it comes down to expectations. People expect too much.

If you bought a 3080 for overclocking in the first place you were an idiot.

BTW, if those who bought a 3080 that won't hold 1900+MHz feel hard done by, they would have been welcome to try the pile of crap Vega 64 I had. It wouldn't even hold stock boost without pink-screening.
 
If it seemed I was criticizing Tom then I apologize; it wasn't meant that way. He can only go on what he's given and what he's told the issue is.

My only point was that it may not have been a driver-only issue.
 
I don't blame Tom one bit either. I totally believe what he was told.

It is what it is. People expecting huge overclocks are asking for too much; that's always been my standpoint. If I want a good product I don't mind paying for one (see also the Kingpin card, pretty much guaranteed to do 2150). Mostly because I CBA buying cards, sending them back like a d1ck for no reason, and then buying more to find the best clocker.

Overclocking was never EVER guaranteed. Ever. People are just getting totally unrealistic now.

I did raise a decent point earlier about my 2070. I knew it would be limited in clocks and I was not surprised. I didn't start whining or send it back though.
 
OK, so having spent the past few hours reading into this and checking out posts, here is what I found.

Somehow, for some reason, cards are boosting much higher than they should and crashing. There have been quite a few threads on OCUK about it now. One guy has a Ventus, and apparently it was clocking itself to 2050 and crashing. It should not be doing that with nothing installed.

TBH that is a very high clock for Ampere, so the card is not working as it should. By rights it should be clocking to around 1700 boost unless forced higher, but it is not. In the same thread the fix was simple: he pulled the boost back by 50MHz (so 2GHz) and the card was perfectly stable, and even benched higher than before. I.e. it was likely over-boosting, running into power limits and downclocking under those circumstances.

This is a BIOS issue, no doubt. I would imagine they used the FE BIOS on their cards, and for some reason it is acting oddly.
 
I wasn't reading your posts as something against Tom. I'm just pointing out that AIBs are just as confused as consumers are in a lot of cases.

Nobody would knowingly release faulty products if they could avoid it.
 
I don't think anything is faulty, fella, apart from maybe the firmware on the cards.

See, at first I thought these cards were being overclocked, but it seems they are doing this without any power or thermal limit being raised; i.e. they are overclocking themselves into instability.

Once people reduce the clocks? The problem goes away.

Now I am a little suspicious about this, I will admit. Was this deliberate by Nvidia, to make the 3080 look as good as possible?

OK, look, before I upset anyone let me explain that. Remember those motherboards (Asus, IIRC?) that were enabling PBO etc. by default, and thus were putting out higher numbers than other boards, making them look better?

I hope that is not what Nvidia have done to make the 3080 look faster than it is capable of being: they advertise a 1710 boost clock, everyone runs the cards "stock", but really they are boosting more than 200MHz over that figure. So the reviews go out, and then you derp the cards down by 200MHz.

I really hope that isn't what is going on and that this is a genuine error. It's not about capacitors or anything, though; I am convinced of that, because a Ventus should not be hitting 2050MHz "stock" and crashing.
 
Power issue or driver/firmware issue, it's not bothering me personally, because I'm skipping these cards unless I have a failure in another PC: either of my machines, my dad's machine, my grandad's machine or my Plex server.

For those people who have them, I think it will be resolved eventually, but how I am not sure.

This really does remind me of the launch of the 2000 series cards.
 
Not at all convinced the tantalum caps were meant as an upgrade. Their properties are objectively worse for this use case: MLCC significantly outperforms tantalum for high-frequency filtering, and MLCCs are in a different league when it comes to the lowest ESR and impedance values. We're decades past the age of MLCC being the cheap and dirty option.

The only things that ever really stood out as beneficial about tantalum capacitors were their capacitance relative to their size, and their durability compared with traditional electrolytics. Their ESR isn't all that bad, but it is never going to compare favourably with an array of capacitors in parallel: the capacitance adds and the ESR divides (the 1/R values add, technically), so it's win/win. One tantalum @ 0.125R vs ten MLCCs @ 0.005R each = 0.0005R total. Quite the difference, though it's difficult to say what part that plays in all of this.

Based on the values shown, it's a bit baffling that some cards have tantalums rated at 220uF, some 330uF, others 470uF, and some a mix of values.

Cost-wise, say the 470uF tantalum costs about £0.95 per unit on a full reel of 1,000 and you use 6 per card, versus going all-out with MLCC parts that cost £0.20 per unit on reels of 4,000 and using 60 per card:

MLCC: 60 × £0.20 = £12.00 per card
Tantalum: 6 × £0.95 = £5.70 per card

Obviously a manufacturer will be dealing in vastly larger numbers, and possibly directly with the makers of the parts, which will further reduce those costs. There is also the pick-and-place time factor: placing 60 parts vs 6 takes longer, which again costs more money.
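
To make the arithmetic above explicit, here's a quick back-of-envelope sketch using the same illustrative ESR and reel-price figures quoted in this post (example numbers, not measured values):

```cpp
#include <cstdio>

int main() {
    // ESR of identical capacitors in parallel: the 1/R values add, so the
    // total divides by the count. Capacitance in parallel simply adds.
    const double tantalumEsrOhms = 0.125 / 1.0;   // one tantalum
    const double mlccEsrOhms     = 0.005 / 10.0;  // ten MLCCs -> 0.0005 R

    // Rough per-card BOM cost at the reel prices quoted above.
    const double tantalumCostGbp = 6 * 0.95;      // 6 tantalums -> 5.70
    const double mlccCostGbp     = 60 * 0.20;     // 60 MLCCs    -> 12.00

    std::printf("Tantalum array: %.4f ohm ESR, GBP %.2f per card\n",
                tantalumEsrOhms, tantalumCostGbp);
    std::printf("MLCC array:     %.4f ohm ESR, GBP %.2f per card\n",
                mlccEsrOhms, mlccCostGbp);
    return 0;
}
```

So on these example prices the all-MLCC arrangement costs roughly twice as much per card, before even counting the extra pick-and-place time.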
 
[Attached screenshot: benchmark run showing score, clocks and power draw before and after the new driver]

The 3080 is apparently fixed in the new driver: clocks reduced and power increased by 10W.
 
Those scores are pretty much within margin of error for that test, so I wouldn't draw conclusions from them. But I wouldn't be surprised if they just enforced the ~-50MHz clock at the driver level, since so many people reported that underclocking fixed the issue.
 
I wasn't looking at the scores; I was more interested in the clock drop and the higher power consumption.

I still don't understand exactly why a card with 1710MHz on the box is boosting 250MHz higher, though. Unless someone somewhere was trying to inflate early review scores?

Either way, the fix is what I expected. If these cards are going to boost like this, they are going to have 5600 XT issues, i.e. not all of them will do the full boost.

I'm kinda on the fence about this whole automatic overclocking thing anyway. I can see the good points from both sides: without it, those who haven't got a clue what they are doing leave performance on the table, while others complain they would rather do it themselves. I think both sides have valid points.

However, it's not a good idea to launch a card that you then have to derp down afterwards, even by 1%, to make it work properly. They should boost to 1710 in that case: all of them stable, looking a little worse in benchmarks, and released for what they are. Instead they are now altering the TGP of the card to make it stable at a clock it apparently shouldn't really be running.

It all reminds me of Vega. A lot. There, people had to manually undervolt their cards to stop them cooking themselves. I was an opponent of that too, and agreed with the sentiment that the card should have been released for what it was (i.e. a slower card that didn't behave terribly), letting people overclock it from there.

https://wccftech.com/nvidia-rtx-30-...e-curious-case-of-missing-gaming-performance/

A very interesting article, which could explain why Ampere seems so poor for what it is. It may also explain the terrible lack of performance scaling between the 3080 and 3090.

Hopefully it will improve.
 
I'm mainly looking at the score since it's the most controlled metric, as the methodology is dictated by Futuremark. Who knows what kind of errors the other measurements have.

The clock values fluctuate wildly with case temperature, and wattage is also affected by environmental factors. Not to say the score isn't, but less so than the others. I wouldn't read too much into those results before someone does more robust testing. It's entirely possible that they just raised the voltage and dropped the clocks, but it also seems that people are picking the tweets they want to read and then propagating that information.
 
As I said, I don't care about the results of the performance metrics. I really don't, and that is me being fair. Like when the whole power-gate thing happened, I was soon pretty convinced there was more to it than the "bad" phases Jay blamed.

It's simply that the cores are at their limit, so some of them crash when stuffed to nearly 2GHz when the box says 1700-odd MHz.

And they are still boosting to 1900-odd even after the fix. So yeah, as with most of this launch, I am more interested in the actual tech and science than in the gaming results.

I would have thought the fix should be to make the card more honest, not to shove more voltage into it. Because unless they increase the TGP officially, and on the boxes, they are going to come unstuck like they did with 3.5GB-gate.
 
They are boosting above 1710MHz because of GPU Boost; all cards do it, from the 900 series through to the 3000 series.

The 2080 Ti you have will do exactly the same thing: if there is spare power and temperature headroom, it will boost.

It's got nothing to do with reviews; it's all about the card boosting to what it's capable of. Otherwise it would just be manual clocks, and those would reach around the same mark anyway.
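
For illustration only, here's a toy sketch of the kind of opportunistic clocking being described: step the clock up while power and temperature have headroom, and back off toward the rated boost when a limit is hit. The limits, step size and numbers are made up for the example; this is not Nvidia's actual algorithm:

```cpp
#include <algorithm>

// Illustrative numbers only (not Nvidia's real limits or bin sizes).
constexpr int kRatedBoostMHz = 1710;  // the figure on the box
constexpr int kStepMHz       = 15;    // boost moves in small bins
constexpr int kPowerLimitW   = 320;
constexpr int kTempLimitC    = 83;

// Pick the next core clock given the current telemetry.
int nextClockMHz(int currentMHz, int powerW, int tempC) {
    if (powerW < kPowerLimitW && tempC < kTempLimitC)
        return currentMHz + kStepMHz;  // headroom: opportunistically boost higher
    // A limit was hit: back off, but never below the advertised boost clock.
    return std::max(currentMHz - kStepMHz, kRatedBoostMHz);
}
```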
 
No, my 2080 Ti doesn't do the same thing. Not unless I install something like EVGA Precision, run a boost scan, get it to apply the overclock and then tell it to start with Windows. Without that? It boosts to its factory boost clock of 1770MHz.

Well, that is how it was the last time I did a clean install, anyway. It seems that behaviour has been added to the drivers at some point.

It didn't happen on my Titan XP either. I just let that thing loose on the air cooler and it would boost short-term to 1960-odd, then drop back to 1700-odd once the card got too hot and the fans reached 80%. That was why I started water-cooling my GPUs.

I think you would find it has a lot to do with reviews. If it only hit 1710MHz, with no overclocking or anything else inflating the clock speed, it would have been much slower: maybe 20% tops over the 2080 Ti. Instead it is up to 35%, and now we know why.

Edit, to clarify that: if the "guaranteed or RMA" boost clock is 1710, then that is what the cards should be reviewed at, in case you get a dog like I have in the past. A review should be based on what every single card is capable of, not a cherry-picked few.

The fact that Jay got one of these cards and they told him not to review it and to wait for a replacement? Yeah, that's pretty shoddy. "No, just wait for us to send you one that can boost."
 
This is Quick News. Just start a thread about it; this isn't the place for lengthy discussion, guys. It makes it harder to sift through posts to find something interesting.
 
Adrenalin 2020 Edition 20.9.2

https://www.amd.com/en/support/grap...d-radeon-rx-5700-series/amd-radeon-rx-5700-xt

Support For


  • STAR WARS™: Squadrons :rock:
Added Vulkan™ Support


  • VK_KHR_buffer_device_address
    • This extension is used to query the device address of a buffer to allow for shader access to that buffer’s storage via the SPV_KHR_physical_storage_buffer SPIRV extension (see the sketch after this list).
  • VK_EXT_robustness2
    • This extension provides stricter restrictions for handling reads and writes that are out of bounds. It specifies that out-of-bounds reads must return zeros and out-of-bounds writes must be discarded. This extension also adds support for null descriptors.
  • VK_EXT_shader_image_atomic_int64
    • This extension extends existing 64-bit integer atomic support to images, which provides more efficient access than buffers. This allows applications to quickly improve their performance with minor changes to their code.
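
As an aside, here's a minimal sketch of what the first of those extensions enables: querying a buffer's GPU address so a shader can access its storage. It assumes a VkDevice and a VkBuffer created elsewhere (with VK_BUFFER_USAGE_SHADER_DEVICE_ADDRESS_BIT and the bufferDeviceAddress feature enabled); it's an illustration, not code from the release notes:

```cpp
#include <vulkan/vulkan.h>

// Query the device address of 'buffer'. This entry point is core in
// Vulkan 1.2; when relying on the VK_KHR_buffer_device_address extension
// instead, load and call vkGetBufferDeviceAddressKHR via vkGetDeviceProcAddr.
VkDeviceAddress queryBufferAddress(VkDevice device, VkBuffer buffer) {
    VkBufferDeviceAddressInfo info{};
    info.sType  = VK_STRUCTURE_TYPE_BUFFER_DEVICE_ADDRESS_INFO;
    info.buffer = buffer;
    VkDeviceAddress addr = vkGetBufferDeviceAddress(device, &info);
    // 'addr' can be handed to shaders (e.g. via a push constant) and
    // dereferenced through the SPV_KHR_physical_storage_buffer extension.
    return addr;
}
```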
Fixed Issues


  • Some games may exhibit stutter intermittently during gameplay on Radeon RX 5000 series graphics products.
  • Radeon FreeSync may fail to enable after updating Radeon Software without a system reboot.
  • Screen flickering may be observed while MSI Afterburner™ is running or enabled on the system.
  • X-Plane 11™ may experience an application hang or crash when using the Vulkan® API.
  • DOOM™ VFR may experience corruption or artifacting in game on Radeon RX 5000 series system configurations.
  • Performance metrics overlay may fail to open or appear after the system wakes from sleep.
  • Call of Duty®: WWII may experience black textures on the ground or walls in zombies game mode.
  • Blocky corruption may be observed in Detroit: Become Human™ on some Radeon RX 5000 series graphics products.
  • Using the Movies&TV application to edit video clips may result in green corruption in the clips.
  • Performance metrics may report incorrect values for current VRAM usage after an extended period of gameplay.
  • With HDR enabled, Windows® desktop may experience flickering, and performing a task switch while in a game may cause colors to become washed out or over saturated.
  • World of Warcraft™ may experience corruption issues with anti-aliasing enabled on DirectX®12 API.
  • Launching Radeon Software after a driver upgrade may cause the Auto OC dialogue to appear with “0 MHz” when the Auto OC feature has been previously enabled on Radeon RX Vega series graphics products.
Known Issues


  • Enhanced Sync may cause a black screen to occur when enabled on some games and system configurations. Any users who may be experiencing issues with Enhanced Sync enabled should disable it as a temporary workaround.
  • Performance Metrics Overlay and the Performance Tuning tab incorrectly report higher than expected idle clock speeds on Radeon RX 5700 series graphics products. Performance and power consumption are not impacted by this incorrect reporting.
  • Audio may experience instability when connected through an Audio Video Receiver via HDMI® on Radeon RX 5000 series graphics products.
  • Modifying the HDMI Scaling slider may cause FPS to become locked to 30.


The fact that the known issues section is getting smaller and smaller with each release is great. I hope this is a good sign for the 6000 series drivers.
 