Quick News

What I found worrying in JZ's video was Colorful saying they will send him a different card. Does that mean they will remove all the ones like he had? (No.) Or will they send him a modded one, so what you buy won't match what he reviews? More likely they'll just send him a more expensive SKU.
One of those times the Asus tax could be a value-for-money tax

Quite possible that all AIBs will switch to using at least one quality capacitor array from here on. We know stock is empty, so they have minimal recalls/RMAs to fulfil.

From the look of it, design-wise it's an easy swap, so the next flood of cards that hit the market may have this covered.
 

As JZ said, they are advertised at a 1710 MHz boost clock, so as long as your card can do that you have no grounds to RMA. They could send out a firmware that locks the boost at 1790 MHz and they've sorted the problem.
I think I'd go with JZ: it's cheaper to fix firmware than do a retool. One of those cases where you really are going to get what you pay for: cheap card, low boost clock
 
To be fair, this change wouldn't need a proper retool. This isn't a change to the PCB design itself; it's just a change to the SMD stencil (which makes sure the solder goes on the right pads) and obviously the few parts (you just reprogram the pick-and-place tools for the new placements). A new stencil is essentially a trivial cost for companies this big, and the BOM change isn't going to be more than a fiver for a handful more SMD caps.

I'd assume they'd be on it more or less as soon as the problem was identified. It looks like many of them noticed it as soon as they got access to the full drivers, going by how some already have responses out.

If they did send Jay a custom unit, it would be fairly easy to tell: hand-soldering tiny SMD parts neatly is basically impossible, so it's very easy to tell apart from a machine job. And if they've got a machine set up to do it, they will have already done the retooling work.
 
With so many AIBs supposedly facing issues, it seems more probable that there was a problem with Nvidia's specifications.

Combine that with a reportedly short timeframe for testing, and Nvidia was asking for this.

It looks like the RTX 30 series may have been pushed out the door. It makes sense: new consoles and RDNA 2 are on Nvidia's doorstep.
 

I don't buy Asus. Well, very very rarely. However, I also don't buy cheap entry-level crap cards. Never have and never will. Same way I have never run a cheap PSU and thus have never had one blow up.

The reality is going to be that the £649 card was a pipe dream. Which I also said many times before they actually launched. The revised ones with more on them are going to cost more.

I don't even blame the poor sods trying to earn a living out of this either. Had Nvidia not set them an impossible task this wouldn't have happened.
 

Not really. Nvidia will get more stock in and continue to sell the 3080 at £649 on their own store. The issue is the AIBs cutting corners to increase their profit margins. Aftermarket cards being £699 is not an issue as long as they perform and have better cooling, power delivery/filtering etc.
 

Nvidia? lol. They never intended it to be a £649 card. That was done purely because they know what is coming this year. Those Founders cards? there is no way they are making their 60% on those dude. Never, not in a million years. Let's wait and see how long it is before any old user can go and buy one at his leisure.

The AIBs are cutting corners because Nvidia sold a promise they could not meet. Every single one of them will be thinking "s**t, we need to try and make a card at that £650 mark so we can sell". Remember, it's an incredibly competitive market and only the big fish can charge lots (like Asus and EVGA) and get away with it. The smaller fry, like Colorful etc.? They have to make a profit.

I don't blame them. At all. They are only trying to survive (if you saw the figures like I have from my pal in Taiwan you would understand). I blame the poor node, the ridiculous power use, etc.

Fact is this core needs tantalum caps. And they are expensive. Analogue caps are not going to cut it. The funny part is analogue caps worked perfectly well on Turing. Funny that isn't it?

Here's another chance for me to be wrong. Get your humble pie ready, as my neck is going out on the chopping block AGAIN.

The 3070 has been made solely based on the spec of the upcoming consoles. The raw TFLOPS of the XBSX that we know of are about the same as a 2080 Super. The console is set to cost $500 or £450 depending on where you live. How much is the 3070 "set to cost" again? you guessed it, about the same.

Firstly, you try getting one for that price and secondly, see if it performs about the same as a 2080 Super like the consoles are touted to.
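For what it's worth, the raw-TFLOPS claim is easy to sanity-check on paper. A minimal sketch; the shader counts and boost clocks below are the commonly quoted public figures, used here as ballpark assumptions rather than anything from this thread:

```python
# Rough FP32 TFLOPS estimate: shaders x 2 ops/clock (FMA) x clock (GHz).
# Shader counts and clocks are commonly quoted public figures,
# treated here as ballpark assumptions.
def tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * 2 * boost_ghz / 1000.0

xbsx = tflops(3328, 1.825)            # Xbox Series X: 52 CUs x 64 lanes
rtx_2080_super = tflops(3072, 1.815)  # 2080 Super at its rated boost

print(f"XBSX:       {xbsx:.1f} TFLOPS")
print(f"2080 Super: {rtx_2080_super:.1f} TFLOPS")
```

Both land within about a TFLOP of each other, which is all the "about the same as a 2080 Super" comparison needs.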

In reality if you want a 3070 that works you are looking at £500+ and for a good one £600.

If you walked into a shop having seen a TV online for £400 and all of a sudden the sales rep told you the TV was £500, what would you do? you would cuss him out and walk away in disgust, right?

So, pray tell, why is it that Nvidia can pull this s**t day and night and people fall for it hook line and sinker?

If I ordered my shopping from Waitrose and they billed me 30% more on the day it arrives I would go ballistic. Yet Nvidia do this stuff 24/7 and people just bend over and take it.

I don't know what upsets me more: their behaviour, or seeing people I have a modicum of respect for falling for their BS.
 

As I mentioned, so far the AIBs are not pricing out of control in Norway. Taking into account the poor performance of the krone, it's realistically in the realms of the RRP. But let's see what supply and demand does.

I see people selling their 3080s on finn.no (Craigslist equivalent) for about £400 more :S and they are posting names etc. to prove they have them in their possession.
 
It will be interesting to see where people stand who bought off eBay if they do a recall, though I'm still betting on a firmware boost limit even on new cards.
 
OK so I just watched the video.

Basically then they are skimping on power delivery. That's about the upshot of it.

However, they did that on Turing and it came at no real cost. I mean sure, my single 8-pin 2070 was locked at 1950 MHz in the OC software, but that was because of the single 8-pin and so on. However, it didn't crash.

A few years ago a mate of mine who lives in Taiwan, and who at the time worked as a journo reporting back to the UK, sent me some profit and loss reports. MSI and Gigabyte were in big trouble at the time, and he predicted that a good few of them would go under. Mostly because they were making so little on Intel boards and the like, and the competition was so stiff, that they had to work to margins so tight it just wasn't feasible for them to stay in business.

A lot of this was down to brand power. You know, that Asus tax that Asus can charge and get away with because of the reputation they have built, while companies like MSI don't have that reputation and thus have to make cheaper boards. Cheaper in every way.

EVGA are another company that can enjoy this relative safety. Again because they have built a reputation but mainly just down to simple brand loyalty because they are American and Americans like buying stuff designed by Americans. EVGA is pretty tiny in the UK by comparison.

Smaller companies? yeah they are going to struggle quite badly.


Now that it seems to be more acceptable to be a little bit negative about Ampere (mainly thanks to Steve @ GN and a couple of others) I will once again talk about it.

Cheaper cards are going to have to be limited, yes. You can't double the cost of the components and still sell the cards for the same prices. It just doesn't work like that. My Dual OC was cheap, and it was cheap for a reason. It had one 8-pin, and was limited to a hard 1950 MHz in OC tools by the firmware. Bypassing it would have been a bad idea, as it would have likely cooked itself.

But that was OK because I knew that before I bought it. I could see the single 8 pin. I also knew that it may have analogue caps, but again that was fine. Turned out it did.

So the reality is? any of the cheaper 3080 cards will not be able to boost at their own leisure, and you will see lower clocks, and you will see much less raw performance out of those cards. Meaning every review you watched so far? means jack.

If you want that extra performance? prepare to pay extra for it. That is nothing new. However, understanding that before you blindly order a card is incredibly important.

Ampere as a design? fantastic. Ampere as a technology? brilliant. Ampere in reality because of Samsung?

I won't need to repeat what I have been saying for ages.

BTW I would just like to reiterate that I was no more negative about Turing than any other guy. I actually defended the pricing, because I knew it was expensive, and knew why, long before launch (i.e. putting loads back onto the die would come at a cost).

And, as much as I don't like Nvidia I still run three Turing cards. Which speaks for itself. So don't tell me I am being overly negative because I don't like Nvidia. I am being overly negative because Ampere is overly negative on Samsung. It's as simple as that and if you don't understand that? try doing some learning and reading before you get so mad at me.
 
Interestingly, Tom put the bad OC down to early drivers; now maybe it wasn't only because of that.

Possibly a tinfoil-hat situation: they knew, and didn't want reviewers pushing till it crashed.
 
It's pretty much a given that when you do a successful shrink you gain lower power consumption, which usually leads to a gain in perf per watt. Like I say, that is a given and it is why everyone shrinks. EDIT IN: note I said a successful shrink, not a Broadwell-style failure of a shrink.

Intel have not been able to achieve this. I mean, they have shrunk (it's a proven fact) but every time they do it's a failure.

https://www.scan.co.uk/products/int...ghz-turbo-64-gt-s-dmi-1100mhz-gpu-31x-ratio-6

There was their last attempt at a proper and true shrink. That chip barely reached 4 GHz. Now when you consider that Devil's Canyon on Haswell reached 4.9 GHz, and 5 GHz if you were lucky and/or delidded? that is a failure of epic proportions.

In reality 8nm Samsung is very much like 10nm TSMC, i.e. it has the same transistor density and so on. However, unlike TSMC there has been no clock gain *at all*.

In fact I could sit here and argue that there has actually been a huge clock drop. What I mean is, what I would like to see next is a 3080 limited to exactly the same TDP as a 2080 Ti (and not given 60 W extra or whatever it is), and then see how it performs.

Quite probably relatively awful.
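You can already ballpark that thought experiment on paper. A toy perf-per-watt comparison; the ~30% uplift and the board-power figures below are rough review-ballpark assumptions, not measurements:

```python
# Toy perf-per-watt comparison. Relative performance and board power
# are rough review-ballpark assumptions, not measurements.
def perf_per_watt(rel_perf: float, board_power_w: float) -> float:
    return rel_perf / board_power_w

rtx_2080_ti = perf_per_watt(1.00, 250)  # baseline at ~250 W
rtx_3080 = perf_per_watt(1.30, 320)     # ~30% faster, but at ~320 W

gain = rtx_3080 / rtx_2080_ti - 1
print(f"Perf/W gain over 2080 Ti: {gain:.1%}")
```

On those assumed numbers the efficiency gain is tiny: pin the 3080 to a 2080 Ti power budget and most of the headline uplift should evaporate, which is exactly the point.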

And see this is the stuff I have been saying before the cards even launched. Quite literally because it was true, and remains so.

LN2 and unlocked voltages tell the story here. Ampere simply does not clock as high as Turing. And that is because of a poor node.

That doesn't mean all Ampere cards are going to be terrible and failures etc. A well made card is still faster than a 2080Ti and costs less which is good, but a realistically priced one will come with severe caveats.

Caveats that people should have known about.

If Ampere had this year all to itself? fine. Only it doesn't. It has three big competitors.

So not only do we have a poor node we have a rushed release, loads of hype and basically utter crap and everything else. And people think I am just being negative? I can only be negative when there is negativity to draw upon.
 
BTW I will add this. My buddy got one of the crap phases Gigabyte cards and he has no issues with it at all.

I am beginning to wonder if it's not so much the phases but just terrible quality low bin dies from Samsung to blame.

He's been clocking balls on it all day (to make sure he didn't get burned) and it's been fine. It's one of the ones Jay took apart with the cheaper phases too.
 
Not quite sure what you mean by an 'analogue' capacitor? A capacitor is inherently analogue. Also if these cards are running so close to the limit that the bypass filter is causing issues then there are probably deeper problems.
 

By "analogue" I mean the round stand-up ones. I'm old fashioned :D

And yeah, I don't think this has anything to do with poor power delivery at all. Like I said I have seen plenty of Turing cards with savagely cheaper power delivery systems on and they got along fine.

The fact my mate's one clocks tits says it all to me. These are just poor dies. Which was totally expected, as we knew Nvidia would keep the good ones and pass the rest on to OEMs. So you either buy an FE to make sure it can hold its boost clocks, or you play the lottery on an AIB card.
 
Bad power filtering won't be much of an issue if you have very little ripple from the PSU, which maybe explains your mate's scenario. And while MLCC caps are cheaper per unit, you need a lot more of them, and you need more expensive ones if you want them to stay stable over temperature. I wouldn't say tantalum caps are really the more reliable option for this use case; their ESR is terrible, while modern automotive-grade X8-class MLCCs are practically as stable temperature-wise for cases like this. I don't think there's any doubt that using tantalum caps here is to cut down component count, thereby reducing assembly cost and possible points of failure.

But, the goal of these designers is to create something that works properly for the lowest cost possible, and without them having a way to check what the end behaviour was like they would have had to guess where the limits were, and they very slightly missed.
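The ESR point can be illustrated with a back-of-the-envelope impedance calculation. The capacitance and ESR values below are assumed datasheet ballparks, and ESL is ignored entirely, so this only shows the ESR gap, not real high-frequency behaviour:

```python
import math

# |Z| of a capacitor modelled as ESR in series with its reactance
# (ESL ignored). Component values are assumed datasheet ballparks.
def z_mag(c_farads: float, esr_ohms: float, f_hz: float) -> float:
    xc = 1.0 / (2 * math.pi * f_hz * c_farads)
    return math.hypot(esr_ohms, xc)

f = 1e6  # 1 MHz, in the VRM switching-noise region

# One 330 uF polymer/tantalum cap vs ten 47 uF MLCCs in parallel
# (capacitance adds, ESR divides by the number of parts).
poscap = z_mag(330e-6, 40e-3, f)
mlcc_bank = z_mag(10 * 47e-6, 5e-3 / 10, f)

print(f"Single POSCAP: {poscap * 1000:.2f} mOhm")
print(f"MLCC bank:     {mlcc_bank * 1000:.2f} mOhm")
```

At this frequency both reactances are well under a milliohm, so ESR dominates, and the MLCC bank sits well over an order of magnitude lower: it is the ESR, not the per-part price, that decides this use case.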
 
The FE has all MLCC caps on. Other companies actually tried putting better ones on, and that is why they are crashing.

And yeah, maybe that is a part of the reason but I bet this problem did not exist on Turing.

The more the days pass the more it just smells like Vega. Clocked to balls on launch to make up for the fact it is pretty poor. Like I said, I would like to see a 3080 running at the same power pegging as a 2080Ti and see how it performs.

It seems all Nvidia got out of this shrink was just cramming more onto a smaller die. No other improvements seem to be there at all. The clocks are around the same as the 2080Ti (about 2ghz) and the power consumption is crazy. If they are sensitive to power filtering then that's just another issue to add.

Oh well. I guess people can simply underclock their cards and wait for Hopper.
 
Not at all convinced the tantalum caps were meant as an upgrade. Their properties are objectively worse for this use case: MLCCs significantly outperform tantalum caps for high-frequency filtering, and they're in a different league when it comes to the lowest ESR and impedance values. We're decades past the age of MLCC usually being the cheap and dirty option.
 
Yup, leading me to think that they may have used a mixture based on what they had to hand.

In fairness to Nvidia here, if they sent them a design and the AIBs buggered about with it, then that's their lookout and will be their RMAs.

BTW I also agree with Jay for once. If the card does the boost clocks it states on the box, you are screwed: you can either return the card or just wait for a gimped firmware, but your RMA could be rejected. I know with the DSR here in the UK many will abuse that and bin cards themselves, but I can't see them pulling that in the USA, where they still have restocking fees.
 