GPU pricing is expected to drop in July

In January he was saying that the 1080 Ti was EOL and that there wasn't a shortage; Nvidia just stopped making them because no one was buying them. Oh, and Nvidia could instantly ramp up production anytime they wanted, and there wasn't a RAM shortage affecting GPU supply. Also that you couldn't buy a 1080 Ti anywhere unless it was drastically marked up. Even though I pointed out that I had just bought one very nearly at MSRP direct from EVGA with no difficulty. Pretty sure he also said "his guy" wasn't connected anymore as well at some point.

He doesn't respond to my posts anymore, so it may very well be me he was referring to. If so, that's a net win, in my book.

Are you talking to me? I assume you are.

Firstly, the EOL thing - many cards were. That is exactly how they were listed by Scan and others. For example (from memory) the Seahawk, or whatever it was called: no stock, no expected stock, nothing on the horizon.

Moving on to Nvidia and ramping up production: they can do that any time they want to. The very fact that they did, and that there are now too many cards, shows that is exactly what they did. I can't see how they made these cards without any VRAM either, so you'll have to forgive me on that one. Were Nvidia hesitant? Yes. Did they think long and hard before doing so? Absolutely. It took them around nine months to start. And, as it happens, it has backfired in exactly the way that I said it would. Now there are floods of cards no one is buying and they have a new launch on the horizon that, IMO, they delayed so they could make more Pascal cards to sell to miners. Had there been no miners they would have released "Ampere", or whatever they are calling it this week, *ages* ago. Why? Because they would have actually needed us, PC gamers, to buy their sh*t. Instead they just focused on a cash grab from miners and we've all been given a back seat.

As for what you paid for your EVGA card? Irrelevant. Well, to Asus anyway. However, I suspect EVGA played it even more safely than Asus, because they are much smaller in comparison. Also, you say "direct from EVGA", which is completely different to buying it from someone like Scan, or OCUK, or anyone else. Nvidia had retail-priced cards on their site... for about ten seconds at a time. Then you couldn't get them again for months.

As for me ignoring you? I probably did, but certainly not with the ignore function. I do have control over my hands, you know. Fact is I relay information for others to read, not sit and pick the peanuts out of every small lump of crap about it. I just cba doing that, so if you disagree? Fine. That is up to you. Will you get a reaction from me? Not usually, because I really don't care what you think. I am not being rude saying that; I genuinely couldn't give a crap. If I did, I would have stopped using the internet in 1999.

Added: I never hid who the source of my information was. The only reason I never linked to it is because he was affiliated in a big way with a website Tom doesn't like. I don't know the reasons, and tbh that is Tom's thing, but out of respect I never posted links here. Even if I had? Chances are they would have been deleted. Tom asked me not to nearly a decade ago, so I have no wish to defy him just to pee him off.

https://twitter.com/Bindibadgi?ref_src=twsrc^google|twcamp^serp|twgr^author

There you go: read, follow, do what you like. He no longer works for said website, or even has a forum account there (he's had it deleted). He has now gone completely freelance, doing the same thing he always did with the friends and contacts he has had for years. He still lives in Taipei.

Addendum.

Looking back I suspect there were a few things going on and a few factors at play. This is hindsight, btw.

I have a feeling Nvidia were all set to release "Ampere" or Volta back in March/April. Why? Well, why else would they have "leaked" the name of it, and some of the spec (like what sort of GDDR it was using)? Trust me when I say nothing, but nothing, gets out of Nvidia "unwanted". IE, these exciting "leaks" you hear about are usually "leaked" by the company that wants you to know about them. So you may find that Nvidia were planning to ditch Pascal and release something else all along, ages ago.

That goes some way to explaining exactly why Scan not only had no cards but were listing many as "EOL", even though some of them showed up again a few months later. Why would you shoot yourself in the foot in that way as a business? IE, you tell all of the customers who come to your site that you don't have what they are looking for, period, end of story, so there is a high chance they won't come back looking for it again.

What really happened? We will never know. One thing I have learned about PC component manufacturers is that they will do what they want to do, and when, then make up a BS back story to explain themselves. So they could release Volta tomorrow, then come up with some BS story about how that was the plan all along.

Going by history? Nothing about Pascal, Maxwell etc. was "leaked" until it was about to launch. Then it launched, usually around spring/early summer. This year that did not happen; we got the "leak" right on time, but they just didn't follow up with anything, aside from comments about how they love PC gamers etc.

Yeah, right.
 
Alien, but even if the mining demand didn't spring up, Nvidia still would have sold loads of Pascal cards and thus had less incentive to release the next generation. People were angry with mining because they couldn't buy any cards—Pascal cards, not next gen. But it sounds like you're suggesting Nvidia would have released next gen if the mining boom didn't occur. If that's true, why were so many angry they couldn't buy GPUs? They wanted Pascal GPUs. That tells me Pascal would have stuck around (because of Vega) irrespective of the mining craze. I think the delay until later this year is to get rid of excess stock, but I don't think Nvidia would have released Turing/Ampere when Volta was released. I don't see any reason to do that other than to keep the image of 'top dog' present with investors and enthusiasts. GDDR6 is still in its early production phases. To rush out a new generation when there is no competitive reason to would be foolish.
 
It's not stopped them before, dude. They've been ahead for years; it's not a recent thing. Everything they have made since the 780 Ti has been ahead. Intel were ahead for years too, and that didn't stop them. They need to push on or investors would be cautious.

Anyway, I am not responsible for something I didn't even say in the first place (shooting messengers and all that), but Rich has not been wrong in the decade I have known him.

Anyway, I am glad this second round of mining happened. It's shown me I was totally wasting my cash. It wasn't just the ridiculous GPU price increase (I jumped in about two weeks before it exploded); it's the whole gaming thing in general. PC gaming has stopped in its tracks since the first few DX12 titles (which looked barely any better than the DX11 titles... then what? DX12 has been a complete letdown). And you can add the RAM prices I didn't pay to that, and the SSD/NAND prices I didn't pay either.

I currently run 16GB of 2133 RAM. I can't tell you how many times I had a good chunk of cash to dispose of, yet 32GB of faster RAM would cost me around £350. LOL, are they srs?

I'm glad. This final and absolutely massive slap in the face has done me a right favour. I paid £350 for a console that is every bit as fun to use as my £godknowshowmuch PC.

Big games this year? They won't make my XP sweat. At all, as we have already had games using the same bloody engines. So we know what to expect for the next entire year, apart from Metro, which I will believe when I see it.
 

Oh, no, it's not in relation to that chap who questioned you. It's more that I was curious about a point you made. No harm meant at all. :)
 
DX12 has no bearing on the visual quality of games. It's just for performance. Visual quality is up to the devs.

Yup, and as I have found out, it is quite severely limited because they are developing for the consoles. So any shiny bells and whistles need adding on.
 
It doesn't really have much to do with consoles. They have to use several different codebases, as each platform is either DX11/12 or something else; PlayStation runs a BSD-based OS with its own custom API.

So really it's little to do with that. The problem is DX12 needs a lot of extra work, and even companies exclusively on PC are not investing in it because of the workload and the fact MS limits it to Windows 10. It's not worth investing so much time and money when the TAM is quite small. The ROI is too risky for gaming companies who are exclusive to PC.
 
Whether the new cards come or not, the price Nvidia will demand does not justify it. I would say a second-hand 1070 is a great buy for the majority. Not talking about the few hardcore enthusiasts.

I myself have a 1080 Ti and an 8700K, and I am looking into pairing my 7600K with a second-hand 1070 ITX card. Common sense tells me that it's best to skip this round of GPUs.

2019/20 will be a good year to upgrade from a 1080 Ti if you want to game at ultra.
 

My last GPU before I bought the Titan XP was a Titan X (Maxwell). I paid £380 for it about a week before the 1070 and 1080 came out. It was dead even with a 1070, and retailers are still asking what I paid or more for the same card over two years later. lmao. Really?

Pascal is old hat now in technology terms. I certainly wouldn't pay anything NEAR what Nvidia and retailers want for them.

As technology gets older it reduces in price. Then you are supposed to get more for that money when a new tech comes along. Not be paying the same base price for the same base performance you had before.

THAT is my gripe with Pascal. When it first came out the equivalent Maxwell card (so 1070 vs the 980Ti for example) cost the friggin same. And here we are over two years later paying.... The same.
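That gripe boils down to performance-per-pound staying flat across a generation. A throwaway sketch with made-up numbers (the prices and performance index here are hypothetical, not benchmark data) shows what "paying the same for the same performance" looks like:

```python
# Illustrative only: hypothetical street prices (GBP) and a performance
# index (980 Ti = 100). A new generation is supposed to raise perf-per-pound.
cards = [
    ("980 Ti at 1070 launch", 400, 100),
    ("1070 at launch",        400, 105),
    ("1070 two years later",  400, 105),
]

for name, price_gbp, perf in cards:
    # Performance-per-pound: the number that should climb each generation.
    print(f"{name}: {perf / price_gbp:.4f} perf per £")
```

If the last two rows print the same figure, you are getting zero extra value for your money two years on, which is exactly the complaint.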

Just another one of the many reasons why I have bailed on PC gaming. Thankfully I run a debt free ship so I don't need to sell bugger all, and I won't either because I know my XP will probably be still kicking ass two years from now, if the last two years are anything to go by.

The biggest problem of all for me with GPUs is that they are headed nowhere. We don't have all of these uber amazing games on the horizon that are in desperate need of new tech. We can now run 4k perfectly well, even on a console.

Take away the mining? Nvidia are going to have the hard sell. Ray Tracing *may* be the next step but that could take years before we see anything even remotely meaningful.
 
But the GTX 1070 was often cheaper or the same price as a 980 Ti, drew far less power, was much easier to cool, had more VRAM, outperformed the 980 Ti consistently, and continued to improve with driver updates in DX12. Yeah, you could find 980 Tis for £350 or occasionally less, but it's been this way for years. Consumers could often buy an old-stock GTX 580, for example, instead of a GTX 670 for around the same price. The same goes for AMD: you could buy an old-stock HD 6970 when the HD 7950 came out for roughly the same price, and the performance difference between the two wasn't huge.
 