AMD slashes their RX 5700 series pricing ahead of launch

Very good value now.

Also curious if they set prices higher than normal to see what Nvidia would do and then undercut them anyway.
 
Much more realistic but still more than it should have been, IMO.

They may have planned this all along: make the price stupid before it's even released, wait for Nvidia to respond, then "drop" the price before launch and look like you've done everyone a huge favour whilst still getting over-the-top prices.

And yes, before someone says it: things like that happen all the time. All it takes is some marketing intelligence, smoke and mirrors, and you're quids in.
 
Very good value now.

Also curious if they set prices higher than normal to see what Nvidia would do and then undercut them anyway.

Yeah, I'm thinking that's it. At first I thought they were being greedy trying to cash in on Nvidia's new pricing standards and got bitten by it big time. They still have done that a little, but it looks like they anticipated dropping their prices.

I'll be quite interested in a 5700 or a 5700 XT if I can sell my GTX 1080 for £300 or thereabouts. I know it's totally a side-grade, but I would finally like an all-AMD build, and nothing else from either company interests me.
 
I don't think this was planned; it's not really a segment shift, more the kind of adjustment we usually get between paper launch and actual launch-day pricing (RTX Super prices had a similar change on launch day, didn't they?). It seems the GBP prices have changed by £10-20 from what they would have been. There'd be no point using it as smoke and mirrors, because small changes to pricing are the one thing a tech company can change on a dime, and Nvidia is no different.
 
Yeah, I'm thinking that's it. At first I thought they were being greedy trying to cash in on Nvidia's new pricing standards and got bitten by it big time. They still have done that a little, but it looks like they anticipated dropping their prices.

I'll be quite interested in a 5700 or a 5700 XT if I can sell my GTX 1080 for £300 or thereabouts. I know it's totally a side-grade, but I would finally like an all-AMD build, and nothing else from either company interests me.

Deffo wait for reviews because I've seen hints this could be another hot potato. Remember, they're outgunned and outnumbered so they'll have to push them hard and - eeww, blower.
 
The 5700 XT is a 225W part and the 5700 is 180W, so it's roughly high-end Polaris power draw; the dies are somewhat larger, so it should run slightly cooler.
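As a rough sanity check on the "larger die, slightly cooler" reasoning, the relevant figure is power density (watts per mm² of die). A minimal back-of-envelope sketch — the TDPs are from the post above, but the die areas are my own assumptions based on approximate published figures (Navi 10 ~251 mm², Polaris 30 ~232 mm²), so treat the exact numbers as illustrative:

```python
# Back-of-envelope power density comparison.
# Board power figures are from the post; die areas are assumed
# (approximate published figures, not from the thread).
parts = {
    "RX 5700 XT (Navi 10)": (225, 251),  # (board power in W, die area in mm^2)
    "RX 590 (Polaris 30)":  (225, 232),
}

for name, (watts, area_mm2) in parts.items():
    density = watts / area_mm2
    print(f"{name}: {density:.2f} W/mm^2")
```

Under those assumed die areas, the Navi part spreads the same 225W over a slightly larger die (~0.90 vs ~0.97 W/mm²), which is consistent with the "slightly cooler" expectation — though actual temperatures depend heavily on the cooler, not just power density.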
 
Yeah, I'm thinking that's it. At first I thought they were being greedy trying to cash in on Nvidia's new pricing standards and got bitten by it big time. They still have done that a little, but it looks like they anticipated dropping their prices.

I'll be quite interested in a 5700 or a 5700 XT if I can sell my GTX 1080 for £300 or thereabouts. I know it's totally a side-grade, but I would finally like an all-AMD build, and nothing else from either company interests me.

I would personally wait for aftermarket cards. That's something AMD needs to fix: they need aftermarket cards from the start, not a month later.

The first impression is the most important, and they never cash in on it with crappy blower cards.
 
I would personally wait for aftermarket cards. That's something AMD needs to fix: they need aftermarket cards from the start, not a month later.

The first impression is the most important, and they never cash in on it with crappy blower cards.

As much as I don't like blower coolers, there is a clear reason why they are used for "reference" designs. There is a reason why Nvidia's latest Founders Editions are not reference, and it's not just because of their factory overclock and pricing.

Blower coolers work well with every PC, requiring little thought when it comes to airflow. Axial designs require great case airflow to work optimally, whereas the "out the rear" exhaust nature of blower coolers simplifies things significantly.

From a design perspective, a blower style reference cooler works well in basically all environments, especially in small form factor builds and OEM systems.

I do agree that AMD needs to work with OEMs to have launch ready custom cards. Enthusiasts don't want blower cards.
 
Can't wait for Intel to step into the market so we have at least three players.


Today's GPU prices are nonsense.



I don't agree with everything they say about the current GPU situation, but I do on many things (and I like that I'm not the only one upset).

There are enough shills in the media who will accept basically any stupid price these two companies set.

https://www.youtube.com/watch?v=oh0G38YO-1A


https://www.youtube.com/watch?v=InBIs4M9BsY&t=435s
 
GPU prices will drop properly once EUV lets us improve yields at the transistor counts required for modern high-speed cards. At the moment we're reliant on messy and expensive multi-stage multi-patterning, so until then we're in a bit of a technological no man's land, but the industry still has to push forward with new models for segments where even small price/perf gains will pay off, and the pricing is a reflection of that.

There's been no reason to upgrade your GPU every generation for quite a while now, and lines like Turing were never expected to be the kind of thing 1000-series owners would go out and buy; the most popular Nvidia GPU in desktops is still the GTX 960, iirc. Gamers should expect meaningful generational/value-proposition updates about every four years with where GPU maturity is at now.
 
GPU prices will drop properly once EUV lets us improve yields at the transistor counts required for modern high-speed cards. At the moment we're reliant on messy and expensive multi-stage multi-patterning, so until then we're in a bit of a technological no man's land, but the industry still has to push forward with new models for segments where even small price/perf gains will pay off, and the pricing is a reflection of that.

There's been no reason to upgrade your GPU every generation for quite a while now, and lines like Turing were never expected to be the kind of thing 1000-series owners would go out and buy; the most popular Nvidia GPU in desktops is still the GTX 960, iirc. Gamers should expect meaningful generational/value-proposition updates about every four years with where GPU maturity is at now.


A bit unrelated, but that's exactly what I was hearing around 2015/16 when it came to tiny, incremental CPU updates from Intel:

excuses for why Kaby Lake was only 3-5% faster, etc.


Maxed-out processes... no gains to be had... lower your damn expectations.
Four cores are all a consumer needs anyway.


I was telling people Intel was just sitting on their hands with basically no competition.
They just had to do minor updates and efficiency increases, and the review sites were saying, "hey, look what great things Intel has done".


And it's basically been that way since Sandy Bridge.

That's why people who switched CPUs every 18 months in the 90s and 2000s still have a Sandy Bridge now.

Then Ryzen came.


The main problem in the GPU market is not process issues; it is lack of competition.
 
The lack of competition somewhat explains the pricing (notice Intel maintained roughly the same consumer pricing structure all through the Zen/Zen+ generations), but not the progression of technology, where the vast majority of the baseline work is carried out in academia and then commercialised by companies as soon as it's economically viable. Ryzen was good at making things cheaper, but AMD have only caught up to Intel in performance now that Intel have had major problems with their ambitious 10nm node. CPUs have still continued to slow in their rate of technological progression even counting Zen 2, and GPUs will be no different.

High-bandwidth MCMs are just becoming viable for CPU use cases, so in that sphere we're finally becoming unhindered by the slowing of Moore's law, which helps bring prices down; but GPUs require far more bandwidth than silicon interconnects can deliver at the moment.

Work on high-complexity microprocessors like CPUs and GPUs generally begins over four years before commercial release (nowadays large companies have to use leapfrogging design teams), so the idea that a company can design reactively to its competition's current positioning is a little crazy.
 