AMD Ryzen 3900XT, 3800XT and 3600XT Specs Leak

Just speculating here.
If I remember correctly, the 3950X uses some of the best-binned chiplets.
Now that initial demand for the 3950X has fallen off after launch and yields have improved, releasing these might just be a case of "because we can". It also creates more price brackets for AMD to compete in.
 
This is a media-circus release just to pull attention away from Intel-only news. Intel did the same with the 9900KS.

The 3950X was always in a tough spot. It is expensive enough that mainstream buyers don't consider it, and the performance and price gap between the 3950X and the 3960X puts them in different leagues. If I were building a workstation for around the money a 3950X system costs, I would go Intel: the 10920X/10940X are close enough, and they give you 48 PCIe lanes, which are very important for workstations. And since you do very little rendering on the CPU anyway, the fact that it beats Intel in rendering tasks doesn't matter much.

It is possible that AMD wanted to reuse those binned chips in the market category that actually sells a lot of units.
 

According to Amazon and Mindfactory, the 3950X has actually been a hot seller given its price (outselling all but two Intel processors at Mindfactory, making it one of the highest-earning CPUs they sell if you look at the revenue figures). Both platforms end up roughly matched on total PCIe bandwidth, but you can buy a 3950X + motherboard for something like 30% less money than a 10920X + motherboard, with 30% more cores.

Of course, this isn't too surprising given AMD are outselling Intel overall by upwards of 500%.
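For what it's worth, the bandwidth-parity claim roughly checks out on the back of an envelope. A quick Python sketch, assuming 20 usable Gen4 CPU lanes on the 3950X/X570 side and 48 Gen3 CPU lanes on Cascade Lake-X (lane counts and encoding overhead are the only inputs):

```python
# Per-lane, one-direction throughput after 128b/130b encoding overhead.
GEN3_GBPS = 8 * 128 / 130 / 8    # 8 GT/s  -> ~0.985 GB/s per lane
GEN4_GBPS = 16 * 128 / 130 / 8   # 16 GT/s -> ~1.969 GB/s per lane

print(f"3950X/X570 aggregate:  {20 * GEN4_GBPS:.1f} GB/s")  # ~39.4 GB/s
print(f"10920X/X299 aggregate: {48 * GEN3_GBPS:.1f} GB/s")  # ~47.3 GB/s
print(f"Core advantage: {16 / 12 - 1:.0%}")  # 3950X's 16 cores vs 10920X's 12
```

Same ballpark on aggregate bandwidth, with X299 spreading it across more physical lanes, and the "30% more cores" is really 33%.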
 

The 3950X is an amazing CPU, no doubt about that. But even if theoretical aggregate bandwidth ends up about the same, there aren't many PCIe Gen 4 devices yet, so 48 PCIe Gen 3 lanes are still worth much more than 20 Gen 4 lanes in the real world. Also, I would rather have a Samsung 970 Pro than any current Gen 4 NVMe drive at the same price.

At the very high end and the very low end, AMD has no competition. But right around 3950X money you get much more with Intel X299.
 
Can you, practically, though? Can you actually get any HEDT chip from Intel at all for the same platform cost? Maybe if you ignore the motherboards, but factoring in motherboard cost you can get a 3950X setup with a decent motherboard for less than a 10-core 10900X with the cheapest X299 motherboard available here in the UK.

From a technical perspective Intel have some great CPUs here, but their pricing means they're getting pummelled into irrelevance on the market.
 

It would be a bit more, but if you need a lot of connectivity (dual GPUs, capture cards, NVMe storage) X299 still makes a better overall workstation platform than X570.
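To make the connectivity point concrete, here's a hypothetical lane budget in Python for the kind of build described above; the device list and link widths are made up purely for illustration:

```python
# Hypothetical workstation lane budget (illustrative device list only).
devices = {
    "GPU 1": 16,
    "GPU 2": 16,
    "capture card": 8,
    "NVMe SSD 1": 4,
    "NVMe SSD 2": 4,
}
needed = sum(devices.values())
print(f"CPU lanes needed: {needed}")                       # 48
print(f"Fits X299's 48 CPU lanes: {needed <= 48}")         # True
print(f"Fits X570's 20 usable CPU lanes: {needed <= 20}")  # False
# On X570, anything beyond roughly one x16 GPU and one x4 SSD has to
# share the chipset's single x4 uplink instead of talking to the CPU directly.
```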
 
I usually use a forum to gauge how well something sells. For example, I am the only one here stupid enough to have bought a 2080 Ti Kingpin.

However, loads of us have a 3950X. It's the new HEDT chip now.
 

In the end, it only matters what you are doing. Long gone is the time when the 6950X was the one CPU to rule them all. I, for example, use my PC for music production, gaming, and Photoshop, and I need Thunderbolt. For all of those, Intel is much better. I did consider AMD, but it just doesn't tick my boxes. Many of my friends bought Ryzen computers on my recommendation and they are extremely happy.
 
Oh yeah, no doubt Intel's CPUs have lots of areas where they're great, but ofc, going back to the original point, I think these XT chips/a likely 3950X price drop are going to see quite reasonable sales in comparison, and the media attention is somewhat due.

Then I was just saying that, cost-wise, a 3950X build probably lands bang in the middle of a 10900K and a 10900X build, and for most people cost versus general performance is usually the determining factor, which is probably why there seems to be such a stark sales gap between the companies for desktop processors atm.
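Roughly what I mean, sketched in Python; the prices here are placeholders rather than real quotes, so plug in whatever your local retailers are actually charging:

```python
# Placeholder platform costs (CPU + motherboard only); swap in real prices.
builds = {
    "10900K + Z490 board": 530 + 150,
    "3950X + X570 board": 700 + 150,
    "10900X + cheap X299 board": 650 + 280,
}
for name, cost in sorted(builds.items(), key=lambda kv: kv[1]):
    print(f"~£{cost}: {name}")
# With numbers in this ballpark, the 3950X build lands between the two.
```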
 
Nah mate, it's all about what's the hottest new thing. :P

Nah, it's not. Not for me, at least. I don't usually buy the newest stuff; in fact, I hadn't for many years. Before the 3950X my last upgrade was a £205 1920X. Before that, a 14-core Broadwell-E about four years ago, and so on. I only ever buy a new high-end CPU when I do a monumental update, and I only do that every 7 years or so. The last monumental upgrade I did was an Area 51 with a 5820K.

There's an old adage in PC land: "Spend as much as you can afford, do it once, do it right." Otherwise you get stuck in incremental land, and that gets expensive. Lower-end hardware lasts ten minutes, then you buy more. And that is done deliberately, to stop you getting a cheap decent deal. It is also why Nvidia stopped allowing people to buy, for example, two GTX 460s for £210 that would walk all over a £450 GTX 480.

Now? You get the minimal entry fee (i.e. VRAM) on lower-end GPUs, and they will expire within a year or two, meaning you need to come back again. See also: the GTX 1060 3GB and so on. The next casualty will be the 6GB cards, as the new consoles have lots more.

That is why it's best to just rip off the plaster, get it over and done with, and then you can just leave it. The 3950X? I will have that for years. I already know what to expect from Ryzen 4000, and it will be nothing I need for years.

GPU? I expect 3-4 years out of that easily. And I will get it, because I got the same out of my Titan XP, which cost me £650 used. That was top-end ultra settings for over three years.

I mean, yeah, some buy the latest thing because they want it and can afford it. I don't. I buy what I need first, then, if I can afford it, what I want, and then I just leave it to do the job I paid for. Chasing the dragon is bloody expensive. Ask Dice :p
 
I was just joking and wasn't targeting anyone in particular.

The value proposition is a bit difficult to predict in the long run. I've generally splurged on a nice CPU and then upgraded to mid-range GPUs bi-yearly or so, since rotating them into secondary setups or onto the second-hand market is pretty painless.
I've tried the "buy one now and a second card later" SLI path and I am never, ever doing that again. :D I ended up "downgrading" from 760 SLI to a 970 for general simplicity and also input latency.
 
Now? You get the minimal entry fee (i.e. VRAM) on lower-end GPUs, and they will expire within a year or two, meaning you need to come back again. See also: the GTX 1060 3GB and so on. The next casualty will be the 6GB cards, as the new consoles have lots more.
Tbf, I don't think it's quite as strict/short-term as that. The first game I ever came across that my 2GB HD 7870 XT couldn't play at 1080p with 40+ FPS was Warzone (CoD: MW was fine), and games like Forza Horizon 4 or recent single-player titles were fine too. With the 3GB cards, sure, but I think 6GB+ will be usable for a while, unless you're at 1440p+, ofc.
 
Yeah, SLI is most certainly dead. It was deliberate too, as Nvidia were giving stuff away: two 460s, two 560s, two 670s. Then SLI got really good, and then they realised they were being idiots. Bring on the Titan, and the death of SLI.

Note how they haven't rushed (understatement: they've not done a sodding thing) to even ask anyone to get mGPU working in DX12? It's for that reason: they can't lock it down, because it happens without a bridge, IIRC.

But yeah, SLI died with the 7 series. It started to crack during the 9 series, as support was falling off a cliff and you needed two 970s, which wasn't cheap.
 
VRAM is pretty easy to free up by dropping texture and sometimes shadow quality. And the texture quality even a 4GB (cough, 3.5GB) card allows is perfectly fine, in my opinion.
 