Intel 13600K and 13900K DDR4 vs DDR5 Showdown

Great work once again Guv!!!

So, I'll be upgrading my Z690 DDR4 rig to a 13600K and I've also got my B550 5800X3D rig; I think that covers all the bases for me moving forward :D
 
Excellent comparison. I find the top-of-the-line comparison more relevant with these systems than using some value DDR4 just because it is the "budget option". I've invested in B-die and I want to keep using it. :D


I find the differences in minimum framerates very interesting, and it is also the most meaningful metric in my opinion. You feel those spikes, not average fps.
 
Exactly. I watched another review video today where they would say there is a 6% difference in X game. Yeah, in the averages. Not the minimum. The minimum and 0.1% lows are the only things I care about.
 
Somehow, I find this very worrying. Maybe you guys can take this worry away from me.

As it stands, DDR4 is currently better for gaming, but could existing games get patches to benefit more from DDR5, or does that not lie within a game's engine but more so in the OS or the CPU itself?

I don't really want to buy "old" tech if the possibility exists that the new tech becomes better over time, even in older games.
 
The fact is, when you buy a CPU you do so all at once and then don't change it over time. What I mean is, unlike with a GPU, it is doubtful you will change out anything else you bought (i.e. board, CPU, RAM). Unless you need more RAM of course, but usually people buy too much rather than not enough.

DDR5 is faster in a couple of games. But that is a couple of games, not the whole picture. Maybe three years from now the tables will turn and DDR5 will win the majority, but that is three years from now, when you will likely be considering replacing the whole lot anyway.

New, faster RAM always takes an age to show any benefit whatsoever. It's always been like that. When DDR3 came out it was nearly 3x as expensive as DDR2. So I stuck with DDR2 for over two years.

Now see, I got lucky. When I built my rig in 2020 I found some DDR4 4133 for the same price as 3600; it was on sale. Even though I did not need it at the time (because I was running a 3950X), it definitely swayed me into getting the 12700KF, because I knew I would lose pretty much nothing compared to DDR5. And I was right, too: my latency is way lower than DDR5's.

I wouldn't worry about it, mate, tbh. It's kinda like PCIe 5.0. Again, it is not worth having yet, and it could be years before it is. I mean, FFS, I don't even think anyone NEEDS PCIe 4.0 unless they are running an x4 GPU.

Edit: once again, once you hit CPU limits (and you will with a 13-series or 7000-series) it is all about the GPU. Faster RAM will not change that. So always put that extra money you could have wasted on something ultra shiny and new (yet usually pointless) into the GPU.
 
I agree they are pushed to the limits. Intel has a slightly better showing with the 13600K; the 7700X is just too much of a buy-in for a midrange CPU due to the motherboard costs on AMD.

The biggest problem with anything new is that, of course, it's better, but the software isn't catching up to the hardware fast enough, and most GPUs don't even need the bandwidth yet.

I'm glad I went with the 5800X and didn't wait for the new stuff. I didn't need to buy anything, just a BIOS update and done, happy days. It'll be a while before I do anything.

The last gen is super value for the time it will last. I got the 5800X for a steal in my eyes; I think I paid just over £220 for it. If I had £750 spare I'd get a 3090 and forget hardware for many years, but I don't, so my 6800 XT will last a few more gens yet. Overall my system is fine, no issues for me.

The software needs to catch up. There's no point having all the new things when not even DirectStorage is a thing yet and barely any games need the power or use the latest and greatest features. It's like having a sports car and only driving it on the road to the local shop: totally pointless when you would want to be on a track to fully enjoy something like that.
 
Exactly. I watched another review video today where they would say there is a 6% difference in X game. Yeah, in the averages. Not the minimum. The minimum and 0.1% lows are the only things I care about.

I haven't understood the whole minimum and especially the 0.1% lows; to me that just looks like Chinese or Russian, to be honest. I have no idea what I'm actually looking at, hence I've mostly just looked at the average FPS, when it's most likely the least interesting number to me, in all honesty.
 
The average is the part you should ignore. It's just a mean over the whole run, sitting somewhere between the maximum and minimum FPS, and it hides the swings. It is that difference (between max and min) you should be looking at. It is what defines a smooth gaming experience. A 50 FPS frame drop is something you will see. Steady wins the race, so the average results are to be ignored.

For example, let's say I have a max FPS of 230 in PUBG and my minimum is, say, 90. Believe me, if it swings between the two you are going to see it, no matter how good adaptive sync technology is. The average is just a calculated number and tells you nothing at all about how the run felt, because it smooths those swings out of the picture.

Oh, and 1% and 0.1% lows have only been a thing for about 10 years. That is how AMD managed to rip so many people off! When you ran Crossfire you could only generate so much information with the tools you had, which at that time meant FRAPS. The problem? Well, it could read the minimum FPS (so could I, if I had a counter and stared at it) and the maximum. Again, you could see that. The issue was what it was hiding, i.e. frame time: the time it takes to deliver each frame, and whether that was a whole frame, a partial frame (i.e. with some of it dropped) or a runt frame (which was counted but you didn't even get to see).

It was only when Ryan at PCPer (who now works in Intel's GPU department) used a tool called FCAT that AMD was basically outed. At that point AMD were forced to, you know, make Crossfire actually work on your actual computer without being total chit. And they did, but the maximum FPS, the REAL minimums and the new data showed it in a much poorer light. I.e., those "fake" runt frames it was delivering had been fooling people, and when it delivered a partial frame (or a bad 1% low) you would see a stutter absolutely every single time. Yet people refused to believe it at first, and there was a bit of a witch hunt.

I had used SLI for some time at that stage, and when I went over to Crossfire I absolutely hated it. That was back when people could use the excuses of "No stutters here!" and "It must just be you" and so on, because there was no scientific data to back it up. And usually multi-GPU systems were being run by posers who, TBH, were probably running benchmarks all day, so they never even noticed the god-awful mess it was. And on enthusiast forums there were far more of them than people who'd had a rotten gaming experience.

It never, EVER happened with SLI, because Nvidia actually spent a lot of money on that.

Oh, and BTW dude, the minimum FPS is the most critical part, along with the difference between it and the maximum. The higher the minimum, the less chance you will feel it when the FPS drops. So the higher you can keep the minimum, the less noticeable a sudden drop becomes. Like, imagine standing all the way up straight and then dropping your body by 50 percent, compared to just ducking down a tiny bit. The bigger the drop, the more you are going to feel it.
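To put numbers on all of that, here is a minimal Python sketch (illustrative numbers only, nothing from the review) of how these metrics fall out of logged frame times. Note the "lows" have a couple of competing definitions; this sketch averages the slowest 1% (or 0.1%) of frames.

```python
# Minimal sketch: average FPS vs 1% / 0.1% lows, computed from frame
# times in milliseconds (roughly what capture tools log).
# All numbers are made up for illustration.

def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS, 0.1% low FPS) for one run."""
    n = len(frame_times_ms)
    avg_fps = n / (sum(frame_times_ms) / 1000.0)  # frames / total seconds

    # "Lows": average FPS across only the slowest 1% (or 0.1%) of frames.
    worst_first = sorted(frame_times_ms, reverse=True)

    def low(fraction):
        k = max(1, int(n * fraction))
        return 1000.0 * k / sum(worst_first[:k])

    return avg_fps, low(0.01), low(0.001)

# A steady run vs a run that is mostly faster but stutters now and then.
smooth = [10.0] * 2000                # constant 10 ms -> steady 100 FPS
spiky = [9.0] * 1980 + [60.0] * 20    # fast frames plus 20 big hitches

for name, run in (("smooth", smooth), ("spiky", spiky)):
    avg, low1, low01 = fps_stats(run)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low1:.0f}, 0.1% low {low01:.0f}")

# smooth: avg 100 FPS, 1% low 100, 0.1% low 100
# spiky:  avg 105 FPS, 1% low 17,  0.1% low 17
```

The spiky run actually wins on average FPS while being the one you would feel, which is exactly why the averages hide the story.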
 
Why would they lie?

Absolutely no reason. I've caught quite a few websites posting wrong things by mistake; copy and pasting can have that effect.

The thing is, if it was using a 2080 Ti, that would have introduced a GPU bottleneck and made the comparison less meaningful.
 
Somehow, I find this very worrying. Maybe you guys can take this worry away from me.

As it stands, DDR4 is currently better for gaming, but could existing games get patches to benefit more from DDR5, or does that not lie within a game's engine but more so in the OS or the CPU itself?

I don't really want to buy "old" tech if the possibility exists that the new tech becomes better over time, even in older games.

From what I currently understand, the reason a few games show better performance on DDR4 and the vast majority show basically the same performance on both comes down to latency. While DDR5 does work slightly differently, I think its higher timings are overall detrimental to gaming performance vs a 3600-4000 MHz DDR4 kit with very low/tight timings.

I don't think this is something that can be fixed via a software update to an engine; it's more that the hardware manufacturers need to get the timings on DDR5 low enough that it actually shows gaming performance benefits vs older DDR4 with tight timings.
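To put rough numbers on the latency point: first-word latency in nanoseconds works out to CL × 2000 / transfer rate, because DDR transfers twice per memory clock. A quick sketch with some illustrative kits (not the ones benchmarked here):

```python
# Rough first-word latency: latency_ns = CL * 2000 / transfer_rate (MT/s).
# Example kits are illustrative, not the ones from this comparison.
kits = [
    ("DDR4-3600 CL14 (tuned B-die)", 3600, 14),
    ("DDR4-4000 CL16", 4000, 16),
    ("DDR5-5200 CL40", 5200, 40),
    ("DDR5-6000 CL36", 6000, 36),
]

for name, rate_mt_s, cl in kits:
    print(f"{name}: ~{cl * 2000 / rate_mt_s:.1f} ns")

# DDR4-3600 CL14 (tuned B-die): ~7.8 ns
# DDR4-4000 CL16: ~8.0 ns
# DDR5-5200 CL40: ~15.4 ns
# DDR5-6000 CL36: ~12.0 ns
```

This ignores the rest of the timing set and DDR5's architectural differences, but it matches the pattern in the results: tuned DDR4 still gets the first bytes back sooner.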
 
Its benefits aren't what make it detrimental. It's just the fact that games are rendered and processed in the millisecond range: once the memory bandwidth requirements are fulfilled, the only thing holding performance back is overall latency. DDR4 currently satisfies games' bandwidth needs, so anything above that is moot, which makes DDR5 redundant. Once we get more memory-intensive gaming workloads things will change; when that crossover point occurs is anyone's guess.
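A back-of-envelope check on the bandwidth half of that claim (theoretical peaks from bus width, not measurements):

```python
# Peak theoretical bandwidth: transfers/s * 8 bytes (64-bit bus) * channels.
# (A DDR5 DIMM is split into two 32-bit subchannels, but the total per
# DIMM is still 64 bits, so the same math applies.)
def peak_gb_s(mt_s, channels=2):
    return mt_s * 1e6 * 8 * channels / 1e9

print(f"DDR4-3600 dual channel: ~{peak_gb_s(3600):.1f} GB/s")  # ~57.6 GB/s
print(f"DDR5-6000 dual channel: ~{peak_gb_s(6000):.1f} GB/s")  # ~96.0 GB/s
```

If a game never comes close to saturating the ~58 GB/s DDR4 already offers, the extra DDR5 bandwidth sits idle and latency decides the frame times; that is the crossover point being described.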
 
Today I learned ^_^
 
This is a great explanation. I didn't know these things either.

It means it's kind of an awkward time to build a new system, even though there's so much new stuff. We have no idea when DDR5 will start to become the leading specification.
 
This has been one of my main crunch points.

Do I stick with the DDR4 I have now and go with a cheaper motherboard, knowing I would have to (or want to) upgrade to DDR5 when it starts making a bigger difference, or go all out now and get a better motherboard and DDR5 straight away?

But then I ask myself: when will DDR5 actually start making a difference, and what will the price and speeds be like then?
 
If the world even exists by then… Looking at the state the world is in, and not even to begin speaking of the climate, with floods, tornadoes and extreme heat during the summer, and it will only get worse.
 
This has been one of my main crunch points.

Do I stick with the DDR4 I have now and go with a cheaper motherboard, knowing I would have to (or want to) upgrade to DDR5 when it starts making a bigger difference, or go all out now and get a better motherboard and DDR5 straight away?

But then I ask myself: when will DDR5 actually start making a difference, and what will the price and speeds be like then?

By the time DDR5 is the only option for a platform you will want to get some pretty speedy stuff with low/tight timings anyway, as that's when DDR5 makes sense. Right now DDR5 is very expensive and, in the vast majority of games (some outliers here and there), gives virtually zero performance uplift, and in some cases shows regression.

If the world even exists by then… Looking at the state the world is in, and not even to begin speaking of the climate, with floods, tornadoes and extreme heat during the summer, and it will only get worse.

The world will be fine.
 