Intel claims gaming superiority over AMD's Ryzen 7950X with Raptor Lake i9-13900K

300 watt CPU, throttling after 20 seconds... well done liz intel. :D

The 13900K is a joke. I hope the 13600K will be more efficient.
 
Short boosts are the way to get the best gaming performance, no way around that. The same goes for many other workloads like photo editing, heck, even flicking through web pages and spreadsheets.
Obviously modern CPUs are quite overkill for most of that, but still.
 
The 13900K shares the same story as all of the AMD 7000 series, i.e. make it mad and it will get very hot and use a lot of power. You need to treat it the same way as all of AMD's new CPUs: look at how it performs at the settings it should have been released with.

I can't remember which review I skim-watched today, might have been GN, but they were saying how these CPUs are no longer being released in a state in which you should actually use them, simply to show up well on charts.

Back in the day a CPU would be released well under that state, and, well, that is how overclocking was invented and became a thing. These days the manufacturers are so desperate to outdo each other on charts that they clock the crap out of them themselves, leaving zero headroom.

If you put a 7000 series where it ought to be, they are very efficient indeed. They bloody well should be on that node. Sadly AMD deem it necessary to make them look ugly by clocking the balls out of them, just to be on a chart. Power efficiency is no longer cared about, so long as you look good on a gaming chart.

I just watched another video on YT titled "Don't bother overclocking a 13600K", which says it all, IMO.
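
Just to illustrate what "putting it where it ought to be" looks like in practice, here's a minimal sketch of reading and capping the package power limits through the Linux powercap interface. Plenty of assumptions baked in: it presumes the stock intel_rapl driver and the usual sysfs layout (paths can differ between platforms), it needs root, and whatever the BIOS enforces still applies on top.

```python
# Minimal sketch: read the long-term (PL1) and short-term (PL2) package power
# limits and rein PL1 in, via the Linux powercap interface.
# Assumes the intel_rapl driver and this sysfs layout; run as root.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")

def read_watts(name: str) -> float:
    # powercap exposes limits in microwatts
    return int((RAPL / name).read_text()) / 1_000_000

def write_watts(name: str, watts: float) -> None:
    (RAPL / name).write_text(str(int(watts * 1_000_000)))

print("PL1:", read_watts("constraint_0_power_limit_uw"), "W")
print("PL2:", read_watts("constraint_1_power_limit_uw"), "W")

# e.g. cap the long-term package power at 125 W
write_watts("constraint_0_power_limit_uw", 125)
```

On Windows the equivalent knobs are the PL1/PL2 settings in the BIOS or XTU (or eco mode / PBO limits on the AMD side), but the idea is the same: the silicon is perfectly happy well below the out-of-the-box limits.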
 
CPU vendors going way beyond the point of diminishing returns on power consumption is rather ironic, when it wasn't long ago that motherboard vendors got scolded over Multi-Core Enhancement.
 
I have a 5950X at the moment and I was waiting until both the 7950X and 13900K were released to pick one. I was worried about the 7950X pumping out so much heat, as the 5950X is already at the limit of what I would want heat-wise, but when I saw the 13900K it was an obvious choice. I will just put the 7950X into the 105 watt mode, hopefully making it similar to the 5950X in terms of heat pumped into my small office.
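
For context, a rough back-of-the-envelope on what that 105 watt mode means in actual socket power, assuming AMD's usual ~1.35x PPT-to-TDP ratio (exact behaviour depends on the board and BIOS, so treat these as ballpark figures, not measurements):

```python
# Rough PPT (socket power) estimates from nominal TDP, using the ~1.35x
# ratio AMD has used on AM4/AM5. Ballpark only; boards can and do differ.
def ppt(tdp_watts: float) -> int:
    return round(tdp_watts * 1.35)

for tdp in (170, 105, 65):   # stock 7950X, 105 W eco mode, 65 W eco mode
    print(f"{tdp:>3} W TDP  ->  ~{ppt(tdp)} W PPT")
```

That puts the 105 W mode at roughly 142 W of package power, which is the same figure as a stock 5950X, so the heat dumped into the room should indeed be in the same ballpark.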
 
https://youtu.be/H4Bm0Wr6OEQ


Well I wouldn't call the decision obvious.
 
Exactly. Intel and AMD have really messed up here, IMO. They have shown their CPUs in the absolute worst light possible, which has basically torn up and thrown out any impressive generational improvements.

Overclocking to me was always an option, not something forced on you. I.e. you bought a 120 W i7 950 and then you decided whether you wanted to go to over double the power use by overclocking it.

All of these new CPUs are not supposed to run like that, IMO. As so many reviewers have said, it is now all about charts and who can pee the highest, without bothering to show you the real generational improvements.

IPC is up on Intel. So is performance per watt. They have just nullified it all by making the chips behave the way they do. It won't even gain you that much in games, and it just makes them really horrid to live with.
 

What I've noticed though is, people are willing to pay massive amounts of money for tiny amounts of performance. In a GPU-bound scenario, which most gamers are in, a €150 5600 or a €200 12400 is only a couple of percent behind a 13900K. Yet gamers will buy the 13900K because it's faster at framerates they will never play at, or have a monitor capable of displaying.

Many on this forum who don't do heavy multitasking workloads have very fast CPUs because those CPUs look better on charts, not because it actually nets them noticeably faster performance in the games they play.

I think it's understandable that AMD and Intel are gunning for chart-topping results, because it's clear that's what gamers are interested in, even when it's irrelevant to them. That's why I prefer reviews that show realistic workloads. GN and HWU are great in their reviews and I value the way they test their processors, but this style of testing is clearly deceiving gamers into spending more than they should. How many are playing Rainbow Six Siege at 400 FPS? That's a niche bracket compared to the masses who aim for 60-144 FPS. If charts showed results in that frame rate range, we'd see far smaller gaps between each processor, making the likes of the 13900K, the 5800X3D, and even the 13600K seem unnecessary.

Take for example TechPowerUp's review of the 13600K. Cyberpunk 2077 at 1440p with a 3080, a very common setup, shows everything from the 10600K and 5600 all the way up to the 13900K and 7950X at around 80 FPS.

The same test at 720p with a 3080 shows the 5600 at 127 FPS while the 13600K hits 176 FPS. A huge difference, but completely irrelevant to the majority of gamers.
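
To put those two results side by side, here's just the arithmetic on the figures quoted above (no new measurements, purely the percentages):

```python
# How big the CPU gap looks depends entirely on whether the GPU is the limit.
def pct_faster(fast_fps: float, slow_fps: float) -> float:
    return (fast_fps / slow_fps - 1) * 100

print(f"720p,  CPU-bound: 13600K vs 5600 -> +{pct_faster(176, 127):.0f}%")
print(f"1440p, GPU-bound: both at ~80 FPS -> +{pct_faster(80, 80):.0f}%")
```

Roughly a 39% gap once the GPU is taken out of the equation, and effectively nothing at the resolution people actually buy a 3080 for.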

If gamers did not care about chart-topping graphs, why are they buying a 13600K over a 5600 at twice the price?

From what I can tell, companies are just responding to what the market is asking for. That's why I don't like these unrealistic tests as much. Gamers are deceived into applying those results to their own situation, and end up wasting their money.

That doesn't mean gamers shouldn't buy nice processors if they want to. But my point is, AMD and Intel are just responding to consumers' spending. If consumers stopped buying CPUs for 50 extra frames at 350 FPS that they will never play at, AMD and Intel might stop trying to top the charts. They'd instead focus on productivity and efficiency, something AMD made a start on with Zen.
 
Yeah, see, the thing is I still spend most of my time doing stuff other than gaming. A lot of heavy Photoshop work (usually drawing decals) and stuff like that, as well as shifting a lot of stuff back and forth to the server and god knows what else. So I still want the all-rounder CPU, rather than just the best gaming CPU.

What I find most impressive about the 13600K is that it is pretty much as quick as the 12700KF, whilst costing quite a bit less now. I still only paid about £360 for my 12700KF, but that was when people were uncertain about Intel, so I took a leap of faith. Prices went up shortly after and never came back down.

It's a good point about the 5600. To me though it would not be enough. Like I said, I still spend far more time doing stuff that needs a lot of cores. In gaming? Oh sure, for now it is fine. I would probably have gone 5000 series had Intel not launched Alder Lake when they did. There was just no point for me though, given the 5900X was still far more expensive, slower in games and about the same overall in MT.

TBH I am not sure I would buy one now. Maybe if I had a weaker GPU I would consider it, but I still need mid to high end gaming performance with a 6800XT.

Good thing is my CPU overclocks really well too, but I have it in eco mode ATM, so there is deffo more gaming performance there for later in its life.

Oh, and I fully expect the 7000 series to drop in price very soon. Especially the 7600X, which now looks like the ginger stepchild; no one is going to pay over 300 notes for that. It should be 250 or so, IMO. The 13600K is simply better in absolutely everything outside of gaming, which to many people is why they have a full-sized PC and not a laptop.
 
Yeah, it makes sense for many to invest in a high-powered processor if they're going to take advantage of it.

But the 6800XT at 4K (I think you play at 4K, right?) is GPU-bound. While you do need a 12700KF for other tasks, at that resolution with that graphics card a lower-end CPU would not bottleneck your GPU.

I also think AMD are having kittens over Raptor Lake. I know they have the 3D V-Cache variant not far away, but unless they release a six-core version at $300 and bring the 7600X down to $250, I don't see how AMD are going to be able to sell many Zen 4 chips other than the 7950X. And even that has stiff competition from the 13900K. I personally would rather have the 7950X any day, but the 13900K is at least competitive once you factor in platform costs.
 
No mate, 1440p. Both the gaming rigs I use for PUBG are 1440p. As usual it's optimal; go 4K and you won't see anybody.

I do have a third rig now for 4K: a 2080 Ti and my old Threadripper. Stray looks amazing on my TV!
 


 


With Intel's better overall gaming performance, and the fact that you can still use DDR4, which is miles cheaper and has been shown to make near-zero difference in gaming vs DDR5, AMD need to bring their prices down... I don't know anyone in my circle who has bought into AM5.
 

I haven't seen a single person with an AM5 system yet.
 

Me either. No one on the other two forums I go on has bought it. I'd imagine there are some adopters on OcUK, but I haven't bothered to look.

The thing is, it was pretty obvious that Raptor Lake would be at least good, because Alder Lake was excellent.
 