The 8 core showdown and analysis thread.

AlienALX

Active member
I wrote this around six months ago. Things to note !

1. This is in no way, shape or form me trying to show Intel up. They've done that themselves by charging a grand for a locked CPU.

2. This is just a general thesis on core support and threading. Note that in Windows 7 YMMV, because it does not handle more than four cores well natively; any more and you need to start applying patches and bodges for core parking etc.

With that said pull up a seat and enjoy.



OK so I have been smashing the data. This thread can serve multiple purposes really.

* To see if an 8 core CPU is a viable proposition for a gaming PC.

* To see core use in games over the past year, to see if things have changed.

* To give a purpose to the forthcoming 8 core Haswell-E CPU in a desktop machine.

* To see if the rumours that Xeons are crap for a gaming rig are true.

I'm going to compare an 8 core* AMD FX 8320 CPU clocked to 4.9GHz (which cost £110) with an Intel Xeon 8 core, 16 thread CPU (socket 2011) that I also paid £110 for. Then I am going to analyse which games actually make use of all of those threads and how well they load up the CPU.

NOTES.

First up, I'm fully aware the Xeon only boosts to 2GHz under load. I cannot overclock it, not even in tiny increments via the base clock, because even 101MHz leaves the PC stuck in a boot loop. So before the Intel boys dive in with accusations of comparing totally different clock speeds: there's nothing I can do about it. It's not my fault Intel decided to lock their CPUs at given speeds and then set a price structure for speed.

It's not always about speed and figures. At the end of the day a CPU can be perfectly suitable for a task, even if it does not appear as good as another one. You'd actually be amazed just how little CPU power you need most of the time.

Heat and power are not a part of this analysis, simply because I don't care, nor do I want to become embroiled in a stupid argument. This thread is strictly 8 cores only. I don't care about, nor want to know, your results with your overclocked 4770K. Remember: 8 cores. I no longer care about clock speeds and IPC. I want to see more cores, being used, at lower prices. What I'd ideally like to see is a 6/8 core CPU from Intel that simply drops into socket 1150 without the need to buy ridiculous motherboards or RAM.

Hey, a guy can dream, right?

OK. So let's get it on then...

Here are the specs to concentrate on. The AMD rig is as follows.

AMD FX 8320 @ 4.9GHz
Asus Crosshair V Formula-Z
8GB Mushkin Blackline running at 1533MHz (offsets with the FSB)
Corsair RM750 PSU
Corsair H100
AMD Radeon HD 7990
OCZ RevoDrive 120GB running RAID 0
Windows 8 Professional x64 (note, not 8.1 !)

Then onto the Intel rig. Note, this was a rebuild, so components stayed identical barring the board and CPU.

Intel Xeon V2 (Ivy Bridge-EP). 8 cores, 16 threads, 2GHz
Gigabyte X79-UD3 motherboard
8GB Mushkin Blackline running at 1600MHz XMP
Corsair RM750 PSU
Corsair H100
AMD Radeon HD 7990
OCZ RevoDrive 120GB running RAID 0
Windows 8 Professional x64 (note, not 8.1 !)

CPU validations.

AMD



Intel



I start with some benchmarks. First up was 3DMark 11.

AMD result



And the Intel



And already strange things happen. The Intel scored a higher physics score (which pertains to the CPU), yet even though the Intel also runs PCIe 3.0 (Ivy Bridge) it loses out overall. Very, very strange.

Then it was on to 3DMark (2013). AMD up.



And then it was the Intel's turn.



TBH that's bloody awfully close. It's actually within the margin of error, but I promised myself before I began that I would not obsess over one benchmark and become sidetracked running it over and over again.

OK, so round three, Asus Realbench 2.0.

Interlude.

Asus Realbench is *the* most accurate benchmark I have ever run in my entire life. Instead of making their own synthetic, unrealistic benchmark they simply took a bunch of real programs and mashed them together. This way the results are actual real world results. As an example, test one is GIMP image editing. Then it uses Handbrake and other workloads to gain a good idea of what a system is capable of.

This is also the toughest benchmark I have ever run. I can run Firestrike all day long, but Realbench absolutely tortures a rig to breaking point.

I ended up having to remove the side of the AMD rig and aim a floor standing fan at it to get it through.

So here is the AMD result.



And the Intel result.



Wow. Now this one truly knocked me sideways. I never expected the AMD rig to win on IPC alone (GIMP). Even an i7 920 runs the AMD close in GIMP, but the AMD absolutely trumped the Intel all the way through.

And this, lads and ladies, is why Asus make very high end boards for these chips. As Bindi (an Asus employee) points out, the AMDs are actually very good CPUs.

Then it was on to Cinebench, and another surprise..

AMD



And then Intel.



The surprise? Not that AMD won. I was actually very impressed with the Intel's performance, given it is clearly running at less than half the speed it's actually capable of. I'd hazard a guess that this CPU could actually double that speed if unlocked and overclocked, which does make me a teeny bit excited about Haswell-E.

OK so no set of benchmarks would be complete without at least one game. I decided to choose Metro : Last Light. You'll see why later when I get onto the part about core use, but here is the AMD's result.



And the Intel.



And it was finally victory to the Intel. Not by much, but Metro clearly loves the cores and wants as many as you can throw at it.

Due to this result I decided to keep the Intel. There are other reasons of course, but this played a part.



Less than 14 watts idle, and I had real trouble making it use more than 90W under load. Temps are always under 40°C no matter what, which means the rig is now very quiet.
 
OK so it's time for part two. The comparisons are now over, now it's time to crack on and see what those cores offer us in gaming.

With this part of the analysis I aim to debunk the myths. So, right now I am a mythbuster. Those myths include, but are not limited to -

"Pah. You don't need any more than four cores for gaming. Only a tiny handful of games use that many"

"No games support 8 cores, let alone more"

"Xeons are crap for gaming because they're somehow different inside"

OK, so those are the usual off the cuff comments doing the rounds. Here's my take. Basically, I would take a CPU clocked at 2GHz with 8 cores and 16 threads over a quad core clocked to within an inch of its life. Games and applications that use the cores tend to spread the work around, meaning you get the same sort of performance without using masses of power or generating tons of heat. This is exactly why I decided to put a Westmere hex into my 670 SLI rig: in all of the latest games (and I mean all of them) core count produced very similar, or better, results than running a CPU to within an inch of its life, thermal limits and voltage tolerance.
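
To put a rough number on that cores-vs-clocks trade-off, here is a toy Amdahl's-law sketch in Python. Every figure in it is invented for illustration (and SMT threads are counted as full cores, which flatters them), so treat it as back-of-envelope reasoning, not a benchmark:

```python
def speedup(cores, parallel_fraction):
    """Amdahl's law: speedup over one core when only a fraction of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

def throughput(ghz, cores, parallel_fraction):
    """Very rough 'effective GHz': clock speed times the Amdahl speedup."""
    return ghz * speedup(cores, parallel_fraction)

# Hypothetical 4.4GHz quad vs a 2GHz 16-thread chip.
for p in (0.80, 0.90, 0.95):
    quad = throughput(4.4, 4, p)
    many = throughput(2.0, 16, p)
    print(f"parallel={p:.2f}  quad={quad:.1f}  16-thread={many:.1f}")
```

With these made-up numbers the heavily clocked quad wins until roughly 90% of the work threads, at which point the slow 16-thread chip pulls ahead, which is exactly the pattern the Metro result showed.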

This is also why I chose to keep the 8/16 Intel in the rig, rather than putting back the more brutal AMD. I don't usually care about power use tbh; my bills are more than affordable, even with the stuff I have. However, less noise is always a bonus if you can eradicate as much of it as possible.

Now I know I'm probably completely alone in feeling this way, but tbh? It's what we should have been demanding for years. More cores, better threading, lower power consumption to get to the same place, etc.

Sadly, up until this exact moment Intel have not offered us a massively threaded CPU *with* the ability to overclock it. You could count socket 1366, but TBH this, IMO, was an oversight and a mistake by Intel. Had it not been so, they would have left the strap and BCLK separate from the PCIe and SATA clocks on socket 2011. Nope, Intel wanted to make sure you were stuck on their K series quad core chips. So it's always been "You can have the cores, but not the overclocks... or the overclocks without the cores".

We are still going to be left to one side of course. Intel are already making and selling 12 core 24 thread CPUs but expect to see those in a year or two, unlocked and rebadged as "Extreme" edition CPUs.

Right, so with all of that said let's see what these cores can actually do, shall we?

Firstly I will explain how I performed this research. In Windows 8 (don't even think about using Windows 7; it does not correctly address any more than 4 cores, and anything more is a bodge and an afterthought that does not work properly) there is a very handy little view in your Task Manager that can be split up to show you how your cores are being utilised.

Here is how it looks once you split it to the full amount of threads. This is also an 8 core, 16 thread system sitting idle.



OK. Now note that each core and thread (so physical and logical) has its own box. The bottom of that box is 0%, the top is 100%. As the graphs fill up they display core use.

Core use is recorded for 60 seconds, so basically the method I used was to load up a game, wait for it to get to an intense part of the action (when the rig makes the most noise, basically) and then Alt+Tab back to the graph, at which point I simply take a screenshot.

Note though: actual utilisation will not be perfectly accurate, because by then I have exited the game. The only way to monitor it accurately is to run a second screen and record it in real time. Not something I will bother with (yet).
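
For anyone who does want real time logging instead of Alt+Tab screenshots, a few lines of script will do it. This is a minimal sketch that samples per-core usage by parsing Linux's /proc/stat, so it won't run on the Windows 8 rig as-is; on Windows the third-party psutil library's cpu_percent(percpu=True) gives the same numbers:

```python
import time

def _snapshot():
    """Return (busy, total) jiffy counts for each logical CPU from /proc/stat."""
    cores = []
    with open("/proc/stat") as f:
        for line in f:
            # Per-core lines look like "cpu0 ...", "cpu1 ..."; skip the aggregate "cpu" line.
            if line.startswith("cpu") and line[3].isdigit():
                fields = [int(x) for x in line.split()[1:]]
                idle = fields[3] + fields[4]  # idle + iowait
                total = sum(fields)
                cores.append((total - idle, total))
    return cores

def per_core_usage(interval=1.0):
    """Sample each logical core's utilisation (%) over `interval` seconds."""
    before = _snapshot()
    time.sleep(interval)
    after = _snapshot()
    usage = []
    for (busy0, tot0), (busy1, tot1) in zip(before, after):
        dt = tot1 - tot0
        usage.append(100.0 * (busy1 - busy0) / dt if dt else 0.0)
    return usage
```

Calling per_core_usage() once a second and writing the list to a file would give the same per-thread figures Task Manager graphs, without ever leaving the game.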

So the first game put to the test was Batman Arkham City.



And as we can see, there is plenty of activity across all 16 threads. What we are looking for though is for the spikes to look the same. This indicates even loading over all threads. Very few of these games do this, but, there is still plenty of activity on each thread indicating high usage.
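
"The spikes looking the same" can also be put into numbers. If you jot down the per-thread percentages from the graphs, the range and standard deviation tell you how evenly a game spreads its load. The sample figures below are invented for illustration, not read from my screenshots:

```python
import statistics

def load_spread(per_thread_pct):
    """Return (range, population std-dev) of per-thread utilisation figures.

    Small numbers mean even loading across all threads; big numbers mean
    the game is leaning on a couple of cores and ignoring the rest."""
    return (max(per_thread_pct) - min(per_thread_pct),
            statistics.pstdev(per_thread_pct))

even_game = [62, 60, 64, 61, 63, 60, 62, 61]  # hypothetical well-threaded title
lopsided = [95, 90, 35, 30, 12, 10, 8, 6]     # hypothetical two-core-heavy title

print(load_spread(even_game))  # small range, small deviation
print(load_spread(lopsided))   # big range, big deviation
```

A well-threaded game like the ones later in this thread would score low on both measures; a two-core game like Crysis 2 would score high.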

Then it was onto Battlefield 3.



No surprise. I've tested BF3 on the AMD and it wanted at least six cores with residual load over anything more.

Battlefield 4



OK, so we know BF4 is an 8 core game. However, BF4 tends to lean a lot less hard on the CPU. It seems DICE have been doing some work to make sure the GPU gets the harder job. Good job, DICE !

The first genuine surprise of the day, COD : Ghosts.



OK, looks like it could well be a modern console port then. Then it was on to Crysis 2. Again, I already knew what was going to happen here...



Only really wants four cores, spends most of its time leaning on two. Not so good then.

Time for Crysis 3



Much, much better. However, I am also aware that different levels in Crysis 3 change the dynamics. Certain levels want CPU cores, certain levels leave them to one side and call on the GPU. This is why the AMD vs Intel results in Crysis 3 are all over the place. However, I still say you're better off with the cores tbh.

Time for Far Cry 3.



And we can see that it's kind of lame. Far Cry 3's engine may share CryEngine heritage with Crysis 3, but it's clearly nowhere near as demanding, or complex. I admit the results were taken right at the very beginning of the game and thus are likely to change as the game goes on.

Time for Hitman : Absolution.



Hitman is one of the better games for demonstrating that a title can thread very well. However, this is Hitman's built-in benchmark, after all. I wish more games contained benchmarks tbh. OK, so it was time to try something older.

RAGE by id, using the id Tech engine.



Wow. Now this game is what, three? Four years old? Yet it still likes to thread itself well. God bless John Carmack, the guy clearly wants what I want.

Now let's see what Metro : Last Light is doing.



And again we see nice, even core loads. It's pretty apparent that Metro loves having cores at its disposal. It allowed a CPU that won in absolutely nothing else to beat another CPU simply by threading itself properly. Good show !

Time for some Tomb Raider benching.



Again, it wants the cores. However, it never truly loads them over 50%. So in this instance the Intel again managed to produce better results than the AMD: the per-core loads are low, but the more cores you have, the more the load is spread.

And finally it was time for Wolfenstein : The New Order.



And once again we see a pretty even distribution over all 16 threads. This is because the game runs on id Tech, which Bethesda and others have been using for a while. This means more support for the future.

Conclusion.

Phew. I'm absolutely bloody knackered now. But we can clearly see that the "four cores is enough" argument belongs where it should be left: in 2010. Things have changed, consoles have changed, the ports over to PC have changed.

This may well pave the way perfectly for Intel's 8 core chip. TBH, like many times in the past, they've left AMD to do all of the hard work on something they saw as a waste of time. Now though? They have no choice really. Games and apps are becoming more highly threaded by the day, and users will demand processors that can make full use of this.

I could well have sat here and benchmarked even more games that use the cores. There are quite a few that I am aware of, but sadly I only have so much time in the day.

Hope you enjoyed reading :) may well be a bit of an eye opener :)
 
Hmmm, not really convinced. Don't get me wrong: the 83xx chips are great for budget gaming PCs these days, especially for the price, but comparing one to a 1.6GHz Xeon seems a bit odd.

Xeons weren't made for gaming. I've got a 1230 V3 that would get outperformed by an i5 4670K at stock, and you can overclock the latter. In fact, you did overclock the AMD chip to nearly 5GHz while letting the Xeon run at stock.

As said, I agree the AMD 83xx series is the best price/performance you can get for this money, but this comparison seems a little useless to me personally. The AMD 8 cores perform about the same as an i5 quad core and that's about it. A much cheaper quad core i7 with Hyper-Threading at 3.5GHz would beat the crap out of both the AMD and Xeon chips in games that use all 8 threads.
 

I did point out that it was mainly to show core loads rather than being a competition. I compared them for the obvious reason really: they both have 8 cores. And the AMD, due to being unlocked, won in pretty much everything. A £100 CPU vs a £1000+ CPU. That's pretty impressive no matter who you support tbh.

As for Xeons not being made for gaming? Well, they ain't made for anything else now, given they are identical to their desktop parts. You might get extra cache on a £1000 CPU, but never enough to make a real difference.

Every time I have posted this I seem to have upset someone, so again, I want to point out that it wasn't done to infuriate the blue team; it was simply because they're both 8 core and both live in my house.

As for the quad core beating the crap out of the AMD? That all depends how many cherries you pick. When a game is coded properly you generally tend to see results like these.

http://www.techspot.com/review/956-dying-light-benchmarks/page5.html

IE the code is limited by the GPU, so all systems come back with about the same results. Sure, you can make an AMD look bad by running poorly threaded titles, but since when was that the fault of the CPU?

I've put my FX 8320 through an awful, awful lot of tests, and IPC wise it's about the same as an i7 920, only you get 8 cores. That's an awful lot of CPU for £100. If you can use them wisely for productivity? Then they can give a £1000 CPU a bloody nose.

I'm a core whore. Always was, always will be. The reason being, I run a lot of very well threaded software, so I can make use of them. As such I will never be able to bring myself to diss a piece of hardware because of awful software support.
 
It's the same argument as with Tesla and Quadro cards for rendering: there are other things that are not taken into account. A Xeon is designed to last and last in a server; an i7 or anything lower is not designed for that kind of reliability, and things like the extra cache are needed for server workloads.
Everything has a job tbh. That £45 dual core Pentium K does damn well for gaming.

That's like using a Lambo to go to ASDA lol, it's still a car... after all.
 

+1 This
 

Bad car analogies = bad.

The latest Xeons are nothing but Haswell chips. They're also reliable. It's been years since Intel made meaningfully different Xeons; heck, some of them even have the IGP intact.

Xeons these days? All they are is the desktop part with more cores, and in some cases they don't even have that. In fact, the Xeon with the IGP removed is actually identical to the 4790K, only with no IGP, and it's actually cheaper.

All I have here is a locked 8 core Ivy. Locked so that Intel can charge another grand for 300MHz more.

It's just a padlocking game these days, dude. That's all.

The reason the Intel lost? The padlock. Had it been unlocked it would have stood up there with the 5960X. I bet it would have clocked higher too, to offset the fact that Haswell has 5% over Ivy.

But that's not my fault, is it? I wasn't the one who locked the multi, volts and strap, was I? I just took what I found: £1000 worth of Intel CPU, benched against an AMD one costing £100.

Blame Intel for that.
 

You could have taken the 5960X, or even the other unlocked 6-cores, and the AMD would be lying in bed crying right now.
 
Next from the world of Vault-Tec...

Nissan GTR vs. Rolls Royce Phantom Drophead 2 door coupe showdown!!!

JR
 
And have you heard of binning? Those CPUs may have been fine, but they have also been tested and might not have been stable, and Intel has its name to think of. Either they charge £200 for everything, some good, some crap, or they sort out the best, test them to make sure they can do what they say they do, and lock those that can't. So they still work; better to lock something than throw it in the bin and add to landfill.
I'm sure if you look at most CPUs they are the same, just with the odd imperfection in the cheaper ones. If they did not do this, instead of £1000 a CPU would cost much more.

Yes, more cores can be great; look at GPU architecture, which is designed around many cores doing many, many calculations rather than fast, precise ones.

I'm not going to argue, but until you test literally every single processor this is not a valid test.

And even then it's still not an even test, as you are then comparing different architectures, with different CPU and motherboard interactions and software-to-CPU utilisation.

This test, for the most part, is like testing which water tastes best with just one thirsty guy who's been in the desert for 20 days... the answer would be the first one...

And I thought my car analogy was pretty damn good, if I'm honest.
 
Amazing really. You paint Intel in any bad light whatsoever* and people do get so upset.

Seriously, there are times when a defence is a good idea and others when it's just bad.

Intel are the ones going around with their padlock, locking out enthusiasts. But that's OK 'cause it's binning, yes.

LOL, it's a skank and you know it. Face it.

I wouldn't even mind if it wasn't totally deliberate. IE, if they had locked the multi to keep within temp and power limits but left the strap open, kinda like the Westmere Xeons. But no: lock, lock, lock, lock.

*There is no defending that, sorry.
 
The Xeon isn't that much cheaper. If you can afford two Titan Xs, the 300+ step from a Xeon to a 4960X shouldn't be that hard.

Now you're trying to drag us away from the point.

A £100 CPU should not, repeat, should not, be able to beat a £1000 Intel in anything, let alone nearly everything.

You're misdirecting your blame and trying to pin it on me. I didn't make it! They did.

Put down your bias and look at it clearly.

Locks are douchery. Nothing more, nothing less. As the clock speeds of these Xeons go up, so does the price. You pay over £200 per extra 100MHz.
 
Haha, I just explained why they do it. I think they have a bigger overall range than AMD... I think... so therefore they can produce one chip and spread it out.
AMD do the same thing, I was just saying.
I like AMD, I used them in my first gaming system, but I just prefer Intel for power consumption and waste heat.

I just want a fast PC that's quiet, and my Intel build has done just that.

If 8 cores is so good, where are the graphs for 12 core CPUs?

And did I not explain why they bin some things... maybe because the cheaper ones can't handle it as well and become unstable.

You know what, sod this, I'll just buy an Xbox or PlayStation... 'cos that has a CPU and a GPU and it can play games...
Why buy a Titan when 980s can outperform them for gaming at 1080p (I know you have 4K, just mentioning it)?

How much rendering do you do?
 
Well, Xeons are not priced for the average consumer who sometimes renders a video or something; they are for servers and VMs, and the companies who need 16 threads for a VM or a server have enough pocket money.
The locks on Xeons are understandable, because they are part of how those chips offer the highest reliability.
 
Every time I have posted this I seem to have upset someone....

I genuinely have no idea why. Please next time just post something so stupid that you get banned immediately. This was a bit half-assed IMO. Maybe get onto the subject of 970s and G-Sync; that seems to go down terribly ^_^

JR
 

They don't have a bigger overall range than AMD. They have their quad core market (and lower models, all of which come from the same production runs*) and they have their enterprise models.

So they have two production lines: one cranking out DC and similar Xeons, and one cranking out Haswell-E and similar Xeons.

AMD have Vishera, AM1, Jaguar, etc.

* A run makes CPUs. They are then tested, cores lasered out, speed binned etc. You don't need to speed bin a locked CPU beyond its actual clock speed.

Look man, I like everything made by everyone. I have both, including a 3970X which will do 5GHz for benching. In benching? Yeah, I can smash the AMD quite easily. I should hope so; it's an £800 CPU. However, what may shock you is that in gaming I notice no difference at all over my AMD, no matter what the clock speed. Well, that's not strictly true: the AMD only has a 7990, so I can use more anti-aliasing on the Titan Blacks.

But my point remains. For a gaming experience? There is no discernible difference between the two CPUs. I'm just glad I 'only' paid £300 for my 3970X. It's fantastic for benchmarking but nothing more.

There are areas of the market where Intel dominate. IPC is one of them, but only on two desktop models (not counting the E versions), because all of their others are locked.

So I repeat: I compared an 8 core CPU to another 8 core CPU. Yet I'm being told it's not fair. Hmm, funny that ! I don't think it's fair that two-core games are used to determine which CPU you should buy either, but I ain't crying over it.

Good old brainwashing. Intel are very good at that. Here you are trying to defend a locked £1000 CPU. I wish I had that sort of control over people.


I posted two sections of this. One as a shits and giggles test of two CPUs I own, and one to demonstrate core use.

I deliberately and logically pointed out that you SHOULD NOT focus on the first part; it was just me playing with my toys. Yet look at what's happened. No one cares about the actual solid thesis or the data; they just have their noses out of joint because Intel lost.

LOL, and you call me stupid?

Yeah, I'm incredibly stupid. :o
 