Intel Skylake i5 6600K & i7 6700K 1151 Z170 Review

The latest generation of Intel CPUs is finally here. We look at the top i5 and top i7, and run you through a brief overview of the Z170 chipset.



Tom, what's the word on ITX motherboards? So far as I can tell the only official ones right now are the ASRock and EVGA, and both appear to be lower end. The Gigabyte was featured in the AnandTech roundup, but isn't on the Gigabyte webpage, and Asus and MSI are MIA.

What gives?

My wife and I bought the Corsair Graphite 380T on June 1 when it was on sale at an insanely low price at NewEgg (probably a summer vacation deal for the kids), and we've been holding onto it for a Skylake ITX build. The two main boards we were expecting to be at the top of our list are missing in action!

Have you heard anything from Gigabyte on the Z170N Gaming 5, or Asus on the Impact, or MSI on whatever the heck they will be calling theirs?
 

Just a case of waiting for now dude
 
@tinytomlogan Did Intel (or someone else) provide you with any "official" safe voltages for these Skylake chips? They seem to need even higher voltages than the big 22nm Haswell-E CPUs.

Did Intel manage to make them more resistant to higher voltages, or are they just bad overclockers that will eventually die after a year at 1.4V?
 

If you read the review you'll see the CPU vcore and the cache voltage are now linked, and that is where the extra voltage comes from.
 

Can't find that part .-. is it on pages 3 & 4?

"Linked" in what way? Are the CPU cores actually taking all that voltage or not?
 
I really like the fact that they gave people more settings to tinker with. Haswell OC guides EVERYWHERE on the net were the exact same thing, basically this loop:
1. Raise the multiplier
2. Stability test
3. If it fails, tweak the voltage, then test again
4. Go to 1
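In rough Python pseudocode (purely illustrative; stability_test() is a made-up stand-in for whatever stress test you run, and all the numbers are invented, not real safe values):

def stability_test(multiplier, vcore):
    # Dummy stand-in for a real stress test (Prime95, AIDA64, etc.)
    return multiplier <= 46 and vcore >= 1.20 + 0.02 * (multiplier - 44)

def classic_oc_loop(multiplier=40, vcore=1.20, vcore_step=0.01, vcore_max=1.35):
    while True:
        multiplier += 1                                # 1. raise the multiplier
        while not stability_test(multiplier, vcore):   # 2. stability test
            vcore += vcore_step                        # 3. tweak voltage on failure, test again
            if vcore > vcore_max:                      # voltage ceiling hit: back off one notch
                return multiplier - 1, round(vcore - vcore_step, 2)
        # 4. "go to 1": the outer loop repeats with the higher multiplier

print(classic_oc_loop())   # with the dummy test above this tops out at 46x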

Also, Tom, do you have any of those Z170 boards that take DDR3 instead of DDR4? It's a great opportunity to compare the two technologies!
 

I for one would probably make the switch to Z170/Skylake if there was an Asus Impact board available with the same stunning looks and performance as the Z170-based ROG range. I am not impressed with the EVGA mITX offering and its mere 4 power phases, seeing as the next build I do is going to be my first fully custom loop and the whole idea is to see how far I can push an mITX rig. I ordered the Case Labs Nova X2M case 2 weeks ago and hope to start the build when it gets here by the end of August.

--Rick--
 

Impact is deffo coming, but they will give these mainstream ROG boards a chance first.
 
Tom,

Given that the 5820K can be had for £293-£300 and the 6700K is £286-£320, which do you think offers better value for someone like me who is still rocking a Core 2 Quad Q6600?

At the minute I'm torn between picking up a 4790K if I can get one below £200, or going with either the 6700K or the 5820K (the difference for me is about £40 between the 6700K and the 5820K).
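Just as a rough way of framing the value question for myself, a quick price-per-core comparison (midpoints of the prices quoted above; purely illustrative, and it obviously ignores clock speed and platform cost):

# Quick price-per-core/thread comparison using the prices quoted above
# (midpoints of the quoted ranges; illustrative only, prices move around).

chips = {
    "4790K  (4C/8T)":  (200, 4, 8),
    "6700K  (4C/8T)":  (303, 4, 8),   # midpoint of £286-£320
    "5820K  (6C/12T)": (296, 6, 12),  # midpoint of £293-£300
}

for name, (price, cores, threads) in chips.items():
    print(f"{name}: £{price} -> £{price / cores:.0f}/core, £{price / threads:.0f}/thread")

# The 5820K and a sub-£200 4790K work out about the same per core/thread;
# the 6700K is the most expensive per core at these prices.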

I'll be hopefully pairing my build with the Fury Nano :)
 

I would get a 5820K every day, dude. I know I'm not Tom, but trust me, I've seen what mine can do and it will beat a quad core all day.

Look to the future: DX12 will apparently use CPUs properly. There's no contest.
 

What do you mean by properly?

What's your typical power draw when you overclock your 5820K and it's under load? I can get a Corsair AX760 for £85, or the RM1000 for £40 more. In an ideal world I'd like to pair the processor with a Fury X2 when it arrives, but realistically it will be a Fury Nano and maybe CrossFire.
 

I mean address all of the cores a CPU has instead of just one. Up until Windows 8 no more than four cores were properly used by the OS itself, so people had to park cores and use all sorts of nasty hacks.

158W @ 4.6GHz, 1.28V

There's a very slim chance you will get 4.6GHz stable; aim for 4.2GHz.
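For a rough sense of how much dropping to 4.2GHz buys you in heat, a back-of-envelope sketch (my own assumption that dynamic power scales roughly with frequency times voltage squared; the 1.20V figure for 4.2GHz is a guess, not a measurement):

# Back-of-envelope only: dynamic CPU power scales roughly with f * V^2.
# The 158 W @ 4.6 GHz / 1.28 V point is from the post above;
# the 1.20 V figure for 4.2 GHz is an assumption for illustration.

def scale_power(p_ref, f_ref, v_ref, f_new, v_new):
    return p_ref * (f_new / f_ref) * (v_new / v_ref) ** 2

p_42 = scale_power(158, 4.6, 1.28, 4.2, 1.20)
print(f"Estimated package power at 4.2 GHz / 1.20 V: ~{p_42:.0f} W")
# -> roughly 127 W, i.e. noticeably less heat to deal with than at 4.6 GHz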
 

I keep feeling that DX12 will not be the magical thing so many believe it to be. Firstly, game developers will have to implement it; secondly, how many DX12 games are on the horizon? So far just one. I predict that the GPU will remain the thing to offload tasks to rather than the CPU, which makes more sense IMHO. By the time those few rare DX12 games that utilise more than 4 cores arrive, Skylake-E will be here. Meanwhile, Haswell-E owners might be playing games with 2 cores and twelve threads just sitting there, and at lower clocks, actually losing out on the performance a higher-clocked quad core offers. Hence I feel that Haswell-E remains a workstation chip and not a gamer's choice. But hey, maybe I'm wrong :)
 

PCPartPicker part list / Price breakdown by merchant

CPU: Intel Core i7-5820K 3.3GHz 6-Core Processor (£293.00)
CPU Cooler: Corsair H100i GTX 70.7 CFM Liquid CPU Cooler (£92.63 @ Amazon UK)
Motherboard: ASRock Fatal1ty X99M Killer Micro ATX LGA2011-3 Motherboard (£185.72 @ Amazon UK)
Memory: Corsair Vengeance LPX 16GB (2 x 8GB) DDR4-2666 Memory (£102.75 @ Ebuyer)
Storage: Samsung SM951 256GB M.2-2280 Solid State Drive (£119.10 @ Scan.co.uk)
Storage: Samsung 850 EVO-Series 1TB 2.5" Solid State Drive (£259.99 @ Amazon UK)
Case: Corsair Air 240 MicroATX Mid Tower Case (£70.58 @ More Computers)
Power Supply: Corsair RM 1000W 80+ Gold Certified Fully-Modular ATX Power Supply (£124.98 @ Novatech)
Total: £1248.75
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-08-07 21:50 BST+0100

I'm trying to decide between that Corsair RM1000 and an AX760.

At most I'll be putting Fury Nanos in CrossFire (fingers crossed). Would 760W be enough considering I'd be overclocking the 5820K, or should I play it safe and go 1000W?


There's always that risk with any new tech though, I guess. I can't find any clear proof that DirectX 12 will use CPUs better. If I look at current games such as Battlefield, The Witcher 3 and GTA, they all seem to use multiple cores. The only downside I'm seeing of X99 so far is power consumption. Anything more I should be aware of?
 

I'm not the most qualified person to talk about APIs; however, I should clarify a few things that I do know.

Before I start with the longer part, I should note that porting over to DX12 isn't that hard and really depends on how the current game's pipeline was created (IIRC from SPS, our very own game dev).

DX12 isn't about the GPU. It's purely CPU focused, as the CPU and memory bandwidth are always the bottleneck. Now, as you guys mentioned, DX11 can use multiple cores, however it is very limited and not exactly ideal. With DX11, using multiple cores wasn't the problem; it was the API and scheduling that were the issue. With DX11 everything was serialized from the CPU, meaning it would only send one task at a time to the GPU, and when you have 4 cores and 8 threads that makes the task list quite long, so you run into an optimisation issue. You also did not really have much control over what the API or the driver was doing, which introduced further problems.

This is why you have been hearing the terms draw calls and batch counts with Mantle and DX12: what they did was unserialize the scheduling and introduce parallel scheduling, which means draw calls and batch counts are drastically increased, which in turn means more units, more independent light sources from, say, thousands of units firing their weapons, and so on. That is the advance we are seeing. By making submission parallel and also allowing closer access to the hardware, developers have more control and can keep the GPU fed faster. That is why the GPU does better: because the CPU feeds it faster. That's really it, tbh. Especially with AMD hardware, the ACEs (asynchronous compute engines) on their GCN cards are designed for these parallel tasks, so they can take in more work from the CPU at the same time. The more data the GPU processes, the less time it spends waiting for the next batch, which boosts FPS.
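If it helps to picture the serialized-vs-parallel submission difference, here's a toy Python sketch (purely an analogy, not real Direct3D code; the batch sizes, worker count and timings are made up):

# Toy analogy only: DX11-style serialized submission vs DX12-style
# parallel command-list building. The "work" here is just a sleep.

import time
from concurrent.futures import ThreadPoolExecutor

def build_command_list(draw_calls):
    # Stand-in for the CPU-side cost of recording a batch of draw calls
    time.sleep(0.001 * draw_calls)
    return f"cmdlist({draw_calls} draws)"

BATCHES = [50] * 16   # 16 batches of 50 draw calls each

def dx11_style():
    # One thread records and submits everything, one batch at a time
    return [build_command_list(b) for b in BATCHES]

def dx12_style(workers=4):
    # Several threads record command lists in parallel before submission
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(build_command_list, BATCHES))

for name, fn in [("serialized (DX11-ish)", dx11_style), ("parallel (DX12-ish)", dx12_style)]:
    start = time.perf_counter()
    fn()
    print(f"{name}: {time.perf_counter() - start:.2f}s")

# The parallel version finishes the CPU-side work roughly 4x faster here,
# which is the whole point: the GPU gets fed sooner.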

Hopefully that's easy to follow. Also, there are currently 4 games set to support DX12, 2 of them rumored. The 2 rumored ones are The Witcher 3 and Batman, and with Batman undergoing serious changes I doubt it will happen. The 2 confirmed DX12 titles are the next Deus Ex by Square Enix and Ashes of the Singularity by Oxide Games, with their in-house Nitrous Engine built around DX12. DX12 just came out; it'll take until about next year before we get a lot more games supporting it.
 

A proper DX12 integration would most likely involve a refactor for most rendering engines (unless they were designed with DX12/Vulkan in mind). Simply porting a DX11 engine would not get the best performance from DX12; Microsoft has said it might actually decrease.

Also potentially any future UE4 game could support it.
 

It won't be magical. By the time you add in the improved graphics and so on, the FPS will no doubt end up around the same, but it's a start :)

I just want to see these massively cored CPUs being used properly (I'm a bit of a core whore).
 
No dude, and for now I'm not likely to bother with them either tbh.

Yeah, I'm not seeing the point of the whole DDR3L thing. If straight DDR3 could be carried over it would make some sense, but people have to buy DDR3L just like they would have to buy DDR4. There is no point in DDR3L without normal DDR3 compatibility.


Definitely go with the better power supply. If you OC the chip and run the CrossFire, you'll need it. I always tend to go a little overkill on mine to keep the load as close to the sweet spot as possible within my budget, the sweet spot being the 40-60% range of rated wattage. You can get away with less, but efficiency drops the closer you get to 100%, and you stress the components more, thus shortening the life of the PSU.
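As a rough illustration of that sweet-spot maths (all the wattages here are ballpark assumptions: the ~158W overclocked 5820K figure from earlier in the thread, a guessed ~180W per Fury Nano, and ~75W for the rest of the system; none of them measured):

# Ballpark sanity check for PSU sizing. All wattages are rough assumptions.

def psu_load(psu_watts, cpu=158, gpus=2, gpu_each=180, rest=75):
    # rest = motherboard, drives, fans, pump etc. (rough guess)
    draw = cpu + gpus * gpu_each + rest
    return draw, draw / psu_watts

for psu in (760, 1000):
    draw, frac = psu_load(psu)
    print(f"{psu}W PSU: ~{draw}W estimated draw, {frac:.0%} load, "
          f"in 40-60% sweet spot: {0.40 <= frac <= 0.60}")

# ~593W estimated -> ~78% of a 760W unit but ~59% of a 1000W unit,
# which is why the 1000W PSU lands in the sweet spot with headroom for OC spikes.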
 

I think it's just there to soften the blow of the DDR4 prices.


Cheers.

I've got about 4 builds I need to decide between. The only things that are consistent across them are the M.2 drive and the Air 240 case. Thought maybe I'd go i5 instead and then use the rest of the cash to fund a long-term water cooling loop?
 