So my knowledge on silicon and the like is lacking so I'm curious.
If these do turn out to be k chips that lack the iGPU could this mean they run cooler and are potentially better overclockers because there is one less thing the chip is having to run?
No. The chips with GPU features are able to turn it off when not in use, so there would be no difference. At least that's what I've read today elsewhere, because I thought the same thing.
Well, if they have fewer features they should be cheaper.
Though I'd imagine people aren't willing to pay much for the privilege of having an iGPU, so chances are we might see a shift towards GPU-less CPUs at current prices and chips with an iGPU going up...
If it is a K chip without an iGPU, then the complexity of the silicon goes down and the die has a smaller footprint. This should lead to much higher yields, as you can fit more chips per wafer and more of them are likely to be viable.
While the GPU is almost a third of the area of the ~150mm^2 die, it would make far more economic sense for them to just recycle CL dies with defects in the GPU area by fusing it off than to create a new GPU-less design, unless they had essentially perfect yields with no defects to salvage.
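For anyone who wants to play with the numbers, here's a quick Python sketch using the standard gross-dies-per-wafer approximation and a simple Poisson yield model. The die areas (~150mm^2 full die, ~100mm^2 hypothetical GPU-less die) and the defect density are placeholder assumptions for illustration, not real Intel figures.

```python
# Back-of-envelope: gross dies per 300 mm wafer and yield for a
# ~150 mm^2 die vs. a hypothetical ~100 mm^2 GPU-less die.
# All numbers (die sizes, defect density) are illustrative assumptions.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard rough estimate: wafer area / die area, minus an
    edge-loss correction proportional to the die's linear size."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Classic Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001  # assumed defect density in defects/mm^2 (illustrative)
for label, area in [("full CL die (~150 mm^2)", 150.0),
                    ("GPU-less die (~100 mm^2)", 100.0)]:
    gross = dies_per_wafer(area)
    y = poisson_yield(area, D0)
    print(f"{label}: {gross} gross dies, yield {y:.1%}, "
          f"~{gross * y:.0f} good dies/wafer")
```

In this toy model the smaller die does get you noticeably more good dies per wafer, but at mature-process defect densities the full die already yields quite well, which is the point above: salvaging dies with a dead GPU by fusing it off tends to beat paying for a separate GPU-less tape-out.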
In other words, they are thinking it's "Kinda Funny" how many fools will buy them thinking they are worth the added cost, as they will be marketed as better overclockers.
nVidia and Intel are both losing their minds with pricing etc., but they still get bought by people; personally I won't be buying in the future unless there is a major performance improvement over AMD.
Ivy Bridge and earlier lines generally had at least one P model with the GPU fused off and either a clock bump or a TDP drop; it's only with Haswell that they stopped selling GPU-less mainstream parts (though the desktop parts still generally have part of the GPU disabled), and they also stopped updating their GPU architecture around then. Presumably this is so they can start easing HEDT parts into their mainstream line-ups without a notable loss of a feature some people quite like (the encoder/decoder/QuickSync bits of the GPU), in case AMD steps towards double-digit core counts.