Intel CEO wants to see the industry shift its focus away from "benchmarks"

Tbh I think the problem is more that Intel used to be so obsessed with specific, unrepresentative synthetic benchmarks than that they are now opposed to them, at least as long as this is a genuine shift and they stay opposed even when those benchmarks put them ahead.

That said, marketing departments will use anything they get, and they're used to working with and pumping out much flimsier material than some somewhat unrepresentative benchmarks, so yeah, I doubt this will last as a whole-company message.

Still, shifting resources towards optimising actual real-world applications is always a good thing imo. Many top benchmarks are close to optimal in their efficiency and use of new instructions, but for them to be representative, real applications need to do the same. (And AMD will get a lot of this optimisation done for free in gaming because of their console design wins.)
 
LMAO @ Intel's attitude now.
Benches were fine when they were at the top, dishing out tiny increments of performance with each iteration for a decade.
Suck it up, buttercup.
 
Part of Intel's success was that they were good at everything, even niche workloads. Even today, there are workloads where Ryzen isn't as good as Intel. So yes, Intel's push to be good in niche use cases did work out for them.

As far as Intel's real-world strategy goes, I feel that they push the benchmarks that suit them. It was argued that Cinebench R20 wasn't useful because not many people use Cinema 4D, but then they pushed MATLAB 2019, software that also has a small (albeit dedicated) userbase. The funny thing was that MATLAB 2020 enabled AVX support on AMD CPUs and took away Intel's advantage.

When Intel moves on from Skylake, it will be a BIG jump, but a lot of that is because Intel has had to postpone five years' worth of architectures over their 10nm kerfuffle. That's a lot of time to work on IPC increases and new features.

While focusing on impact is a good thing, looking at niche applications is important too. Reworking one area could benefit multiple other workloads, and boasting a 20+% gain in a specific workload is a big deal for certain markets.

TBH, I think Intel needs to push AVX-512 onto its mainstream desktop processors. That way, they can get software developers to utilise it and give Intel a significant performance advantage across applicable workloads. AMD can't match that right now, but the longer they wait, the more time AMD has to catch up.
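For illustration, here's a minimal sketch of what "utilising it" means in application code, assuming a GCC/Clang toolchain; the intrinsics are real AVX-512F ones, while the function itself is a hypothetical example:

```c
/* Illustrative sketch only: summing two float arrays with AVX-512 where
 * available. Compile with -mavx512f (GCC/Clang) to enable the wide path;
 * the scalar loop doubles as the fallback on CPUs without AVX-512. */
#include <immintrin.h>
#include <stddef.h>

void add_arrays(float *dst, const float *a, const float *b, size_t n) {
    size_t i = 0;
#ifdef __AVX512F__
    /* 16 floats per iteration in one 512-bit register. */
    for (; i + 16 <= n; i += 16) {
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        _mm512_storeu_ps(dst + i, _mm512_add_ps(va, vb));
    }
#endif
    /* Scalar tail (and the whole loop on non-AVX-512 parts). */
    for (; i < n; i++)
        dst[i] = a[i] + b[i];
}
```

The chicken-and-egg problem is visible in the #ifdef: the wide path only ends up in shipping binaries if developers can assume the hardware support, which is the argument for putting AVX-512 in mainstream parts.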
 
Yeah, this would definitely need to be a move away from cherry-picking in general to warrant praise; if it's just a switch from cherry-picking benchmarks to cherry-picking applications, ofc that's no more useful.

If it's a move to assessing a wider range of applications and looking for weak points in order to address them architecturally, in compilers, or in coding techniques, rather than finding weak points in order to ignore them and drown them out with marketing, then that'd be (is?) a positive step.
 
So Intel has run out of straws to clutch at...

A highly questionable BS excuse on their part, I feel; perhaps they aren't so confident in their future products after all.
 
More than likely. If the tables were turned, Intel would be screaming benchmarks from the rooftops.
 
What's funny is that most "benchmarks" are not synthetic or unrealistic any more. They are demonstrations of a fixed test in a piece of software you would use every day: Blender, Handbrake, games, etc. Just because a game has a built-in benchmark doesn't mean the results are falsified, as Intel are whining. In fact, many reviewers create their own render tests to make sure the results stay accurate.
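For illustration, the "fixed test" idea is roughly the following sketch; the array sum is a hypothetical stand-in for a fixed render or encode job:

```c
/* Illustrative sketch of a fixed, repeatable test: the identical workload
 * runs every time, so differences between results come from the hardware,
 * just like a canned Blender scene or Handbrake encode. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1u << 26)  /* fixed input size: every run measures the same work */

int main(void) {
    float *a = malloc(N * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < N; i++)
        a[i] = (float)(i & 0xff);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    double sum = 0.0;                  /* the "work" being timed */
    for (size_t i = 0; i < N; i++)
        sum += a[i];

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;

    printf("checksum %.0f, elapsed %.3f s\n", sum, secs);
    free(a);
    return 0;
}
```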

I'm not sure how they think they can get around losing like this, because no matter what software people use, they are going to lose anyway: their technology is inferior.

I think they're blaming benchmarks for making their product look bad when, well, you know... the product just is bad.
 
So true
 
Oh yeah, definitely, but that's somewhat beside their point, which, judging from the quotes, is not that they don't accept benchmark results, but that they want to shift away from using benchmarks as the primary guiding influence for their hardware and compilers.

Intel has famously spent a lot of money historically optimising their compilers to maximise benchmark results, and if they now see it as more useful to shift that money to optimising for the applications themselves, then in theory that means better end-user performance across a wider range of tasks. (Of course, while many benchmarks are derived from real applications, the focus from a design/development perspective should still be on optimising a range of applications rather than just the workloads with popular benchmarks derived from them, because two pieces of software designed to do the same thing can take wildly different approaches.)

In layman's terms, they seem to be claiming they're going to spend less engineering time and R&D money on "marketing performance" and more on end-user performance, if what they're saying is true.
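For illustration, here's a minimal sketch of the kind of runtime CPU dispatch that this compiler/library work enables, assuming GCC or Clang on x86 (`__builtin_cpu_init` and `__builtin_cpu_supports` are real builtins there; the kernel functions are hypothetical stand-ins):

```c
/* Illustrative sketch of runtime CPU dispatch: pick the fastest code path
 * for the CPU the program is actually running on, instead of tuning for a
 * single benchmark machine. */
#include <stdio.h>

static void kernel_avx2(void) { puts("AVX2 path"); }
static void kernel_sse2(void) { puts("baseline SSE2 path"); }

int main(void) {
    __builtin_cpu_init();  /* must run before __builtin_cpu_supports */
    if (__builtin_cpu_supports("avx2"))
        kernel_avx2();     /* modern Intel *and* AMD cores take this path */
    else
        kernel_sse2();
    return 0;
}
```

Dispatching on CPU features rather than vendor strings is also what fixed the MATLAB example above: once the AVX path is taken on any CPU that reports AVX, the vendor advantage disappears.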
 
Not to take anything away from Ryzen CPUs, they are amazing, but to some degree he has a point.

For example, everyone is hyping AMD CPUs for Blender and other 3D modelling/rendering apps. That is wrong. Yes, they are good for rendering, but you don't render with the CPU: a 2080 Ti with OptiX beats a 3970X hard at half the price. Viewport performance, which is single-threaded, is much better on Intel CPUs. So Cinebench, V-Ray, Corona, etc. benchmarks are completely misleading and pointless.

Video editing, again, is dependent on the application. Resolve cares more about the GPU than the CPU.

Everyone is talking about how AMD CPUs are good for streaming. Again, wrong. You don't stream with the CPU; you stream with the new NVENC, without any drop in performance.

The real answer is: "It depends."

Edit: Gen 4 NVMe performance figures are completely misleading. The Samsung 970 Pro, a Gen 3 NVMe drive, beats all of them quite heavily in real-world tasks. The Intel 905P, with half the "rated speeds", beats everything, full stop.
 
A very informative, constructive, and succinct post; you've even cleared up some information I was unsure of.

For me it comes down to the blend of hardware best optimised for my needs, which leads back to your post. Thank you!
 
I actually stopped caring about benchmarks long ago.

The only ones of value to me now are 3DMark for the CPU/GPU combination I have, and literal FPS in games.

Blender, Cinema 4D, etc. are so fast now that I could not care less about a CPU that saves me 10 seconds of render time. Whoop whoop, 10 seconds of my life to do something else.

When I have software so taxing that every second is vitally important, then maybe it would be a consideration. But right now I only care about the cost, and that is why I'm still rocking a highly overclockable 7700K. The most worthwhile upgrade for me is my GPU, at least until it gets bottlenecked.
 
Agreed. I'm rocking an 8700K at 4.6GHz, more than enough for even a 2080 Ti.

Idk about you, but I'm personally waiting for the next CPU platform that has DDR5 memory as standard, which will probably be Zen 4. I feel that's the most worthwhile upgrade you can make for the money at this point in time. Until then, a better GPU is the only way to make meaningful performance gains.
 
I feel like everyone is slowly converging on the same argument, which essentially boils down to... 'horses for courses'.
 