Intel gets honest about 10nm; it will be less profitable than 22nm

Just makes AMD and Jim Keller look even more impressive IMO.

This has more to do with fabs than Intel vs AMD. This is just a pure mess-up on Intel's part. They have spent so much money and time that they cannot possibly make as much money as they wanted. It is a very bad ROI for them.
 
It's a combination of everything, dude. The whole architecture is a huge problem for Intel. AMD have shrunk what, twice, in just over a year? That's pretty incredible really, and it says something about their whole design and architecture.

There's a reason why they hired Keller..... (Intel I mean).

I suspected clock speeds were their Achilles' heel, and the facts seem to bolster that.

It's kinda funny that once AMD get to Zen 3, Intel are going to be left in the dust somewhere.
 
Exactly. If Zen had never released, we would already have desktop 10nm parts, and the top-end consumer part would be a 4-core, 8-thread i7 at 4GHz or so.
 
The worry I see in the future now, though, is that with Intel stagnant, AMD will surely rise and dominate. However, if there is no pressure or competition from Intel, then AMD will no longer feel obliged to push the limits of Zen, and our market will once again see only incremental increases per generation. Only this time it is AMD with all the cards in hand.

As much as we all dislike Intel's practices, we cannot dispute that their architecture was good (excluding the security flaws). We need Intel to continue their push for market domination in order to keep all parties competing.
 
Intel can afford this stagnation; AMD can't. AMD just cannot sit back and enjoy it. They have made a scratch in Intel's armour and provoked a response other than "increase prices". Intel will be back firing with all guns, and by that moment AMD needs to be as far along as they can with their development. There won't be stagnation from AMD.

Intel is still ruling the mobile market. And with the recent price drops in the server market, Intel is still a really good option. Even though Intel is still a bit more expensive in price per core (and per unit of performance), you still get Intel support staff with their products, and in the server market that is worth its weight in gold.

Who will win in the next 2-5 years really doesn't matter. What matters is that there is progress, and you have a choice. 3 years ago you didn't. We were stuck on 4 cores for mainstream for almost a decade, and we still would be if not for AMD. Because of AMD we now have 16 cores on the mainstream platform in just 2 years, and pretty much unlimited cores on HEDT. Applications are getting reworked for multi-core support because of AMD. Intel is dropping prices because of AMD; that alone is a miracle.
 
To be honest, desktop software has a lot of catching up to do before any of that will be a problem.

Of course, during this time Intel need to remain competitive, and they have already dropped prices. Nothing AMD do will ever make Intel go away. Ever. So there will be competition, at least until Intel scrap their current tech and move on, and I can't see them doing that any time soon. They may have to increase socket sizes in desktop boards to fit more cores, but at the end of the day they'll do what they have to.

What I am saying is that I can't see any of us saying "Oh crap, 16 cores is not enough, I need more in a desktop board now!" for a very long time.

The consoles have what, 8 CPU cores? Then that is what you will need for gaming until the next gen, 5 years or so from now.
 
I'd agree on the core side of things. I felt 8c/16t is plenty, which is why I got first-gen Ryzen; it's enough, I feel, and it was a far cheaper upgrade.

I think Intel is going to be fine long term. The issue I see, mainly on the consumer side of the market, is that AMD are selling chips and gaining share, and maybe I'm wrong, but in my view, when you upgrade it's for the next several years, waiting for the next performance leap. So regardless of how either company stays competitive, once sales are made those users won't generally upgrade for 4 years, unless they are always upgrading and selling on their old hardware sooner.

I think Intel's biggest issue isn't the CPU side; that will bounce back sooner. It's their current efforts on the GPU side I'm unsure about. It seems a long way off before they are in line with Nvidia or AMD. They have the resources to bring something good out; I'm just not sure the implementation will be anything decent for a long time.
 
CPUs haven't really been interesting since GPUs got stupidly powerful many years ago.

Back in the mid-to-late 90s when MMX came out, a CPU that supported it was much faster than a regular Pentium in games that used it, but since then no CPU technology has really helped games apart from the usual generational improvements.

It's only recently that 4 cores aren't enough for high-end gaming any more. That was the standard for over a decade, though.

But yeah, 8 cores is more than enough for now and you can get them really cheap, especially something like a Ryzen 1700.
 
Yep, a 1700 is what I have. At 4GHz it doesn't break a sweat, tbh. Compared to a higher-clocked 4-core, the biggest noticeable difference is when gaming, streaming, watching a stream and playing music all at the same time; switching about is a hell of a lot smoother in every way.
 
It will be.

Something reviewers and sites never seem to cover is multitasking. When I get obsessive on my PC and end up running about ten apps, I can really notice a difference. I used a quad-core CPU last year for about two weeks. It wasn't even bad (Haswell), but it just fell apart as soon as I started cranking on it. And that was just at desktop level.

I guess I use my machine more as a workstation than for gaming. In fact, I'm usually designing and drawing stuff and then running more programs to cut it on the fly as I test sizes and so on. That's why I've been on lots of cores since Windows 8: I can really tell the difference.

I'm not fussed about the fastest gaming experience either. As long as I can get my 70Hz monitor to sing, I am very happy. I know quite a few guys who bought a 2080 Ti and then sold it and got a 2070 because it was more than enough.

I don't "FPS watch"; it's about as boring and distracting as trainspotting. I know when a game isn't running well; I don't need a counter to show me, lol.

It's been a very long time since I had a bad gaming experience. Probably Titan Blacks in SLI and then CrossFire on two Fury Xs. That was bad, like really bad. Trust me to stay on SLI just as it was dying, then switch to CFX, which was worse.
 
People are so damn hooked on the 5GHz concept; I wonder why when it comes to gaming.

There is almost nothing gained over 4.4GHz, except a hotter and hungrier CPU. It's all about the GPU these days, and we can see that a 7700K still won't bottleneck a 2080 Ti at what I like to consider the new mainstream resolution of 1440p. And the 7700K is a few years old now.

4GHz with extra cores is the direction we should be going.
 
I know that, you know that, Intel have been using it as a selling point for years.

So I guess it serves them right for depending on it when they should have been pulling their finger out of their arse and pushing on.

I hold the second highest benchmark score for the 2070 Super on OCUK, and my CPU runs at around 2.4GHz.
 
Higher clock speeds do matter for gaming; 5GHz can and will still see a benefit.

The issue is that games are generally far more graphically intense now, so throwing more CPU power at them does next to nothing except help with multitasking (like using a browser), and that's not worth the power cost.

In games like Total War, higher clock speeds matter a lot, since that game fails to take advantage of more than essentially 4 fully used threads.
 