2500k not always enough for xf 79x0?

Master&Puppet

New member
I have just been looking around for confirmation regarding the parts for my next pc, and having decided on 2x 7950s, I did some quick googling to find out which cpus people were using.

What I found is an emerging consensus that a 2500k is beginning to bottleneck 2 overclocked 7970s in large multiplayer scenarios such as 64-player BF3 and 25-man WoW instances.

I've seen a couple of threads mentioning this now, but here is one for reference. Scroll down to the bottom of the page to see Afterburner reporting gpu usage with HT vs non-HT on a 2600k:

http://www.overclock.net/t/1196856/official-amd-radeon-hd-7950-7970-owners-thread/2440

The usage is choppy and averages lower with HT off, which suggests that crossfired 7970s are beginning to be bottlenecked by an overclocked 2500k at 1080p in cpu-intensive games.

The effect of the bottleneck is substantial, but not so much that it would affect smooth gameplay.

2600k results were apparently averaging in the 90+ region.

2500k/2600k non HT results were dipping at times towards 50-60 or even down to 30 in certain 25 man WOW instances.

Interesting I thought...

M&P
 
Surprising, but I can understand since hyper-threading is not enabled... Well, this seals the deal for me: 3770k here I come.

Thanks for sharing this.
 
Not a cat-in-hell's chance.

Online is the issue.

You could have a quad-SLI 4GB GTX 680 setup on a twin 50-core Xeon mobo and you'd get the same thing.

Neither WoW nor BF3 is cpu-intensive (what I'd call intensive, anyway), according to the tests I did last week and the G15.

(no more bottlenecks please)
 
sorry rasta, not often i do this but i don't agree with you mate.

everybody knows that when the cpu gets bogged down it will starve the gpu (just search the f@h forums), and when you have a big raiding party in wow or are on a big 64-slot server in bf3 your cpu does take a hammering.

that being said, we are also talking about running 2 pcie 3.0 cards in half-speed pcie 2.0 slots, which i think might also have some impact.

fyi, i don't value any gpu usage screens which don't show cpu usage at the same time.
 
And how did they arrive at that conclusion? I have seen MW2 drop to 10fps on an i7 990X, 3x GTX 580 rig when playing online. It has more to do with bandwidth than cpu intensity.

If you're an online gamer, ditch the 7950s and get a 7870 or 7850 and just lower either the res or the textures. IT WILL IMPROVE FPS AND OVERALL GAMEPLAY. Less work for everybody means higher efficiency.

Kazri
 
sorry rasta, not often i do this but i don't agree with you mate. everybody knows that when the cpu gets bogged down it will starve the gpu (just search the f@h forums), and when you have a big raiding party in wow or are on a big 64-slot server in bf3 your cpu does take a hammering. that being said, we are also talking about running 2 pcie 3.0 cards in half-speed pcie 2.0 slots, which i think might also have some impact. fyi, i don't value any gpu usage screens which don't show cpu usage at the same time.

The delay everyone will experience during online gaming, from a single-core AMD 64 user (probably too extreme an example) to an Intel 12-core, with anything from onboard graphics (perhaps also too extreme) to the best-gpu-setup-in-the-world-ever, is the ping.

Now to stop there and explain: it's not just your ping, it's also the ping times of all the other users to the server you're using, the syncing, and then the trip back to you. You could have a 5ms ping, but you have to factor in the others. Each of these pieces of information is supposed to get back to you within a frame, which will almost never happen. Vsync off is generally a good idea when playing online.
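To put rough numbers on that, here is a toy model (illustrative figures only, not measurements of any real game or server) of how another player's action reaches your screen via the server, compared with a single frame time:

```python
# Toy model: latency of another player's action reaching your screen.
# All figures are illustrative assumptions, not measured values.

def action_latency_ms(their_ping_ms, server_tick_ms, your_ping_ms):
    """One-way trip from them to the server, one tick of server
    processing, then one-way trip from the server to you.
    Pings are round trips, so halve them for one-way time."""
    return their_ping_ms / 2 + server_tick_ms + your_ping_ms / 2

frame_time_ms = 1000 / 60  # ~16.7 ms per frame at 60 fps

# Even with a 5 ms ping yourself, a peer on 80 ms dominates the delay.
latency = action_latency_ms(their_ping_ms=80, server_tick_ms=15, your_ping_ms=5)
print(f"action latency {latency:.1f} ms vs frame time {frame_time_ms:.1f} ms")
```

Under these assumed numbers the round trip is several frames long, which is the point being made: no local hardware upgrade shrinks it.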

In a free-world online game (an MMORPG or whatever), running along happily by yourself, you will get the best fps. When another user enters your proximate playing area, even just one, your computer will ever so slightly pause. In some online games that treat mobs as players, the same happens when they spawn.

The calculations of the other players'/mobs' actions within the environment are, in newer games, mostly done server-side. Your computer, for all intents and purposes, is giving you an "offline" experience, with the addition of non-AI objects it has to query the server about. This is where the multiple pings of the other players take effect.

The meanest, most bad-ass gaming computer will do the same. So will the cheapest budget one. The base gameplay experiences are obviously different.

I keep the G15 showing cpu and main memory usage at all times. It's a habit.

What running the cpu at 100% does (effectively running F@H or another benchmark that maxes out the cpu and all its cores/threads) is slow Windows down. That's what you would expect: you're taking up all the computer's processing power, then expecting it to do other things in addition, like handling a lot of gpu work, folding or transcoding. (I folded for several years, gpu and cpu, on dozens of rigs.)

Neither of these games runs the cpu at 100%, not even close. The best I've seen in any game other than the Crysis 2 beta (not evident in the released version) is one core at 100%. If you have 4 cores or threads, this shows as the cpu being used at around 25%. Add another 4 cores and it would hover around 12.5%. A dual core will show 50% (100 divided by 2). But this is only a percentage of what's available, so it doesn't mean the 50% setup runs better than the 25% one, e.g. a Q6600 vs an E6600. (The max one-core value will fluctuate depending on what other programs, including Windows, are doing in the background. What to keep in mind here is that if the game is using one core's worth of the cpu, the balance of the cpu is used for everything else, through Windows.)
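The arithmetic behind those percentages can be sketched as a one-liner (a back-of-envelope calculation of what Task Manager would report, not a measurement):

```python
# If a game saturates exactly one core, overall CPU usage reads as
# 100% divided by the number of logical cores/threads.

def overall_usage_pct(cores_saturated, total_cores):
    """Overall CPU usage shown when `cores_saturated` cores are
    pegged at 100% on a machine with `total_cores` logical cores."""
    return 100.0 * cores_saturated / total_cores

print(overall_usage_pct(1, 2))  # dual core:  one core maxed reads as 50.0
print(overall_usage_pct(1, 4))  # quad core:  one core maxed reads as 25.0
print(overall_usage_pct(1, 8))  # 8 threads:  one core maxed reads as 12.5
```

As the post says, the same one-core workload reads as a different overall percentage on different chips, which is why a raw usage figure alone proves little.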

What totally dismisses the idea of any cpu affecting a gpu setup's performance is to run 2 tests: first an offline game, then an online game. If there is no hold-back on the offline game (which there won't be), but in the online game there are issues, the answer is there. The difference is you being online.

I've been through and studied the effects of online gaming on much bigger scales than WoW instances and BF3. I've seen a 300000-3DMark11-esque bad-ass computer suddenly get thousands of players jumping through a portal during wars... it grinds to a standstill. In a lot of cases the game will drop you out.
 
Interesting. Not being a massive online gamer I have never considered that everyone's fps would be slowed by networking.

The part of that thread that interested me was that the only apparent change was turning HT on or off, but of course there is no accounting for heavier network usage, and yes, it would have been useful to have more info beyond the AB graph. But that's why I posted it.

As for the 'no more bottleneck threads' and '/close' posts: well lads, I'm afraid they will happen every time a new cpu or gpu is released. Thanks to everyone who has contributed so far (Pexon - your post could have been more useful dude).

I believe that the main reason these threads keep getting written is because they haven't been addressed properly or aren't signposted clearly enough for people to find.

I would like to see a graph which cross-references benchmarks for the amount of data a cpu can push out against the amount a gpu (or multiple gpus) can absorb. It's more work than saying 'just get a 2500k', but it will no doubt help a lot of people and hopefully reduce the number of repetitive new-build posts and, incidentally, this type of post I've just made lol.

Yes, I'm hard to please! And I'm sure that is more effort than most people need.
 
Master&Puppet said:
I would like to see a graph which cross-references benchmarks for the amount of data a cpu can push out against the amount a gpu (or multiple gpus) can absorb. It's more work than saying 'just get a 2500k', but it will no doubt help a lot of people and hopefully reduce the number of repetitive new-build posts and, incidentally, this type of post I've just made lol. Yes, I'm hard to please! And I'm sure that is more effort than most people need.
Someone did a test similar to that, but it was soooo long ago that it's too outdated to refer to. I think it was one of the only interesting reads I got out of Tom'sHardware at the time. The idea was good, the tests were good, but what I made of it, and apparently what a lot of other people got out of it, were 2 different things.

Essentially, it was along the lines of taking kit like a Q6600 and an 8800 Ultra, and recording bench results from an underclocked cpu (i.e. standing in for another cpu, for argument's sake) with a stock gpu, then overclocking the gpu. Then change the cpu clock and repeat, all the way up to the cpu's max OC and the gpu's max OC. Lots of pretty graphs etc etc. (I'm sure it was Tom's.)

Anywho, when you looked at the final graph, it showed cpu speed versus gpu bench going up uniformly... up to around 3.6ghz, at which point the additional gpu benchmark points started to trail off... up to about 4.05ghz (which I think was the max he could do), where there was just no gpu improvement at all.

What many read into this was "even at 4ghz, the Q6600 can't perform quickly enough to keep up with the gpu" - "Bottleneck !!! Blellttleneecckk//AA!!/??pk!!..". What it really showed was that if you got a Q6600 up to 3.6ghz, there wasn't ~really~ that much point in going further, as the rate of performance increase was negligible. I think the test guy even printed those words in his conclusion. Nevertheless - Bottleneckz0rs!!

Of course, we're miles away from that tech now and could do with a refresher. You volunteering?
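The sweep described above can be sketched in a few lines. The response curve here is entirely made up purely to illustrate the "uniform rise, then plateau" shape the article reportedly found; `bench()` stands in for a real GPU benchmark run:

```python
# Sketch of a CPU-clock sweep against a GPU benchmark score.
# The curve is a toy model (assumed plateau at 3.6 GHz), not real data.

def bench(cpu_ghz, plateau_ghz=3.6, max_score=10000):
    """Toy model: score scales with CPU clock until the GPU becomes
    the limiting factor around plateau_ghz, then flattens out."""
    return max_score * min(cpu_ghz, plateau_ghz) / plateau_ghz

for ghz in (2.4, 2.8, 3.2, 3.6, 4.0):
    print(f"{ghz:.1f} GHz -> score {bench(ghz):.0f}")
```

Past the plateau the score stops rising, which supports either reading of the graph: "the cpu can't keep up" and "no point overclocking further" describe the same flat line.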
 
Excuse the punctuation in that. For some reason when I go to edit on this forum now in Windows, it turns into a basic text thing and mashes up all the formatting. Very annoying JIM!

(I'll sort it out on the mac later probably)
 
Of course, we're miles away from that tech now and could do with a refresher. You volunteering?

Actually, that is exactly the kind of thing I would do! I did a topic a few weeks back on 6- vs 4-core gaming - that's where I had the idea for this graph originally.

I'm not exactly sure how I would do it since I'm not familiar with benching. I would have to find a test which scores the gpu output in as real a gaming environment as possible, which measures overall gpu output as well as cpu input discretely, yet also lets the cpu score limit the gpu score. Does that make sense?

I'm thinking 3DMark Vantage. I believe the cpu score will affect the gpu score, so:

If I underclocked a cpu I would see a reduction in the cpu score but not in the gpu score, until the cpu became a bottleneck, and I'd mark that point.

Furthermore to make it as real-world as possible I'm thinking a top down approach:

1. Take a performance standard - say an average of 60 fps in the most modern/popular games.

2a. Use 3DMark Vantage and overclock/underclock the gpu to find the gpu score needed to achieve point 1.

2b. Suggest gpus which are able to reach this score, and at what clock.

3a. Without touching the gpu, underclock the cpu until the score in 2a is not achieved (even though we know the gpu is capable of it).

3b. Suggest cpus which are able to reach this cpu score in 3a and at what clock.

Initially I'm thinking 1080p, max in-game settings, but that can be expanded to include lower settings and multiple screens.

How does that sound?
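Step 3a above could be automated along these lines. Everything here is a hypothetical sketch: `run_gpu_bench` and the clock values stand in for real benchmark tooling and are stubbed with a toy model so the loop is runnable:

```python
# Sketch of step 3a: walk the CPU clock down until the GPU score
# misses the target, then report the last clock that still hit it.
# run_gpu_bench is a hypothetical stand-in, stubbed with a toy model.

TARGET_SCORE = 9000  # the gpu score established in step 2a (assumed)

def run_gpu_bench(cpu_ghz):
    # Stub: pretend the GPU score plateaus once the CPU is fast enough.
    return min(10000, cpu_ghz * 2800)

def find_min_cpu_clock(start_ghz=4.5, step_ghz=0.1):
    """Lower the clock one step at a time; stop when the next step
    down would miss TARGET_SCORE, and return the current clock."""
    ghz = start_ghz
    while ghz > step_ghz and run_gpu_bench(ghz - step_ghz) >= TARGET_SCORE:
        ghz = round(ghz - step_ghz, 2)
    return ghz

print(f"lowest clock still hitting {TARGET_SCORE}: {find_min_cpu_clock():.2f} GHz")
```

The same loop, pointed at a real benchmark instead of the stub, would give the "cpu clock needed for this gpu score" figure the proposed graph needs.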
 
sorry rasta, not often i do this but i don't agree with you mate.

everybody knows that when the cpu gets bogged down it will starve the gpu (just search the f@h forums), and when you have a big raiding party in wow or are on a big 64-slot server in bf3 your cpu does take a hammering.

that being said, we are also talking about running 2 pcie 3.0 cards in half-speed pcie 2.0 slots, which i think might also have some impact.

fyi, i don't value any gpu usage screens which don't show cpu usage at the same time.

I disagree. A 64-man Metro rush server in BF3 with maxed settings (except 2x AA) keeps all four cores of my stock 2500k (3.3ghz, 3.6ghz turbo) at about 75-80% load, and I only have a GTX 570 Superclocked version.

Also, I don't really think a 2500k will "bottleneck" two 7950s. I've seen dual 580s in SLI on a 2500k. Dropping $800 on gpus while spending $200 on a cpu might get you some bashing, but you won't take a hit in performance, imo.
 
Hmm, that's interesting. I didn't think an overclocked 2500K would bottleneck two high-end cards...

It probably doesn't - I think I was led astray by network speed during online games.

Looking to order the gpus for my next build today, and the whole rig should be running in a couple of weeks. Then I'll start making these graphs to put numbers to arguments!
 
Cool, cheers KoS. You have always said that the 2500k is more than enough, and I do agree with you even if I seem to challenge it quite often!
 
Just fyi, I never see my cpu get past 40% in BF3 in a 64-player match. 2500k @ 4.6ghz.

That'd be around what I'd expect, tbh. 40% ceiling: game + OS + blah blah. The game possibly 25% max and 15% for whatever else.

I'm gaming on a 2600k @ 4.5 for now, which for all purposes is in the same ballpark.

I disagree. A 64-man Metro rush server in BF3 with maxed settings (except 2x AA) keeps all four cores of my stock 2500k (3.3ghz, 3.6ghz turbo) at about 75-80% load, and I only have a GTX 570 Superclocked version. Also, I don't really think a 2500k will "bottleneck" two 7950s. I've seen dual 580s in SLI on a 2500k. Dropping $800 on gpus while spending $200 on a cpu might get you some bashing, but you won't take a hit in performance, imo.

Hmm, if I found a game running 75-80% across all cores, I'd be looking at what issues I had. It'd be like folding whilst gaming. And to be fair, gaming doesn't use a fraction of the processing that something like folding does.
 
my 25k sits around 65/70% while playing too, but i, like lysol above, only run 1 g card, which makes my stats pointless for this matter.

what i do know is that a mate who also plays bf3 and has a multi-gpu setup has said he gets (big) drops in fps when he turns off ht on his 27k. his 27k does 5ghz easy with ht off but only 4.8ghz with it on (limited to daily-use voltage), but with only 4 cores at 5ghz he gets drops which are not there with ht on.

this is an issue/topic that is coming up more and more across the web on different forums, and tbh the more people say they are being held back by their cpu, the more weight it has to carry.

it wasn't so long back that everybody said an amd x2 is all you need for gaming, and then out came the 8800gtx, which would sit idle waiting for that cpu, so everybody had to go conroe to feed that gpu. history always repeats itself, and it's quite possible that the new gpus out today are pushing these cpus to their limits. maybe it's the limits of pcie 2.0 x8 (x2 for both cards), maybe it's some kind of throttle from the drivers, who knows, but there is very often fire when you see smoke.

now if only we had someone with access to a few gpus and a couple of cpu config setups to see how much performance you get from the same gpu config on each setup :|
 
my 25k sits around 65/70% while playing too, but i, like lysol above, only run 1 g card, which makes my stats pointless for this matter.

... I dunno so much.

The more information you can get, from the higher end to the lower end, mixed setups, the more you can prove theories. Even if, as you said, you have a 1g card - perhaps that's why your cpu is using more than others might? (It's a guess, I don't know.) But let's say you play a game that demands over 1g; all of a sudden Windows decides you need to start using onboard memory to compensate. The cpu would have to take ~some~ kind of hit (maybe) to act as a liaison between the gpu and mainboard memory? Stick in a 1.5g card and suddenly you're down to 30/40% max when gaming.

That's just a wild theory thrown out as an example. No truth in it as far as I know. But without a test, who knows for sure.
 
what i do know is that a mate who also plays bf3 and has a multi-gpu setup has said he gets (big) drops in fps when he turns off ht on his 27k. his 27k does 5ghz easy with ht off but only 4.8ghz with it on (limited to daily-use voltage), but with only 4 cores at 5ghz he gets drops which are not there with ht on.

this is an issue/topic that is coming up more and more across the web on different forums, and tbh the more people say they are being held back by their cpu, the more weight it has to carry.

If that was an observation during online play, then network and coincidence might explain it? Would be great to hear whether it happens in single player.

In any case this clearly needs clearing up. Some people are reporting very little 2500k usage, yet others are seeing falling fps without HT...

I'll be happy to start a community-based benchmarking thread to A. clear this up and B. give people facts to aid gpu + cpu selection for new builds.
 