Bring me up to speed - Socket types & CPUs

On the point of the 8350 that I mentioned earlier, the information I was referring to was the benchmarks that Tek Syndicate did. I don't know for sure whether theirs are consistent with other benchmarks and experiences, but going by them, the 8350 performs extremely well.
 
This one?

http://www.youtube.com/watch?feature=player_embedded&v=eu8Sekdb-IE#!

Yea, I dunno. I get a bit frustrated with a lot of CPU/GPU reviews from all sources, really. I'm not going to start pulling this particular one apart for its faults alone, because I do want to mention the problems that many reviews share:

1. Not enough testing across a spread of resolutions. In the case of that review above they were benchmarking at 1080p/1440p on ultra graphics using a 7870 (I'm not aware of any mention of an overclock). Now frankly I'm surprised that they recorded any differences at all because the FPS must be almost entirely limited by the GPU. Take the load off the graphics and you'll begin to see more accurate differences.

2. Reporting average FPS is such a basic measurement tool that it barely explains anything. Min/Max frame rates are barely better because you are still making a load of assumptions. There are tools out there which allow you to log and monitor individual component usage even across individual cores. That gives a much better picture of what's really going on.

3. These CPUs are designed for overclocking. So where is the mixture of clocks? There should be tests run at stock, an apples-to-apples equal-clock test, and one more with each chip at its respective maximum overclock.

4. Tests conducted through online gaming are too unreliable because they depend on the speed of the internet/server connection at the specific time of the test. For example, if you want to test BF3 online, then you need to report specific component usage to account for the variance.

That's a few for now... the problem is that all of that is a lot of work, and often these reviews are put out on deadlines, while the rest of us don't have the resources to do it that thoroughly. Having said that - this might change for me soon, so I might be able to have a go at this myself.
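The per-core logging mentioned in point 2 doesn't need anything exotic. Here's a minimal sketch in Python that samples /proc/stat (Linux only; the function names are my own, and proper benchmarking tools do a lot more than this):

```python
# Sketch: log per-core CPU utilisation by sampling /proc/stat (Linux only).
import time

def read_core_times():
    """Return {core_name: (busy_jiffies, total_jiffies)} from /proc/stat."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            # Per-core lines look like "cpu0 ...", "cpu1 ..."; skip the
            # aggregate "cpu " line, whose fourth character is a space.
            if line.startswith("cpu") and line[3].isdigit():
                fields = line.split()
                name = fields[0]
                nums = [int(x) for x in fields[1:]]
                idle = nums[3] + nums[4]   # idle + iowait jiffies
                total = sum(nums)
                cores[name] = (total - idle, total)
    return cores

def sample_usage(interval=1.0):
    """Per-core utilisation (%) over one sampling interval."""
    before = read_core_times()
    time.sleep(interval)
    after = read_core_times()
    usage = {}
    for name in before:
        busy = after[name][0] - before[name][0]
        total = after[name][1] - before[name][1]
        usage[name] = 100.0 * busy / total if total else 0.0
    return usage

if __name__ == "__main__":
    for core, pct in sorted(sample_usage(0.5).items()):
        print(f"{core}: {pct:5.1f}%")
```

Run that in a loop while the game is benchmarking and you can see whether one core is pegged at 100% (a CPU bottleneck) while the rest sit idle.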
 
That's rubbish mate - no games use more than 4 cores, and since clock-for-clock the 3570K is better, this means that in games it is better. Would you like proof?

Myth - although the first couple of cores are utilized more than the rest by design, because of the laptop market.
 
If you are in need of a toaster, buy AMD. If you like high electricity bills, buy AMD. If you like to make your own croutons, open a loaf of bread and put it in an AMD system.

The power draw and heat of AMD are second to none, whilst the gaming performance leaves much to be desired - and the 3570K is only about $6 more.

AMD is only 'worth it' if you do lots of compiling/decompiling of code and/or rendering/editing/transcoding of video. In the long run you would still be better off spending the extra $100 to get a 3770K, which results in lower monthly power bills.

Have you got a budget for this upcoming build?

Since when is AMD at the forefront for global warming?

Heat on AMD is not that bad, and power draw is down to your wallet. Would you say a guy with an i7 and three 680s is green?

Sheesh...

Power draw? Wow, 100 watts - is your PSU gonna explode and break the nuclear reactor?

 
Yea, the power thing isn't a deal breaker really, unless you fold or run your CPU permanently at load (and of course pay the bills). It's simply down to the fact that AMD is still on 32nm whilst Intel has jumped to 22nm.

Taking 30 hours of 100% load a month at 15p/kWh, that extra 100 W comes to 45p a month. So the AMD system costs about £5.40 a year more to run than the 3570K.
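That sum checks out - here it is worked through, with the 30 hours/month and 15p/kWh figures being the assumptions above, not measurements:

```python
# Extra running cost of a CPU drawing 100 W more, under assumed usage.
extra_watts = 100          # extra draw at full load (assumed)
hours_per_month = 30       # hours of 100% load per month (assumed)
pence_per_kwh = 15         # electricity tariff (assumed)

extra_kwh = extra_watts / 1000 * hours_per_month   # 3 kWh per month
monthly_pence = extra_kwh * pence_per_kwh          # 45p per month
yearly_pounds = monthly_pence * 12 / 100           # £5.40 per year

print(f"{monthly_pence:.0f}p/month, £{yearly_pounds:.2f}/year")
# -> 45p/month, £5.40/year
```

Double the load hours and it's still only around a tenner a year.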

^ lmao
Better for gaming
Ok - I'm playing devil's advocate here, and I don't have an issue with being wrong, so prove me wrong...
Have a look at the reviews here:

http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/6
http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/5
http://www.xbitlabs.com/articles/cpu/display/fx-8350-8320-6300-4300_6.html#sect0
http://www.tomshardware.com/reviews/fx-8350-vishera-review,3328-13.html
http://www.techpowerup.com/reviews/AMD/FX-8350_Piledriver_Review/6.html
http://www.behardware.com/articles/880-13/amd-fx-8350-review-is-amd-back.html

And then explain why we should believe that one video review when all of the others say the exact opposite?
 
Here is a CPU-limited title.

[Skyrim benchmark charts at 1680, 1920 and 2560 resolutions]

Now, I am one of those who likes to be on top of my game, but many casual people stay at 60Hz...

The FX pulls more than enough for a person with a 60Hz screen - what on earth is wrong with some of you?
 
Yea, I'm not saying it's a bad chip - it's significantly more recommendable than its predecessor, but that doesn't make it the most recommendable gaming chip. Neither am I suggesting that people should go out and buy i7s - that would be a waste of money.

The evidence that I have seen simply suggests that today's games prefer fewer, more capable cores, and that means the 3570K is the sweet spot. Often it won't matter, as in most first-person shooters, but if you play CPU-heavy titles, of which there are several very popular ones, then it really can make a difference. Heck, maybe it is just me, but when I spend £300 on a graphics card it seems really silly to then save £20 on the CPU and find, when I play Shogun 2 (which I play a lot), that my fps is flattened by lower-capacity cores.

If I had bought an 8350 instead of the 3570K:

[three benchmark screenshots]
 

Watch the video I posted - he does Skyrim tests. In every game the FX is either barely beaten by the Intel, or, when the FX wins, it's by a massive margin - like 2x performance.
 
Is there any evidence other than this one rogue video? It really needs more to back it up... from other sources, I mean.
 
Hi Guys,

Based on my ex-wife's simple needs (YouTube, Yahoo games, internet and email) I went along with the following simple and cheap setup, which is good enough for Windows 8.

Gigabyte H61M-HD2 Motherboard (1155)
Intel Core i3-3225 (Dual Core with HD Graphics 4000)
8 Gigs of DDR3 1600 ram
hec Win+ power LT 500W PSU (80 Plus Bronze)
Zalman Z9 Plus Case
I gave her one of my own 500 Gig 7200 Hard Drives.

She already had my old keyboard and mouse and a monitor.

I have to wait until the end of the month (sad but true) before I can get her a lower range Blu-Ray/DVD Burner, and a second Hard drive.

I managed to get her the Windows 8 Pro Upgrade at the end of January, before the price went up, and upgraded her from Windows XP. (It's nice to know that as long as the upgrade install sees the old OS at setup, it then offers to do a clean, complete fresh install, so none of the previous OS remains.)

I live in Japan, so some of the above might not ring any bells. All in all it cost (or will cost) about ¥37,000, which is about 250 quid at the current exchange rate. It works well with Windows 8 too. I might be doing the same build for a small school a friend has, with equally low spec requirements.

The case is nice and quiet. One of the case fans runs a little noisily from the motherboard header. If I plug it into the PSU fan connections it runs a little slower and therefore quieter. The BIOS seems to have a setting for silent fan running, but it didn't make much difference, so I will have to wire the fan into one of the other PSU fan connections, of which there are only two. (There are three case fans.)

I'm looking forward to replacing my own build sometime this year.

Quick question please: a SATA3 SSD running on a SATA2 motherboard - yes? No? Very slow? Complications? The SSD in mind might be an ADATA SX900 type.

This question will be part of my decision to either get an SSD and better GPU, or a complete new build this year.

I appreciate that if I do replace my GPU, I'd probably need to remove a bottom case fan in my RV02 to fit a larger card.

Summer here is really hot and humid, and money is tight so I won't be overclocking again yet, (and I'd also need faster ram too).

Anyway, cheers for your advice.
 
Yea, a SATA3 SSD will run on a SATA2 connection just fine, but the interface will limit the speed.

Modern SATA3 SSDs can produce 400+ MB/s sequential read speeds, but a SATA2 interface will only give you about half that. Even so, that is still far faster than an HDD.

Yes, it would be measurably slower, but not particularly noticeable unless data speed is a 'thing' for you, which for most people it isn't. In general a better GPU would be a more sensible upgrade.
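To put rough numbers on why the difference rarely matters day to day, here's an indicative comparison - the drive speeds below are ballpark assumptions for illustration, not measurements of the SX900:

```python
# Ballpark sequential-read speeds in MB/s - assumptions, not benchmarks.
speeds = {
    "HDD (7200 rpm)": 150,
    "SSD on SATA2 (~3 Gb/s link, after encoding overhead)": 280,
    "SSD on SATA3": 450,
}

# Time to sequentially read a 4 GB chunk of game data at each speed.
for name, mbps in speeds.items():
    seconds = 4096 / mbps
    print(f"{name}: {seconds:.1f} s")
```

Even capped by SATA2, the SSD loads that 4 GB roughly twice as fast as the HDD, and the gap to SATA3 is a few seconds, not an order of magnitude.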
 

Actually, I just upgraded my 2500K to a 3770K on Saturday. Now I'm looking for someone willing to straight-swap a 2500K for an 8320 or 8350, and hopefully I can grab a decent mobo for $50-80. I will then swap everything from my Intel build onto the AMD one and run some tests.

Btw if anyone on here in the Toronto area is willing to do the swap PM me.
 