Need help with my Skylake build

Mate, if you honestly believe that a single game sets the benchmark for all DirectX 12 games and all currently released graphics cards then I'll leave you alone after this cause there's no discussing with people who refuse to be wrong.

...especially if you go to the official website of said single game and it has a pretty big AMD logo in the footer, but anyway...


That said, I'm assuming OP will not build this system to exclusively play games that aren't even actually out yet, so DX11 performance is definitely not irrelevant. Unless of course all he wants to play is Ashes of Singularity, then I take back everything I've said. But then again, "if you bought it you could play it" is definitely the best argument I've read all day.


Don't get me wrong, the Fury X is definitely a good card for a good price. That is a fact, and I'm sure it would serve StarKiller well if he plays a lot of the titles where it outperforms the 980 Ti.
But if he is going to buy it, I think he should do so because it's the best card for his intended purpose of the computer. Not because some guy on a forum told him to get an AMD card, because that's all that guy ever tells everyone to do.
Because if you can't be objective on a forum, you really shouldn't even be on it ;)

I already told him it's the only DX12 bench/game we have atm. I didn't say anything beyond that. I'm not wrong in saying that in this game, at heavier loads, the Fury X pulls ahead by about 4%. Saying that's wrong is also saying the article I linked for you is wrong.

Having the logo doesn't mean anything. They aren't using any vendor specific tech. If it was using that then it would matter. AMD is just sponsoring it so people see it and think DX12 and AMD.

You're arguing over things I didn't say and acting like I'm a dumb narrow-minded fanboy. I gave the OP an idea of what he will end up spending. He could get a Fury X Xfire setup slightly cheaper, get the blocks (which are the same), and since he plans on 1440p, get a FreeSync 1440p monitor. It comes out much cheaper. His budget is massive, so if he still wants Nvidia he has the money for it. He is considering AMD and Nvidia. Why are you arguing? You're not being objective here, and you're assuming he will listen to me.

You can say whatever you want about me only ever wanting to recommend AMD cards. Anything below a 980 Ti, AMD has a card that either matches or beats its closest Nvidia competitor at a lower price. Tell me why I shouldn't recommend their cards? In the past couple of GPU advice threads many people were recommending 390/390Xs... does that make them fanboys too? No it doesn't. I have recommended Nvidia cards too. Some people only want Nvidia, fine, I recommend them the best card for their money. I recommend both on here and off here.

I'm being fairly objective. He wants his computer to last years... as of now AMD seem to do slightly better at DX12. Who's to say that won't continue? They have the hardware capable of it. Even if it doesn't, performance is still crazy good and it's a cheaper setup. Why is it such a crime to let him know that?

You're a mod on a forum. If you can't help but argue and start bullshit arguments while suggesting to a member to leave you shouldn't be on it either.

You want to continue this, PM me. Let's not derail it further. Better yet you may as well delete all this. Totally derailed already and doesn't help the OP. This isn't how you introduce us to new members.
 
I have another question: due to the 6700K currently being hard to get, and the inflated prices being added by those actually selling them, would it be worth considering going with a 5930K instead, since currently the pricing would be about the same?

What would be the advantages/disadvantages of going with the X99 for a strictly gaming rig?
 

This question has been asked so many times since Skylake launched because of the limited supply of the 6700K. It really comes down to personal preference. Personally, if the 6700K was readily available, it would be a no-brainer. But since it's not, I wouldn't know what to do in all honesty. 6 cores/12 threads, or 4 cores/8 threads but a newer chipset. Tough choice. Being a pure gaming rig, I would probably go for the i5, but seeing as you have the budget for it, the 5930K would probably be the better choice. Most games don't use 6 cores now, but since you hold onto your rigs for a long time, it should fare better as time goes on. It does put out more heat, though, and since it doesn't come with any cooler and you plan on watercooling, your temporary cooling solution would need to be adequate.

On the other hand, if you want some of the newer features from Skylake, then that's the one to go for, and you would still get amazing performance :)

Just realized... you could also look for a 4790K. They have dropped in price a little bit and shouldn't be much more than, say, the 6600K. You get more threads with it since it's an i7... however, it's an older chipset. Again though, since you have the money, I'd probably go with the 5930K.
 

Mate, I don't see why you're being massively defensive? All I did was lay out arguments why the Fury X might not necessarily be the best purchase, even for DX12 games and you act as if I've just offended your entire family :huh:

I won't delete the discussion cause that's all it was: a discussion about a graphics card, which still has information that's relevant to StarKiller's decision on what card he should buy. And I will continue that discussion, but to make you feel better I will also list some pros of going Fury X:

You're right about FreeSync, that's a good point. AOC recently released two monitors that are identical except one has G-Sync and one has FreeSync, yet the G-Sync one is £300 and the FreeSync one was 200 quid.

Afaik though, the actual FreeSync ranges on the FS monitors are a little narrower than G-Sync's, and I must also admit that I'm not quite sure if you can use FS with multi-GPU and/or multi-monitor setups yet.

Another thing to consider with the Fury X is the AIO cooler. Temps-wise it's great to have and it'll definitely keep the card cool and (if you've not got a version with pump issues) quiet. As said though, there are some older versions with pump noise issues still going around, but there shouldn't be many left now and you can RMA your card if yours does it.

On the other hand, I like to keep things simple mostly, and as proven by the pump noise issues, having an AIO loop does open up some more chances of failure at any point. Also, if you're an aesthetic nut like me, you might have a hard time making those tubes and the rad look pretty and clean inside a case :lol:

If you plan on adding another, I would personally stick with the 980 Tis. They consume less power, produce less heat (keep in mind with the AIO loop on the Fury X being effective as it is, if you have one radiator in the front of your case that's where all the heat comes out of) and you won't need to also have two 120mm rads + the CPU cooler radiator in your case.

Although you are going custom loop at some point aren't you?

I don't think we should consider DX12 performance at all at this point. Beyond the two 'launch' titles, no games have even been announced yet to use the new API, let alone AAA titles, which tend to be the games that need the most horsepower. By that time, as you said (StarKiller, that is) it might already be time to upgrade anyway.

I'm never really one for telling people to 'wait it out until X product is out', cause if you do that, X product will come out and then someone else announces Y product that you also want and you'll just be waiting forever. I generally just tell people to get what's best for their money right now, unless of course it's something obvious like a new high-end card coming out the day after.

I have another question: due to the 6700K currently being hard to get, and the inflated prices being added by those actually selling them, would it be worth considering going with a 5930K instead, since currently the pricing would be about the same?

What would be the advantages/disadvantages of going with the X99 for a strictly gaming rig?



Just realized... you could also look for a 4790K. They have dropped in price a little bit and shouldn't be much more than, say, the 6600K. You get more threads with it since it's an i7... however, it's an older chipset. Again though, since you have the money, I'd probably go with the 5930K.

This is what I was going to say as well. In fact, even an Ivy Bridge i5 3570K won't perform much worse than the 5930K in most games, because most games simply do not use the extra cores yet.

I could argue that X99 is more 'future proof', or that with Skylake you might be able to do another CPU upgrade on the same motherboard, but the reality is that I really thought with the 8-core consoles we'd see better multi-core game performance, and that still hasn't happened; and with any Intel next-generation mainstream CPU, it generally isn't worth upgrading from the second newest to the newest.

If you have the money for it, by all means go X99 and you'll have great performance and you won't need to upgrade for probably a very long time, but as NBD mentioned, the Skylake motherboards do have quite a few extra features. If you don't use those though, I wouldn't consider Skylake personally. X99 has no real benefit over the mainstream platforms for strictly gaming, so I'd PERSONALLY (but then again my rig budgets are usually quite a bit lower than yours) go Devil's Canyon and spend the rest of the money on a nice monitor, or just something completely different that you want to buy :)
 
I have another question: due to the 6700K currently being hard to get, and the inflated prices being added by those actually selling them, would it be worth considering going with a 5930K instead, since currently the pricing would be about the same?

What would be the advantages/disadvantages of going with the X99 for a strictly gaming rig?

TBH I see no sense in going above Z97 ATM unless you are building a brand new rig and have no gaming-capable PC.
 
Just a quick reply: I agree with NBD. DX12 is coming out, and the OP plans to use his PC for many years; seeing how Nvidia also forgets their older cards in driver updates, it seems more logical to go with AMD, and I'm not a fanboy. Btw Feronix, I think you were a bit harsh; being a mod you need to have a calm attitude and not tell people to leave the forums. Now back on topic.
AMD cards support DX12 to some extent or completely; the 980 Ti doesn't support much of DX12. They do with driver updates, but you can squeeze just a little via software. If you will change GPUs in the next 2 years, take the 980 Ti; if you plan to keep them longer, go with the Fury X.
 

As said before, DX12 is not a valid argument in my opinion, as it's not actually out yet and there is no way of telling how exactly it's going to affect performance. A single game doesn't prove anything, cause there are titles where a 980 (non-Ti) outperforms the Fury X. Clearly though, that does not make the 980 the better card.

I'm not sure what you mean by them not supporting older cards with their drivers; if I go to the page for Nvidia's latest drivers, they support cards all the way back to the GTX 400 series, which were released in 2010, so a 5+ year support cycle is really not that bad.

http://www.geforce.com/drivers/results/90494

Other than that I am perfectly calm and I never told anyone to leave the forums, we need healthy debates. But if people can't remain objective, there's no discussing with them and it doesn't really help anyone.
 
Fury/X works very well in Crossfire when it works. I've seen two Fury X easily swat aside a pair of Titan X due to the scaling on Crossfire.

However, Fury and Fury X and Titan X are all 4k cards. It's as simple as that, you really should not be buying them otherwise.

At 4k a stock Fury X and a stock 980ti are absolutely dead tied. There is something like 2% between them overall.

http://www.hardwarecanucks.com/foru.../69682-amd-r9-fury-x-review-fiji-arrives.html

It's only at 1080p and 1440p that the 980ti wins, but, as I have said, at 4k the gap really tightens and there is nothing to separate the two. Someone else has already mentioned how the AMDs are better at higher resolutions than they are at lower...

As I said though, pair of Fury/X will easily swat aside a pair of Titan X due to poor SLI scaling (it's always been this way, though).

As for the DX12 arguments? haha. Right now we have one game with a built-in benchmark that's about as buggy as a box full of cockroaches. It's poorly coded and not from a major developer. So, until we get some cleanly coded DX12 games it's anyone's guess as to whether AMD are going to come out on top or not. Nvidia's usual fix is to throw clock speed at it, and it's been working recently.

Whether or not their cards can overcome the shortcomings they have with the technology? no one knows yet.
 
Mate, I don't see why you're being massively defensive? All I did was lay out arguments why the Fury X might not necessarily be the best purchase, even for DX12 games and you act as if I've just offended your entire family :huh:

You're right about FreeSync, that's a good point. AOC recently released two monitors that are identical except one has G-Sync and one has FreeSync, yet the G-Sync one is £300 and the FreeSync one was 200 quid.

Afaik though, the actual FreeSync ranges on the FS monitors are a little narrower than G-Sync's, and I must also admit that I'm not quite sure if you can use FS with multi-GPU and/or multi-monitor setups yet.

This is what I was going to say as well. In fact, even an Ivy Bridge i5 3570K won't perform much worse than the 5930K in most games, because most games simply do not use the extra cores yet.

I could argue that X99 is more 'future proof', or that with Skylake you might be able to do another CPU upgrade on the same motherboard, but the reality is that I really thought with the 8-core consoles we'd see better multi-core game performance, and that still hasn't happened; and with any Intel next-generation mainstream CPU, it generally isn't worth upgrading from the second newest to the newest.

If you have the money for it, by all means go X99 and you'll have great performance and you won't need to upgrade for probably a very long time, but as NBD mentioned, the Skylake motherboards do have quite a few extra features. If you don't use those though, I wouldn't consider Skylake personally. X99 has no real benefit over the mainstream platforms for strictly gaming, so I'd PERSONALLY (but then again my rig budgets are usually quite a bit lower than yours) go Devil's Canyon and spend the rest of the money on a nice monitor, or just something completely different that you want to buy :)

I was being defensive because you didn't exactly come off as non-offensive, and the way it was all worded was really sarcastic and mocking. To make it clear, I understand you think DX12 isn't valid because we have a long way to go till it's mainstream. However, consider that he also holds on to rigs for "ages", as he said; as of now in DX12 the Fury X has more potential to succeed, and it also does better than the 980 Ti at the heavier loads you're more likely to see in gameplay. Yes, it's one game, but as long as Async Compute is being used (and trust me, devs are going to use this a lot), the FX has more potential. On a hardware level it's just better; that's what I am basing it off. It's the best bet for the long run. In DX11 the 980 Ti is faster, generally by about 5-10% depending on the game; sometimes it's slower. Really just depends.

FreeSync has more range than G-Sync, as it can go from 9-240Hz. However, we have a long way to go before scaler tech gets us close to that. As of now the best we have from any FS monitor is 40-144Hz, from the BenQ XL2730Z. Xfire works with FS as well; that was addressed in driver 15.7, I believe. If not, well, it's still worth getting simply because it's far cheaper than the Swift.

Ever since the XDMA thing AMD has had on their 290 series/390 series/Fury series, they have scaled better in multi-GPU than Nvidia cards. The Fury X in Xfire vs. 980 Ti SLI is so close it becomes game-dependent when you have stock FX vs. moderately clocked 980 Tis. Super-high-clocked ones will pull ahead; however, you can OC the FX and it'll close the gap, though since it's limited it won't catch up until they get that sorted.

So really, I just recommended this setup as it's cheaper than the Nvidia equivalent: FX in Xfire/FS monitor/GPU blocks/Devil's Canyon CPU and board. He'll save a couple hundred, mainly from the monitor, but that's something he could put towards the X99 he wanted and not waste money. Or he could put it towards a large SSD or better fans or more memory etc. Going the Nvidia route will still serve him just as well, just more expensive.

Nice attitude.

Not attitude at all..
Oh, and nice useless post that contributes nothing towards the thread (see, that's attitude).

I'm not sure what you mean by them not supporting older cards with their drivers; if I go to the page for Nvidia's latest drivers, they support cards all the way back to the GTX 400 series, which were released in 2010, so a 5+ year support cycle is really not that bad.

http://www.geforce.com/drivers/results/90494

Other than that I am perfectly calm and I never told anyone to leave the forums, we need healthy debates. But if people can't remain objective, there's no discussing with them and it doesn't really help anyone.

I was and always am being objective (if I'm not, I will say so). Just read the first reply... won't repeat it here again.

What he is referring to about driver support isn't that they don't release new drivers for older cards; it's that a while ago claims went around that newer drivers were deliberately decreasing performance on older Nvidia cards so that you'd want to upgrade to a new Nvidia GPU. While we have no clue whether it's true, I wouldn't doubt it in all honesty, due to their recent "issues" (that is also remaining objective; it's an observation, not an opinion).

Fury/X works very well in Crossfire when it works. I've seen two Fury X easily swat aside a pair of Titan X due to the scaling on Crossfire.

However, Fury and Fury X and Titan X are all 4k cards. It's as simple as that, you really should not be buying them otherwise.

At 4k a stock Fury X and a stock 980ti are absolutely dead tied. There is something like 2% between them overall.

As for the DX12 arguments? haha. Right now we have one game with a built-in benchmark that's about as buggy as a box full of cockroaches. It's poorly coded and not from a major developer. So, until we get some cleanly coded DX12 games it's anyone's guess as to whether AMD are going to come out on top or not. Nvidia's usual fix is to throw clock speed at it, and it's been working recently.

Do you even own the game? I do... it's not buggy for me whatsoever. No crashing issues like some people are reporting on the forums.

I haven't seen an FX in Xfire being on par with a TX in SLI. If anything it would be stock TX vs. OC'd FX. The TX is just too much. It would have to have some driver scaling issues for it to be beaten consistently. The TX is simply the fastest you can get.
 
What he is referring to about driver support isn't that they don't release new drivers for older cards; it's that a while ago claims went around that newer drivers were deliberately decreasing performance on older Nvidia cards so that you'd want to upgrade to a new Nvidia GPU. While we have no clue whether it's true, I wouldn't doubt it in all honesty, due to their recent "issues" (that is also remaining objective; it's an observation, not an opinion).

That got shut down very quickly after it was brought up. I kept a very close eye on this (as I have an older Nvidia card) and it was just a slight driver issue in certain games that hasn't popped up since. IIRC it all started with The Witcher 3, where the 970 was beating the 780 Ti by a good few frames.

I was being defensive because you didn't exactly come off as non-offensive, and the way it was all worded was really sarcastic and mocking.
....

Not attitude at all..
Oh, and nice useless post that contributes nothing towards the thread (see, that's attitude).

.
That IS attitude or could be interpreted as attitude. You're being quite aggressive the moment someone says DX12 isn't really a valid reason to buy a card right now.
 
Yes, I own the game and I've benched it. It's a low-rent game with menus that don't work properly, so you have to hack it around to make the max settings work.

When I see something more professional from someone like DICE (who successfully used Mantle in BF4 and showed the gains to be had by using a better API than DX11), I will take the results more seriously.
 
That got shut down very quickly after it was brought up. I kept a very close eye on this (as I have an older nVidia card) and it was just a slight driver issue in certain games that haven't popped up since. IIRC it all started on the Witcher 3 where the 970 was beating the 780Ti by a good few frames.


That IS attitude or could be interpreted as attitude. You're being quite aggressive the moment someone says DX12 isn't really a valid reason to buy a card right now.

Not aggressive. DX12 is a valid reason; it's being used around the world atm. Either way, interpretations differ for every person. Don't see a point in talking about it anymore. Trying to help the OP here, not hijack the thread.

Yes I own the game and I've benched it. It's a low rent game with menus that don't work properly so you have to hack it around to make the max settings work.

When I see something more professional from some one like DICE (who successfully used Mantle in BF4 and showed the gains to be had by using a better API than DX11) I will take the results more seriously.

It's pre-alpha. Menus aren't even finalized, and they're the last thing that should be worked on. What matters right now is what they are improving upon. They even have a patch due sometime before the middle of next week that will address the GUI, as they are about to go into closed beta. No need to hack anything (not that you could anyway). Want max settings? Pick those settings and restart... done.

It's professional, and done by the same company who also demonstrated Mantle (far better than DICE ever did) in Star Swarm. You should take them more seriously now; as it gets better and better, it will only become more valid. Take it how you want though, it's your opinion, and we all know what happens when you won't change your mind, as we've seen before ^_^

OP have you decided how the build will end up being or what direction you want to go?
 

Yes, it's pre-alpha. By a somewhat small dev team.

Was it coded purely with DX12 in mind, or was that added in later? Because I tell you, I've seen code that's been ported before and it never pans out well.

Like the sly dig at the end there, though. It just makes everything you've said before it irrelevant, so it's clear the path you take when someone takes you on at a debate level.

Not the first time either, is it?
 
Personal attacks or underhanded comments will result in time-out bans, you have been warned.

Keep this thread related to new build advice.

Starkiller, have all your questions been addressed or do you require any more advice?
 

I'm good, thanks
 
I already told him it's the only DX12 bench/game we have atm. I didn't say anything beyond that. I'm not wrong in saying that in this game, at heavier loads, the Fury X pulls ahead by about 4%. Saying that's wrong is also saying the article I linked for you is wrong.

If you really believe that the AMD cards are faster can you do something please.

Pick any resolution and workload and post your best score on your card for AOTS and I will beat it by a wide margin on one of my NVidia cards using the same settings.
 
Wanted to post a quick update. All my parts are in, but I got home yesterday from back surgery, so it will likely be a few days before I'll be up to assembling the new rig. :(
 

Don't overdo it. Builds can wait, but you only get one spine. Rest up.
 