dual 760 vs 1070

Erkki Muhonen

New member
hey,
I have a Gigabyte GV-N760OC-4GD REV2.0, overclocked nicely to 1316MHz on the core and 6600MHz on the memory.
I found a used one for roughly 100 bucks and am considering SLI with it over buying a new 1070. I just want a tiny bump, as I do most of my gaming on a 27" 1080p monitor at 60Hz (it can overclock to 75Hz). Is it a good idea?
See signature for other specs...
I'm a very casual gamer, but enthusiast enough to overclock a little and play with these kinds of things... my i5-4690K is still clocked at 4690MHz stable with good temps, usually below 80°C in spring/autumn, though it can get up to 86°C in summer...

So basically, I can afford either a 1070 or a 1080, but I'm not sure I want to spend that money yet.
 
Well, I don't own one, but I think a 1080 for 1080p at 60Hz is overkill. Personally I would say get a 1070.

760s are showing their age now and I am the sort who usually upgrades every other generation.
 
SLI and Crossfire as we knew them are dead.

Since we got the new consoles, devs have to do much less than they did before. I'll elaborate on that a little, then you can make up your own mind.

OK, so consoles like the 360 and PS3 used hardware that was pretty much completely different to a PC. This meant devs had to do a lot of work creating libraries (portions of code) to make a game work on a PC.

So basically the graphics technology and so on were completely different on the consoles, and devs had to more or less rewrite the graphics and CPU handling. While they were doing that, they were also adding in AFR support (alternate frame rendering, where each GPU renders every other frame). There was no point in leaving it out, seeing as you had to do the spade work whether you liked it or not.

AMD and Nvidia were simply giving them instructions on how to do it, and they were adding it in. At that point AMD or Nvidia would tweak their drivers and you had multi-GPU support.

But.

With these new console "ports" (they are anything but) the hardware they are writing for is pretty much a small PC in disguise. I'm generalising here, but I really don't want to pick the peanuts out of the poo on how similar/different they are.

As such, devs now hardly have to do anything to make a game run on the PC. The consoles use an x86 setup of sorts, and this is why Microsoft are now back to releasing PC games: they barely have to do anything to make them run on a PC. They can almost take an Xbone game, create a PC .exe and run it on a PC.

And that's why you have seen Crossfire and SLI support basically fall off and die. Why would devs waste a single minute of their time or effort rewriting the GPU handling so that people with two cards can benefit?

To make matters worse, nearly all of the current games are not AFR-friendly. Nvidia and AMD are doing their best, but take Doom, for example: Nvidia released an SLI driver for it that actually makes the cards scale negatively, so you lose tons of performance compared to one card, and the minimums are quite frankly awful.

Crossfire is even worse. I'll use Need for Speed as an example, but you can pretty much apply this to any modern release. When the game initially launched it stuttered and froze like a sonbitch. AMD released a driver that basically made it worse. Then they released another one that stopped the flickering and stuttering, but the scaling was negative and the game ran like poo.

In the end they simply added a few lines to their release notes instructing the user to disable Crossfire, as it did not work.

This is also why AMD just released the Radeon Pro Duo for, quote, "VR development only".

They know that multi-GPU's days are numbered.

In the green camp, Nvidia have basically washed their hands of 3- and 4-way SLI but are still recommending 2-way SLI. Trust me when I say it's a con. It simply does not work and hasn't for a long time.

Moving forward...

As I said, Crossfire and SLI as we knew them are now dead. AFR support has been non-existent ever since these new consoles came along.

There are a few multi-GPU technologies coming, but so far there is only one DX12 game that actually uses them and makes them work, and the scaling is pretty bad; nowhere near as good as a proper AFR title.

LiquidVR supposedly uses more than one GPU as well, which is why AMD were keen to push the Radeon Pro Duo onto devs. What will come of it is a mystery too, because it's too early to say.

So there you go; more than enough info to consider, chew on and make your own decision with :)
 
I currently run two 1070s, and SLI is fine for everything I've played so far, excluding Fallout 4, but that appears to be a fault with the game and high-refresh-rate monitors.

As always though, I'd recommend getting one more powerful GPU over two older, slower cards.
 
Fallout 4 has been one of those games where it works, then it doesn't, then it does again, etc.

AFAIK Bethesda put in AFR support but then broke it about two days later with a patch.

So glad I am off Crossfire now.
 
Bethesda don't appear to have released a patch 'supporting' 10xx cards yet. I did force the game to work by running at 60Hz, but the game didn't detect where my monitor was correctly, so I had half of my other monitor showing a black screen. Fallout's engine is tied to the framerate too, so if it does work correctly my game will be running at a full 100-ish FPS, which should make my attacks quite a bit faster...

Bethesda are at fault here, I suspect. It's a shame, as so far it's been the only game I've played in which it doesn't work since changing my system up a bit.
 
Elder Scrolls still doesn't accept a 120Hz refresh rate.
 
Neither does Fallout 4.

Back when I was on my Acer 4K monitor I hacked the .ini and removed Vsync, the idea being to run the game at 1440p with Vsync disabled so it would run faster.
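
For anyone wanting to try it, the usual edit (assuming the standard Fallout4Prefs.ini location under Documents\My Games\Fallout4; I'm going from memory here) is the iPresentInterval line:

[Display]
; 1 = Vsync on (the default), 0 = Vsync off / uncapped framerate
iPresentInterval=0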

It just broke the mechanics completely.

Now I run 1440p @ 70Hz, and even at 70 FPS (which is lovely; it really zips along beautifully) I get the odd random freeze when I use a console.

Most of the games these days are simply coded to run at 60 FPS and nothing more :(

Shame really.
 
It's the only game I've bought recently that has been any trouble, despite running 1440p/144Hz with dual new GPUs. Beforehand I never had any issues with 144Hz, again excluding Fallout. Saying that, my Ti would run it like a dog, so I don't think I ran into any issues back when the game actually worked, thanks to a near-console framerate.
 
Oh yeah, for sure; if you crank the detail it really does make a card cry. I've got it on completely maxed details (including the new weapon details and god rays on ultra) and I have seen 43 FPS. Most of that, though, was in the latest DLC (Far Harbor), where they have really, really pushed the detail and fog etc. It looks like a different game!

Hoping Nuka World is good, though lately I have been dumping a shedload of hours into Dead Island: Definitive Edition and DI Riptide DE. They both look absolutely stunning, I mean better than Crysis 3 in places! They've obviously used the Dying Light engine, but the water and so on are just absolutely breathtaking.
 
Not sure what the point of cranking Fallout 4 to max detail is, since the graphics in general are so outdated and poor. I'd sacrifice that for fluid motion any day.
 
Because it makes it look quite pretty, tbh. The lighting really changes and comes to life with the god rays on max.

It is a dog in general though, tbh. Compared to something like DI DE it's pants and all looks rather fuzzy.

I can accept 40-odd FPS minimums too; it's hardly an action-packed game, and most of it is done with the engine frozen (VATS).
 
Thanks for all the feedback... Well, the things I'm considering are factors such as memory bandwidth, where the 1070 has 256.3GB/s vs 384.5GB/s for the 760s in SLI, and the texture rate, which is also still higher on the 760s (plus shader processing units and texture mapping units). But clearly what stands in the 1070's favour is pixel rate, power draw, twice the memory and higher clocks. How much this means in 1080p gaming at 60Hz I'm not so sure, which is why 760 SLI kind of "sounds" like a fun test. However, a newer card gives extra support for other new tech, which would be fun to fiddle around with... hmmm... maybe I'll test both and just throw the 760s in my old system :)
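
(Though thinking about it, if I'm reading the comparison sites right, that 384.5GB/s is the two 760s added together: 256-bit x 6008MT/s / 8 is about 192.3GB/s per card, which doubles to 384.5GB/s, whereas the 1070's 256.3GB/s (256-bit x 8008MT/s / 8) comes from a single card. And SLI doesn't actually pool bandwidth, since each card renders its own frames from its own copy of the data.)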
 
The higher the FPS the better tbh. It's all about minimums and at 1440p even the 1070 would be down in the 40s.

Just because you have a 1080p monitor, it doesn't mean you can't run games at higher resolutions. I'm not going to hold your hand, but Google for "Nvidia DSR". It's quite easy to set up, and you can go all the way to 4K even on a basic 1080p monitor.
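
(If it helps, the DSR factors are multiples of the panel's pixel count, so on a 1080p screen a 1.78x factor renders at 2560x1440 and 4.00x renders at 3840x2160, then scales the image back down to 1920x1080. You just tick the factors you want under the global 3D settings in the control panel and the new resolutions show up in games.)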

The 1070 should pretty much murder 1080p. However, I would definitely increase the res using DSR; you can pick and choose which games to run at whatever resolution. I found that Fallout 4 looked best at 1440p, as at 4K the textures appeared no different.

I also don't know where you're located, but keep an eye on 980 Ti prices. Here in the UK they're hitting around £300 for good cards with some warranty left.
 
Yeah, well, in Norway the 980 Ti still goes for about £400, so actually just the same as the MSI 1070 here, so that's a no-brainer really... DSR is possible, but it doesn't work well with my other monitor, which is 1440x900, so I haven't actually played around with it on a 1080p monitor... :D
 
If Fallout works I can let you know what the FPS is like at 1440p. When I got it to semi-work on SLI 1070s, the framerate wouldn't go below 60 even with everything maxed out. The 1070 is a great card.
 
Apparently the very latest driver makes it scale negatively. It doesn't flicker or flash, though, but yeah, not so hot.

Fallout 4 simply wasn't designed for AFR and it would have been a whole world of work to get it working.

Don't even get me started on the Windows Store games. OMG Crossfire was like "WTF??"

When I am inside a building in FO4, any building, it remains jammed at 70 FPS. It's really nice. I find that it's smoother at 43 FPS than it is when it goes into the 50s. I could do with G-Sync, but I can live without it. Just.
 
I tested Fallout 4 yesterday and didn't run into any issues despite using 120Hz.
 
1080p though, right? I think it's also something to do with my resolution, and the fact that my secondary monitor is a different refresh rate and resolution.

Also, it changes the refresh rate in Windows to 60Hz for me, I've noticed. It locks it out to the point where I can't actually choose above 60Hz in Nvidia's control panel, and I have to do it through Windows or restart.
 