Hi,
Thought this might be of interest to those running an nVidia Graphics Card who are considering going SLI, a jump I’ve recently taken.
I already had a Palit GTX 570 in my rig and was very pleased with it. It'd served me well on my Q6600 @ 3.6GHz, and it was really allowed to open up when I put it in my new Sandy Bridge system.
The rig:
My current gaming rig uses a Sandy Bridge 2500K overclocked to 4.6GHz on an ASUS P8Z68-V Pro motherboard with 2x4GB of Corsair Vengeance 1600MHz RAM. I’m running Windows 7 Ultimate 64-bit on a conventional Samsung 1TB HDD.
The GPUs:
I recently managed to pick up a 2nd GTX 570 fairly cheap to enable me to venture into the realms of SLI. Now, and I think this is a point worth highlighting, my pair of 570s were from different manufacturers and were different revisions of the “reference” design.
My Palit was one of the first GTX 570s available and as such is a pure nVidia reference design, with a small “blower” type fan near the back of the card and the two 6-pin PCI-e power connectors on the top of the card.
My new Inno3D card, on the other hand, while saying “Reference” on the box, was a different design! It was shorter by 3cm, had the PCI-e power connectors on the end of the card and had a large, central conventional fan. Looking more closely, it also uses a heat-pipe cooler rather than the vapour chamber of the earlier model.
So, I have two GTX 570’s, both at the same vanilla clocks, but equally both of different designs – would this cause a problem?
The PSU:
Now, I’d already installed an uprated PSU in my rig in the form of a Corsair HX750w. This is a modular PSU and gave me the 4x PCI-e power connectors I needed. The PCI-e power connectors are all of the 6+2 pin variety; each 570 only needs 2x 6-pin. This PSU provides 62A on the 12V rail.
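For anyone wondering about headroom, here’s a rough back-of-envelope check. The TDP figures are nominal ones I’ve seen quoted (around 219W per GTX 570 and 95W for a stock 2500K), not anything I’ve measured, and an overclocked CPU will pull more, so treat this as a sketch:

```python
# Rough power-budget sanity check using nominal TDP figures, not measured draw.
PSU_12V_AMPS = 62
PSU_12V_WATTS = PSU_12V_AMPS * 12   # 744W available on the 12V rail

gpu_tdp = 219   # GTX 570 reference TDP (watts)
cpu_tdp = 95    # i5-2500K stock TDP (watts) - higher in practice at 4.6GHz
other = 50      # fans, drives, motherboard - a guess

load = 2 * gpu_tdp + cpu_tdp + other
print(f"Estimated load: {load}W of {PSU_12V_WATTS}W on the 12V rail "
      f"({load / PSU_12V_WATTS:.0%})")
# Estimated load: 583W of 744W on the 12V rail (78%)
```

Plenty of margin on paper, which matches how the PSU has behaved in practice.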
Connectivity:
One thing to remember: it’s rare for an nVidia graphics card to come with the required SLI bridge connector – this is something traditionally shipped with SLI-capable motherboards. My ASUS obviously fits into this category, so I used the one in the box. It was only a short cable, so I needed to use two PCI-e slots close together. Using these two slots gave me a gap of about 2cm between the bottom of one card and the top of the other – allowing reasonable air flow, one would hope. I’ve seen pictures of GPUs crammed together with basically NO gap & I’ve wondered how this can ever work with air cooling.
A quick test beforehand:
As I’d upgraded my PSU to be ready for SLI, I ran a load of tests with my original GPU on the new PSU – just in case something wasn’t right. Stability and my overclock (as well as a GPU overclock to 800 core) worked just fine, so I was happy the PSU was behaving as it should.
I reverted to a “default” profile in MSI Afterburner, taking my Palit GTX 570 back to its stock 732MHz core speed but retaining my 4.6GHz CPU clock. I shut the PC down & opened it up.
Install:
I popped my new Inno3D GTX 570 in the PCI-e slot below my existing GPU; the card went in without issue. I connected up the PCI-e power connectors – the new card being shorter meant that having them on the end didn’t cause any problems. Nice.
I took the SLI bridge connector and popped it on both cards. There was a choice of TWO places to connect it – one assumes this is for when 3+ GPUs are in use – I popped it on the one nearer the front of the case.
Right, moment of truth! I’d not uninstalled my graphics card drivers or anything; I’d purely reverted the existing GPU’s profile to stock in MSI Afterburner. Let’s see what happened…
First boot:
The PC began to boot and got to the Windows logon screen (I use a password). It’s at this stage that the PC obviously first detected the presence of the 2nd GPU, as the screen went black for a fraction of a second a couple of times…not unexpected.
I logged on, had a little pop-up saying my new GPU had been detected! Nice. I went directly to the nVidia Control Panel via the right-click desktop menu and selected the option to Enable SLI Performance Mode. Screen went black for a fraction of a second and it was done!
I next went into my individual game profiles as I assumed I’d need to update these to enable SLI. Nope! SLI performance mode had been enabled for me on EVERY profile – now that is nice.
Ok, let’s jump in and test!
Testing!:
I fired up Crysis 2 first, baptism of fire (well, hopefully NOT actual fire!) for my pair of GPUs! I loaded up & resumed my previous game….WOW, this is SILKY SMOOTH! I’d had all the settings maxed previously at 1920x1200 and the game played fine…however now it was…wow…really so very smooth!
I tried a couple of other games to be greeted by the same silky smoothness, no sign of micro-stutter, no issues where SLI wasn’t being used etc. just a flawless experience!
I next thought I’d try a few benchmarks…
3D Mark Vantage was first – I’d kept a record of my score with just the one card. In my prior test, with PhysX disabled, I got an overall score of P17,000 (I’ve rounded down); with PhysX enabled my score had been P23,600. Not bad for a basic stock GTX 570.
So, let’s see what my SLI scores are like! Benchmark ran through smoothly, including all of the “feature” tests, no signs of artifacting or anything. Nice. At the end of the run my score, with PhysX DISABLED remember, was P33,192. I did another run with PhysX ENABLED and saw a score of P42,755 – Nice, very nice. Some decent scaling there!
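Out of curiosity, here’s how those Vantage numbers work out as scaling factors. Bear in mind my single-card baseline was rounded down, so these are approximate:

```python
# Approximate SLI scaling from the Vantage scores above
# (single-card baseline was rounded down, so take the ratios as rough).
vantage = {
    "PhysX off": (17000, 33192),   # (single card, SLI)
    "PhysX on":  (23600, 42755),
}
for name, (single, sli) in vantage.items():
    print(f"{name}: {sli / single:.2f}x the single-card score")
# PhysX off: 1.95x the single-card score
# PhysX on: 1.81x the single-card score
```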
Next I fired up 3D Mark ’11. My prior score had been P5,701 with just the one card at stock, though I had broken 6,000 with an overclock. At stock speeds with SLI my score jumped to P9,614 – not a bad jump. I did get a warning saying I was using unapproved drivers though, odd as I had some fairly recent (not latest) WHQL ones. Note: I have since updated to the 280.26 WHQL set and still see this message.
A little overclock...times TWO!:
Next I thought I’d see if my new GPU could match the modest 800MHz core speed (up from 732) I’d been running on my Palit. I used Afterburner to set both cards to 800MHz, each with a “one notch” voltage increase to ensure stability. Note: I had to set each card separately, as they had different base voltages, the newer one being slightly higher – not unexpected.
Anyway, with 800MHz core on both cards I ran 3D Mark ’11 once again, scoring P10,157 this time. Not bad, though not double my original single-card scores; I guess there are other factors involved here, with the CPU playing a role of course.
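The same quick sums for 3D Mark ’11, based purely on the scores above:

```python
# Ratio check for the 3D Mark '11 scores quoted above.
single_stock = 5701    # one GTX 570 at stock
sli_stock = 9614       # two GTX 570s at stock
sli_oc = 10157         # two GTX 570s at 800MHz core

print(f"SLI at stock: {sli_stock / single_stock:.2f}x the single card")
print(f"Extra gain from the 800MHz overclock: {sli_oc / sli_stock - 1:.1%}")
# SLI at stock: 1.69x the single card
# Extra gain from the 800MHz overclock: 5.6%
```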
I did some more gaming in Crysis 2 at this overclock and all appeared well. Over the past few days I’ve played a variety of games – again, all perfectly fine.
How hot and how loud? I said HOW LOUD?:
So, I’m on AIR cooling, what are noise and temperatures like?
It’s natural that the top card will run a little hotter than the lower one – this is what I see. Additionally my lower card is the Inno3D with a different cooler, so this might have some impact on the temperatures I see.
It’s a cool day today, but the room feels warm, and my rig has been sat idling for about 30 mins. According to Afterburner my TOP card is at 38C, while my lower card is at 31C. Both cards have of course dropped into their idle state. My CPU, for reference, is at 29C on the hottest core.
When benching, my top-most card (Palit) tends to hit a high of around 73C during the most intensive parts; the lower card tends to be 5C below that. I do run a fairly aggressive fan profile so it does get a little noisy, though that’s so subjective I cannot really comment any more than that. FYI, the fan spins up to ~70% on the Palit.
During gaming in something like the rather demanding Crysis 2 I see similar temps. Older/less demanding titles might not see me break the low-60s on the hotter card.
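If you’d rather log temperatures over time than keep glancing at Afterburner, a quick loop around nvidia-smi will do it. This assumes nvidia-smi is on your PATH; on older consumer GeForce cards some of these query fields may just come back as “N/A”, so it’s a sketch rather than a guarantee (Afterburner’s own logging is the other option):

```python
# Poll GPU temperature, fan speed and load every few seconds via nvidia-smi.
# Assumes nvidia-smi is installed with the driver and on the PATH; some
# fields may report "N/A" on older consumer GeForce cards.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,temperature.gpu,fan.speed,utilization.gpu",
         "--format=csv,noheader"]

for _ in range(10):                          # ten samples, five seconds apart
    result = subprocess.run(QUERY, capture_output=True, text=True)
    print(result.stdout.strip())             # one line per GPU
    time.sleep(5)
```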
Air-flow:
I do have a larger 140mm fan on the front of my case providing a reasonable degree of cool air to both GPUs. I also have an additional 120mm fan angled at 45 degrees in the base of the case to ensure a little extra air gets to the topmost card. Not particularly scientific, but it works.
One thing I have done to help is add an additional EXTRACT fan on my case side panel directly over the GPUs. My Palit vents most of its heat right out the back, as per the reference design. The Inno3D, on the other hand, with its different design, vents a lot of it into the case. This additional 120mm fan (quiet) does a good job of venting the excess air. I’m glad too, as during gaming this air really is rather warm – a testament to the cards’ cooling doing its job.
Conclusion:
So, there you have it. My first venture into SLI was a resounding success! I’ve had no strange issues, no micro-stutter, and every game I’ve tried has just worked – this includes older titles such as Oblivion, Fallout 3 NV, X3: Terran Conflict and Supreme Commander: Forged Alliance. I even gave the very demanding Metro 2033 benchmark a go with all settings turned up and managed an average of ~60fps.
In summary I’m dead happy I went SLI, it’s been a pain-free experience, it makes a staggering difference to my frame rates, and the additional noise levels really don’t bother me so much. Actually, my old card is still the noisier one being an older design. Possibly swapping the cards over would make things quieter as well as reduce temperatures a little.
For the record, I’ve had over a week of using SLI now and (touch wood) I’ve experienced no issues. It staggers me that I now have waayyy beyond a single GTX 580 in terms of performance. My GPUs in all cost me £450, but that’s been spaced out over the year, and there’s no way I could have got such a boost so easily for anywhere near that cost without going SLI. I’m a happy chappy.
Hope people enjoyed my rather long post; I hope it will prove of use to those maybe considering a 2nd GPU.
Next up will be watercooling the lot, which I’m planning at the moment.
Cheers,
Scoob.