8800GTX SLI + DFI Lanparty Expert compatibility

Wellcome f3-a

New member
Hi all, ATM I have 2 Gainward 7900GTX video cards that I intend to use with the DFI Expert. I know they will run without issue and at x8 x8 bandwidth in SLI mode.

First of all I would like to hear members' opinions on whether the two 7900GTXs' performance in SLI mode would be beaten by a single 8800GTX running at its default x16 bandwidth.

I know that the 8800GTX video cards run at x16 bandwidth, but can they run at x8 bandwidth, and, following on from there, could I run 2 8800GTXs in SLI on the same DFI Expert?
 
I wouldn't put an 8800 into an x8 slot TBH. (i.e. run them in SLI at x8 each)

Some of our bench results will show that 7900SLI can and will be beaten by a single 8800 GTX, but to some extent it depends on the app.

:)

K
 
I am not convinced that an x8 slot is any different from an x16 slot for current GPUs, including the 8800, although Nvidia recommends x16.

The main difference is SLI: not all games benefit, and it's in the non-SLI games that you will see the biggest performance leap for the 8800. Even in SLI the 7900s will struggle to compete with one 8800GTX (unless they are volt-modded and watercooled). Also, if you have more processing power, you will reach the limit of the 7900s before the 8800s have finished stretching their legs.

Think about power consumption too: one 8800GTX will offer the same performance as 2x 7900GTX SLI in most things but consume less power than the two older cards, and SLI 8800GTS 320MB consumes around the same amount of power as the SLI setup you have now.


If it were me I would change and see how you get on - if you need more power, buy another - but consider that the gaming performance of SLI 8800GTS 320MB will be substantially better than 1x 8800GTX in SLI-friendly games (of which there are plenty).

Hope this helps - I know it differs a bit from K404's opinion, but it's nice to get different opinions so you can make informed decisions.
 
Guys, I am certain I read somewhere that 8800GTXs will run only at x16 bandwidth - the same as what K404 advises. So this rules out running 2 8800GTXs in SLI on the DFI Expert as it's not a true x16 x16 mobo. So what I am left with is either keeping the 2 7900GTXs and running them at x8 x8 bandwidth in SLI mode or exchanging them for a single 8800GTX/Ultra which you both seem to agree has a higher performance and lower power consumption.

It's a damn shame the Expert mobo is hard-wired as it is :sad: Maybe it isn't and the real reason why you cannot run 2 8800GTXs in SLI is the software/programming. After all it does seem a little weird that a single 7900GTX will run at x16 bandwidth with the second 7900GTX running at x2 bandwidth and then when they're connected with the SLI bridge it changes yet again to x8 x8. It's certainly one to think about!

I want to get some (maybe a week or so! LOL) mileage out of these 7900GTXs so for the moment I'll be sticking with them and weighing up the pros and cons and hopefully when I decide to sell - the cost of an 8800GTX will have dropped and be covered by the selling price of the 7900GTXs. Then again there are other considerations.....I really am tempted by the single 8800GTX over the 2 7900GTXs! :D

The advice is appreciated guys and I'll keep you posted how I get on. :)
 
No problem :) If you want any help/info feel free to ask!

It was fairly widely reported that the 8800 series would only run in x16 electrical PCI-E slots.

Theory: assuming any PCI-E slot is fully wired electrically, they will all supply enough power, but the bandwidth would be pretty poor. AFAIK, there's no hardware limitation that stops people using an 8800 in an x8 (or even an x4) slot; I think it's purely a question of performance.
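For a sense of scale, the raw numbers back this up: PCI-E 1.x (what these boards use) carries roughly 250 MB/s per lane, per direction, so even an x8 slot still offers about 2 GB/s. A quick sketch of the arithmetic, assuming PCI-E 1.x rates:

```python
# Rough theoretical numbers only; real-world throughput is lower due to
# protocol overhead, and most games don't saturate the bus anyway.
PCIE1_LANE_MB_S = 250  # PCI-E 1.x: ~250 MB/s per lane, per direction

def pcie_bandwidth_mb_s(lanes: int, lane_rate: int = PCIE1_LANE_MB_S) -> int:
    """Theoretical one-way bandwidth for a PCI-E link of the given width."""
    return lanes * lane_rate

for width in (1, 4, 8, 16):
    print(f"x{width}: {pcie_bandwidth_mb_s(width)} MB/s")
```

So dropping from x16 to x8 halves the theoretical bandwidth (4 GB/s down to 2 GB/s), but that doesn't translate into halved frame rates unless the game is actually bus-limited.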

I wonder how much of the "x16 only" talk was spread by nVidia to make sure people went out and bought the latest 680i boards.

Another point is: nVidia had the 8800GX2 all ready to go, in case the 2900XT was the better card. That used a single PCI-E slot, so each GPU must get less than x16 bandwidth. Scaling that down, you have to wonder if a single 8800GTS, at least, would be bothered by an x8 or x4 data slot.

My middle name ISN'T "conspiracy" BTW, I just think too much :p

I might yet try my 8800 in an x4 slot. Apart from the S-ATA plug hassle, I'm not expecting any hardware problems.
 
name='maverik-sg1' said:
also if you have more processing power you will reach the limit of the 7900s before the 8800s have finished stretching their legs.

So what you're saying here Mav is that the 7900GTXs will hold a system back provided the cpu has sufficient processing power and conversely unless I have a powerful cpu then this (the cpu) will be my bottleneck with an 8800GTX?

K404, yes you have to wonder and speculate whether these hardware designers/manufacturers have a hidden agenda, I learned very early on that these companies have got us by the balls! :@

How are you going to fit an 8800 into an x4 slot!? :? This has got to be a joke, right? 8)
 
I think Kenny meant a PCI-E slot that only runs at x4. Some of the earlier SLI/Crossfire boards would run one slot at x16 and the other at x4.

Might be wrong tho.
 
name='Thor' said:
I think Kenny meant a PCI-E slot that only runs at x4. Some of the earlier SLI/Crossfire boards would run one slot at x16 and the other at x4.

Might be wrong tho.

The 965 chipset and its successor (P35, I think) have a full-speed x16 and an x4 PCI-E slot; only the 975 and the Nvidia NF4 x16, 590 and 6** have two x16 PCI-E slots.

Even with that limitation, it is possible to run Crossfire (or maybe SLI with a hacked driver) on the 965/P35 chipsets - which I believe is what K404 was referring to.

By the way, my 8800GTX will run at PCI-E x1; the performance is approximately 15% of the x16 slot. This is a bug on the ASUS P5B: if you overvolt the NB in the BIOS, the slot defaults to x1. That is where my belief comes from that an x8 slot not only will work but will also provide enough performance to equal, or come close enough to, an x16 slot not to worry about it.
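As an aside, on a Linux box you can check what width a card has actually negotiated by reading the LnkSta line that `lspci -vv` prints for the device. The helper below is a hypothetical sketch (the function name and the sample string are mine, though the string mimics typical lspci output):

```python
import re
from typing import Optional

def parse_link_width(lspci_output: str) -> Optional[int]:
    """Pull the negotiated PCI-E link width out of an `lspci -vv` LnkSta line."""
    match = re.search(r"LnkSta:.*Width x(\d+)", lspci_output)
    return int(match.group(1)) if match else None

# Example of the kind of line `lspci -vv` prints for a graphics card:
sample = "LnkSta: Speed 2.5GT/s, Width x1, TrErr- Train- SlotClk+"
print(parse_link_width(sample))  # -> 1, i.e. a card stuck at x1
```

A check like this would confirm whether the P5B really drops the slot to x1 after an NB overvolt, rather than inferring it from benchmark scores alone.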
 
name='K404' said:
It was fairly widely reported that the 8800 series would only run in x16 electrical PCI-E slots.

I might yet try my 8800 in an x4 slot. Apart from the S-ATA plug hassle, I'm not expecting any hardware problems.

name='Thor' said:
I think Kenny meant a PCI-E slot that only runs at x4. Some of the earlier SLI/Crossfire boards would run one slot at x16 and the other at x4.

I did consider this at first, but never having heard of a PCI-E slot with an x4 bandwidth speed, it got me slightly confused! 8)

name='maverik-sg1' said:
The 965 chipset and its successor (P35, I think) have a full-speed x16 and an x4 PCI-E slot; only the 975 and the Nvidia NF4 x16, 590 and 6** have two x16 PCI-E slots.

Ah well, now I know. I'm not too familiar with Intel chipset boards. Just a quick thought here, Mav - how well do the Intel chipsets (and boards, for that matter) play along with nVidia chipsets/graphics? Compared with the AMD s939 processors and NF4 chipset with nVidia graphics, that is, which is a perfect combination.

By the way, my 8800GTX will run at PCI-E x1; the performance is approximately 15% of the x16 slot. This is a bug on the ASUS P5B: if you overvolt the NB in the BIOS, the slot defaults to x1. That is where my belief comes from that an x8 slot not only will work but will also provide sufficient performance to equal, or come close enough to, an x16 slot not to worry about it.

Now that is strange and certainly backs up your belief about 2 8800GTXs in SLI mode running at x8 x8 bandwidth in the DFI Expert. Tell me, could you damage a video card running it at a different bandwidth than it is designed for?

So when you overvolt the NB in the BIOS on the ASUS P5B, is x1 the only available graphics bandwidth setting in the BIOS, or do you know this through other means, such as benchmarking the card?

I'll be visiting this forum much more frequently now, I'm learning so much I'd be daft not to! :)
 
You can't damage the card by putting it in an x8 slot, so don't worry about that - should be fun too.

The P5B defaulting to x1 happens when you are trying to squeeze every last MHz of FSB and need more volts to do so - so the card runs at PCI-E x1 from start-up. I have done it about 80 times trying to balance my overclock with ultimate gaming performance - the cards are still going strong (7950GX2 and 8800GTX).

Cheers

Mav
 
Mav, I understand about the bandwidth defaulting to x1, but I'm curious to know how you discovered this. Was it through trial and error, did you read it in the manual, or was it perhaps, as I suggested previously, the result of benchmarking the cards? Maybe it has been designed this way. One could argue that it hasn't, and that it's an unintentional side effect of overvolting the NB, but I don't think that's true: these board designers and manufacturers are meticulous and have years and years of experience, and these boards must be tested exhaustively before being released for sale. After all, the boards' deficiencies are unlikely to be mentioned in the sales spiel, what!

Do you volt-mod your video cards too? :O (in certain circumstances)

I apologize for getting down and dirty here but how else am I gonna learn!
 
It's a quirk that just happens sometimes - despite what board makers say, they don't give a crap how boards behave at anything other than stock. Overclocking is asking parts to run out of spec, so they don't have to support anything to do with it... sadly.

Boards aren't tested long enough to see what happens at however many combos of settings there are - a few hours' blast at stock, then packed and out the door.

There are so many quirks you see popping up over time, it's untrue.
 
Come to think of it, that is very true, K404. Now why didn't I think of that! :$ Yes, when you're overclocking you're on your own - and don't they know it! It's a cop-out if you ask me: more and more motherboard manufacturers are making their boards overclocking-friendly, and yet, like you say, they offer no support whatsoever beyond stock settings.

I tend to dig too deeply and end up burying myself! LOL :D Mind you, I don't care if I get it wrong or speculate wildly as long as it stimulates conversation and I get to learn the truth.
 
Your sig says it all ;) I've been at this for a couple of years now and I'm still on first-name terms with the dumb-ass question :D
 