7800GTX Review and benches @ Bit-Tech

FarFarAway

New member
Well, of course we all know it's been released, so I thought I'd post what Bit-Tech found with the 7800GTX in their initial review:

These are from before...you've all seen the pics :drools:

7800gtx3.jpg


000.jpg


002.jpg


The GPU is a native PCI-Express graphics processor, and thus has support for SLI – the heatsink at the end of the card is cooling the voltage regulators, and not a HSI bridge chip. SLI support is improving over time, and this driver release sees the addition of another twenty profiles that are predefined by the driver. You can, of course, add your own SLI profiles in order to make games work that don't have an optimised SLI profile created for them just yet.

Well, good news that nvidia are shipping a driver release with that many changes...should make SLi better for everyone :)

Look how long this beast is!!

They just get longer and longer:

7800review1.jpg


It also seems that nvidia have made the HSF about 10x quieter too...apparently Bit-Tech had to turn down the chipset fan on their LANPARTY because this thing barely makes a noise :)

Bit-Tech's testing method:

Bit-Tech said:
How We Tested:

Please be aware that the way we test our video cards is not a like-for-like comparison, and it is not meant to be. We decided to concentrate on finding the “Best Playable” settings - this means that we're finding the best possible gaming experience delivered on each different configuration. There are no time demos used in our evaluations - we're focusing on the real-world gaming experience, which is, ultimately, what should determine your next graphics card purchase.

System Setup

AMD Athlon 64 FX-55 (operating at 2600MHz - 13x200); DFI LANPARTY nF4 SLI-DR (NVIDIA NForce4 SLI); 2 x 512MB OCZ PC5000 Platinum Series (operating in dual channel with 2.0-2-2-5 timings); Western Digital 200GB Caviar SATA 150 Hard disk drive; OCZ PowerStream 520W Power Supply; Windows XP Professional Service Pack 2; DirectX 9.0c; NVIDIA NForce4 Standalone chipset drivers, version 6.53.

Video Cards:

* 2 x NVIDIA GeForce 7800 GTX - operating at their default clock speeds of 430/1200MHz in SLI mode using Forceware version 77.62.

* 2 x XFX GeForce 6800 GT - operating at clock speeds of 400/1100MHz in SLI mode using Forceware version 77.62 – baseline GeForce 6800 Ultra clock speeds.

* 1 x NVIDIA GeForce 7800 GTX - operating at its default clock speeds of 430/1200MHz using Forceware version 77.62.

* 1 x XFX GeForce 6800 GT - operating at clock speeds of 400/1100MHz using Forceware version 77.62 – baseline GeForce 6800 Ultra clock speeds.

* 1 x ATI Radeon X850 XT Platinum Edition – operating at its default clock speeds of 540/1180MHz using Catalyst 5.6 with Catalyst Control Center.

For the purposes of this article, we overclocked our XFX GeForce 6800 GTs to baseline GeForce 6800 Ultra clock speeds of 400/1100MHz. From now on, these cards will be referred to as GeForce 6800 Ultra to avoid confusion.

The video card drivers on GeForce 6800 Ultra and Radeon X850 XT Platinum Edition were left at their default settings with the exception of Vsync, which was disabled in all cases. On GeForce 7800 GTX, we also enabled Transparency Anti-Aliasing, setting it to 'Multi-Sampling', along with turning Gamma-Adjusted Anti-Aliasing on too.

I'm not going through all the playable settings and stuff...that's for you to read and enjoy. Suffice it to say that these babies can crank it up to 1600 x 1200 on the highest settings in pretty much all games and deliver the goods 100%.

I was most impressed with Far Cry and being able to turn on HDR level 5 @ 1600 x 1200 @ 8xAF....that strains even the best last-gen cards, and they won't be playable at it.

This thing churns through it...along with Doom 3 on the Ultra setting, holding a great framerate and not dropping down.

I think nvidia seem to go for architecture over pure clock speed with their new releases, and this one is no departure from that.

It seems HDR is much better supported in the architecture, and IMHO it's looking good :)

Summary of playability:

7800review.jpg


Bit-Tech said:
With GeForce 7800 GTX, NVIDIA have built on an architecture that has now proven itself to be flexible and designed with scaling in mind. NVIDIA are looking to scale this architecture for another year or so – at least until Longhorn arrives fashionably late.

By then, we get the impression that NVIDIA will be looking towards a unified shader approach – they will need it when Longhorn arrives, and they're adopting a similar stance to the one that ATI adopted with Shader Model 3.0, in that they will move to a unified shader architecture as and when it is required. It remains to be seen whether ATI will adopt a unified shader approach or a more conventional GPU with R520, but we will place our bets on it being a more conventional GPU for the time being, despite how radical R500, or Xbox 360's GPU, is designed.

Yep...what we all think will happen. The R520 will be a conventional part, and then both companies will have to go the way of a unified architecture to be able to utilise WGF in Longhorn :rolleyes:

So in sum:

Bit-Tech said:
In a nutshell, GeForce 7800 GTX is building on the successes of GeForce 6-series with some smart implementations in to the GPU, specifically the pixel shader. It's undoubtedly the fastest video card available to buy at the moment, and we eagerly await the response from the red corner.

Yeah, just what we expected from nvidia. They played it safe and made something they can build on - something they know will do very well for what's needed right now and up to a year from now.

As previously speculated by the :anisx: News Team:

Bit-Tech said:
We get the impression that ATI are waiting for final performance numbers on GeForce 7800 GTX before they go ahead and finalise the clock speeds of the top-end SKU based around their upcoming R520. By that time, NVIDIA have the option to release a faster part that will more than likely be labelled a GeForce 7800 Ultra, seeing as they've only used a single-slot cooling solution that is constructed entirely from aluminium – we await the arrival of a dual-slot copper-based cooler once R520 has 'beaten' GeForce 7800 GTX in a few benchmarks.

I think that the battle will continue. I just hope ATI don't hold back too much when they release the R520....meaning nvidia will really have to go some to get the 7800 Ultra to outperform the R520.

So enjoy and someone buy it!! :D

Full article and Thanks to Bit-Tech
 
FragTek said:
hrmmm... lookin' pretty good!

Everyone's moaning "it's not that good" and "the R520 will be better", but in reality, where is the R520? Where are the ATI cards that are whooping nvidia? nvidia haven't even released their "Ultra" yet, and already the benchies on these cards look immense....I won't even play at that high a res....but HDR with playable frames (more than playable) would be awesome :worship:
 
Dave said:
My guess is Frag. It has to be next on his upgrade list.

Nope! Redline is next on my list :) Then prolly my Vapochill.... Then either a 7800 or another 6800 to SLi.
 
FragTek said:
Nope! Redline is next on my list :) Then prolly my Vapochill.... Then either a 7800 or another 6800 to SLi.

Yeah, the PCP&C has already been bought, so my list goes:

6800GT for SLi....then X2 hopefully :)

Gonna wait for the Ultra to come out before getting the next gen ;) Then I won't need SLi any more :D
 
So then in 2 or 3 years, when these will be available for £80, I might just be able to afford one... the only problem is that I can't plug my monitor into it!
 