Optimal Clock Settings for 7800s

maverik-sg1

New member
Information provided by HeavyH2o over at XS:

After some testing, the best points for the core clock of the nVidia 7800 series cards are whole multiples of the 27 MHz core frequency oscillator, less the 40 MHz delta explained below. That calculates to the following root core clocks:

(17 x 27) - 40 = 419
(18 x 27) - 40 = 446
(19 x 27) - 40 = 473
(20 x 27) - 40 = 500
(21 x 27) - 40 = 527
(22 x 27) - 40 = 554
(23 x 27) - 40 = 581
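
A quick sanity check of that arithmetic as a small Python loop (purely illustrative; the 27 MHz oscillator and 40 MHz delta come from the explanation in this post):

# Root clock targets: whole multiples of the 27 MHz oscillator,
# less the 40 MHz root-to-vertex delta described further down.
for n in range(17, 24):
    print(f"({n} x 27) - 40 = {n * 27 - 40}")
# Prints 419, 446, 473, 500, 527, 554, 581.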

And here is how sitting a single MHz below the whole-number threshold affects 3DMark05 scores (root clock - score):

445 - 8052
446 - 8222
472 - 8404
473 - 8508
499 - 8669
500 - 8753

Now, what about the other two cores? How are those controlled?

Well, when you adjust the root clock, you actually adjust all three clocks.

The root clock gets bumped by 40 MHz to represent the real geometry, or vertex, core clock. So, let's pick a 450 core, which so many vendors use. 486 is the best vertex clock value (18 x 27), which translates to a 446 root (486 - 40). So, we would actually have a 490 vertex clock using a 450 root clock, a 4 MHz pad over the required frequency.

But the shader and ROP cores start at 415 (really 418.5) in low power, and jump in whole frequency values (i.e., 17 x 27 = 459) up to the value that is equal to the root frequency or less than 13.5 MHz over it. In this case the root frequency is 450, which is 18 MHz over 432 (16 x 27) and 9 MHz shy of 459 (17 x 27), so 459 is the ROP/shader clock.

So, it's a little confusing, but you do have some control over the other two clocks.
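
If it helps, here is a minimal Python sketch of those two rules as I read them - a sketch under the stated assumptions (the 40 MHz delta, and shader/ROP snapping to the nearest whole multiple of 27 MHz), not a definitive model of the hardware:

OSC = 27.0    # reference oscillator, MHz
DELTA = 40.0  # root-to-vertex clock bump, MHz

def vertex_clock(root):
    # Geometry/vertex clock: the root clock plus the 40 MHz delta.
    return root + DELTA

def shader_rop_clock(root):
    # Nearest whole multiple of 27 MHz, i.e. the multiple that is equal
    # to the root frequency or within 13.5 MHz of it.
    return OSC * round(root / OSC)

print(vertex_clock(450), shader_rop_clock(450))  # 490.0 459.0, as in the worked example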

So, here is a list of root clock OC targets: primary, vertex-based ones (shown in red in the original post, marked "vertex" here) and secondary, ROP- and shader-based ones (shown in blue, marked "ROP/shader" here). The vertex-based targets will yield the larger gain, and you will see a second, albeit smaller, gain on the ROP/shader-based targets where the optimal memory frequency matches the root clock.

419 (vertex)
432 (ROP/shader) - the nVidia spec root clock
446 (vertex) - so many vendors use 450
459 (ROP/shader) - why BFG uses 460
473 (vertex)
486 (ROP/shader) - why eVGA, XFX, and BFG use 490 on limited cards
500 (vertex)
513 (ROP/shader)
527 (vertex)
540 (ROP/shader) - how I got 9539 in 3DMark05 with a lowly 3200+ and one 7800 GTX ;)
554 (vertex)
567 (ROP/shader)
581 (vertex)
594 (ROP/shader)
608 (vertex)
621 (ROP/shader)
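
The same list can be regenerated in a few lines of Python (again, just a sketch of the arithmetic above):

# Vertex targets are multiples of 27 less the 40 MHz delta;
# ROP/shader targets are the multiples of 27 themselves.
vertex_targets = {27 * n - 40 for n in range(17, 25)}  # 419 .. 608
shader_targets = {27 * n for n in range(16, 24)}       # 432 .. 621
for clk in sorted(vertex_targets | shader_targets):
    print(clk, "vertex" if clk in vertex_targets else "ROP/shader")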
 
Some GT-Specific Information

Thanks to John at XS:

If anyone owns a 7800GT like me, here's some info (all MHO). :)


MHz set in RivaTuner <--> 3DMark05 score <--> Core/Shader/ROP/Mem as read in the RivaTuner monitor

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

400 - 6928 - 441, 405, 405, 501.19
401 - 6987 - 441, 405, 405, 501.19
402 - 6981 - 441, 405, 405, 501.19
403 - 6977 - 441, 405, 405
404 - 6983 - 445.5, 405, 405
405 - 6994 - 445.5, 405, 405
406 - 7015 - 445.5, 405, 405
407 - 7027 - 445.5
408 - 7025 - 450, 405, 405
409 - 7022 - 450, 405, 405
410 - 7045 - 450, 405, 405
411 - 7009 - 450, 405, 405
412 - 7093 - 450, 418.5, 418.5
413 - 7100 - 453.6, 418.5, 418.5
414 - 7127 - 455.63, 418.5, 418.5
415 - 7130 - 455.63, 418.5, 418.5
416 - 7127 - 455.63, 418.5, 418.5
417 - 7148 - 459, 418.5, 418.5
418 - 7137 - 459, 418.5, 418.5
419 - 7125 - 459, 418.5, 418.5
420 - 7125 - 459, 418.5, 418.5
421 - 7142 - 459, 418.5, 418.5
422 - 7165 - 464.06, 418.5, 418.5
423 - 7154 - 464.06, 418.5, 418.5
424 - 7163 - 465.75, 418.5, 418.5
425 - 7178 - 465.75, 418.5, 418.5
426 - 7263 - 468, 432, 432 <--- Big jump in shader/ROP domain = better score!
427 - 7243 - 468, 432, 432
428 - 7224 - 468, 432, 432

The numbers pretty much sum it all up. Tested on my LeadTek 7800GT, threshold set at 40. You have to find the right general core frequency at which the shader/ROP clocks change significantly.

I suggest you fire up ATITool, start the 3D window, and work your way up from 400 --> xxx MHz, checking where the shader/ROP domain frequency increases the most; that's the frequency at which you will get the best performance. You don't need to run 3DMark and waste 4.5 minutes on every MHz like I did :D


And setting a higher threshold, IMHO, means you will get a higher shader/ROP frequency at lower core geometry speeds, which again means higher scores. Look at the difference between the frequency I set in RivaTuner and the frequency the RivaTuner monitor reads out --> it's around 40 MHz, the threshold that you can set in the BIOS (I suggest using the latest NiBiTor to do that...)
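
If you want to skip the benchmark runs entirely, here is a small Python sketch over an abbreviated copy of the readings in the table above (the data points are John's; the script just reports where the shader/ROP domain jumps):

# Requested core clock -> shader/ROP reading, abbreviated from the table.
readings = {400: 405.0, 411: 405.0, 412: 418.5, 425: 418.5, 426: 432.0, 428: 432.0}
prev = None
for req in sorted(readings):
    if prev is not None and readings[req] > prev:
        print(f"shader/ROP domain jumps to {readings[req]} MHz at {req} MHz requested")
    prev = readings[req]
# Flags 412 and 426 - the sweet spots visible in the table above.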
 
AMEN! Nice one Mav! Will REALLY devour all that info later on :)

Is it really OK to use ATITool to mess around with these core clocks?
 
Other than Google, does anyone have info on this for a 6800GT? I would hate to think that running the core at 410 MHz is worse than at 409 MHz.
 
name='chris_ah1' said:
Other than Google, does anyone have info on this for a 6800GT? I would hate to think that running the core at 410 MHz is worse than at 409 MHz.

Try it and report back - it seems logical that there is a multiplier; you just have to find it, either by experimenting or by research.

I think I read somewhere once that the multiplier was 23 - use that as your base and give it a go.
 
OK, so my GTX, stock clocked at 450/1250, would be better running at 454/?... What memory clocks are best? I got a little confused reading the explanation... I've always been a maths idiot :o


I'd like to give this a go and see if this is right... if true, I can feel a custom BIOS coming on... :D
 
I don't get what it's saying either. 513 was no different to 520 - at least over two 3DMark03 runs it was within the margin of error, and 3DMark05 gave me 200 points less.
 
I tried ATITool to check thresholds etc., and it didn't really let me do anything :( (I saw that coming, but tried anyway). Is there any other software that'll let me tweak the right things for ROP/shader etc.?

Kenny
 
K404 said:
I tried ATITool to check thresholds etc., and it didn't really let me do anything :( (I saw that coming, but tried anyway). Is there any other software that'll let me tweak the right things for ROP/shader etc.?

Kenny

Hi Kenny - it's interesting that you should say that; I'll have a look around. Is it possible that RivaTuner 15.7 lets you set those directly, or does it just mean that you set the clocks you can set and the other ROP etc. clocks fall into place?

424 - 7163 - 465.75, 418.5, 418.5
425 - 7178 - 465.75, 418.5, 418.5
426 - 7263 - 468, 432, 432 <--- Big jump in shader/ROP domain = better score!
427 - 7243 - 468, 432, 432
428 - 7224 - 468, 432, 432

Mav
 
RivaTuner lets me set core clock and RAM (without checks... hurrah!), but I can't see what the vertex and ROP clocks are doing unless I open the hardware monitor page. I can only open one instance of RivaTuner at a time, so I can't view both pages simultaneously, and the hardware monitor only seems to show 2D results?

From the off, I couldn't see how or why an ATi program would do the business with nVidia, but it worked for the guy in the first post?

K

EDIT: I've seen the complete thread at XS... looks like my BIOS doesn't have a delta setting, which would explain most of it :)

Monday Morning Update: I got a Delta now though! Muahahahaha
 