Hi all,
I've popped this into the Overclocking sub-forum as I think it will be of interest to us overclockers, who are likely more concerned about power use than most.
After some discussions in another thread I thought I’d revisit the whole “how much power does it use” topic. Quite some time back I did some testing with one of those wall-socket meters that show how much power you’re drawing at any given moment. I’d not done this for my current rig, so today I thought I’d run some tests.
It’s in my Sig, but here’s my current gaming PC specification:
CPU: Intel Core i5-2500K @ 4.5GHz (reports 1.35-1.4v depending on load, typically towards the lower end, but IBT can see 1.4v)
GPU: Two GTX 570s in SLI – one Inno3D @ 732MHz (its stock clock) and one EVGA @ 732MHz (797MHz stock for this card)
Motherboard: ASUS P8Z68-V Pro – I use the onboard sound.
RAM: 24GB DDR3-1600 – 2x4GB + 2x8GB
PSU: Corsair HX750w
Monitor: LG Flatron W2452T – basically your standard older 24” 1920x1200 LCD. No built-in speakers.
Speakers: some old 2.1 Creative Labs jobbies from yesteryear, fairly meaty sub (ON in all tests)
Cooling: Custom external water loop powered by a separate 90w external PSU. D5 pump.
Note: I am measuring at the wall socket. Plugged into this socket is my 6-gang power strip, which runs the main PC, the external cooling loop, the monitor, the speakers (with sub), as well as my laptop, which is OFF but charging.
Test 1: Just the External Loop on, PC is OFF (note: laptop still docked and charging on this circuit)
Power Usage Reported: 37w – not too bad considering that's a D5 pump on FULL and 4x 180mm LED fans at 12v.
Test 2: Loop on, PC powered up and sat idle on W7 desktop. Monitor ON.
Power Usage Reported: 230w – note: after a short while this dropped to 205w and stayed there; I assume that’s the various power-saving features kicking in, with the CPU down to 1.6GHz and the GPUs at their idle state of 50MHz.
Test 3: Loop on, PC powered up and sat at idle on desktop. Monitor OFF as a quick check.
Power Usage Reported: 145w – so my old monitor evidently uses about 60w. Sounds about right for a panel of its age.
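For clarity, that 60w is just the settled idle reading with the monitor on (Test 2) minus the reading with it off here – a back-of-envelope estimate rather than a measurement of the monitor on its own:

    # Rough monitor draw estimate from the two idle readings
    idle_monitor_on = 205   # w, Test 2 after the power-saving features kicked in
    idle_monitor_off = 145  # w, Test 3
    print(idle_monitor_on - idle_monitor_off)  # ~60w attributed to the monitor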
For the next few tests, just to be clear, the Loop is ON, the PC is ON, the Monitor is ON and my Speakers are ON, as are my Logitech keyboard (with its LED key backlighting) and my Logitech mouse.
Test 4: Heaven 3.0 – running at 1920x1200, API – DirectX 11, Tessellation – Normal, Shaders – High, Anisotropy – 16x, Stereo 3D – Disabled, Multi monitor – NO, Anti-Aliasing – 8x, Full Screen – Yes, Resolution – System (1920x1200)
Power Usage Reported: 525w peak – this test pushes the GPUs nicely, but the CPU isn't kept anywhere near as busy.
Test 5: 3DMark Vantage – running at 1920x1200 in Extreme mode. PPU disabled.
Power Usage Reported: 635w peak – note: I saw no more than 565w peak until the very last feature test.
Test 6: OCCT – CPU Test, Linpack, 25% RAM (of my 24GB), AVX enabled.
Power Usage Reported: 228w – that’s just the CPU working hard, 100% load on all cores.
Test 7: OCCT – GPU Test. This is effectively Furmark, run in full-screen mode at maximum complexity.
Power Usage Reported: 674w – we have a new record! Interesting to see, but not really a realistic test in my view.
Test 8: Skyrim gameplay. Note: my Skyrim is beyond Ultra, with many additional graphical enhancements such as texture, mesh and lighting mods. It runs at a constant 60fps for the most part, but it makes heavy use of both GPUs AND the CPU concurrently. This figure is the peak value I saw during random gameplay, all of it outdoors, some in open spaces and some in cities and towns. I used god mode along with a conjuration mod to spawn MANY (30-40) Flame Atronachs and have them fight – this did push the power usage up a little.
Power Usage Reported: 525w peak – less than I expected. However, I could potentially beat this figure during regular gameplay, as some areas work the PC that little bit harder. I did do the usual “go to Dragonsreach and look down into Whiterun” test, as this remains one of the harder scenes in the game from a resource-usage standpoint – and since I run Improved Whiterun, there's a crap-load more trees and clutter in that view than standard.
So, in summary, in typical (for me) gaming I never see more than 525w at the wall. Remember, this is power to my PC, my external water loop, my monitor, my speakers and, of course, my keyboard and mouse, and my laptop is also on charge in its dock. It’s only synthetic benchmarks that push usage over 600w peak. 3DMark Vantage has proven to be a very good test of both CPU and GPU stability, so for that to hit 635w peak seems reasonable. OCCT’s Furmark-style GPU test, however, while good for testing stability, is an unrealistic load in my view – and it did pull the most power at 674w peak, a full 150w more than my Skyrim install, which I consider a good real-world test.
I might try Crysis 2 at some point, as that too (if I recall correctly) seems to push both CPU and GPU hard during play.
So, let’s think about this for a moment. I have a good, but fairly modest, 750w PSU. This PSU powers JUST my PC, as the monitor, speakers and water-cooling loop are externally powered. From the absolute peak draw of 674w we saw during OCCT’s Furmark-style test, we can take away the 37w we know my external loop draws and the roughly 60w my monitor draws. That means my PSU is only being asked for about 580w at the wall – and since the meter reads wall power, the DC load on the PSU itself will be a bit lower still once its efficiency is factored in. Not too bad considering it’s powering an overclocked 2500K and a pair of GTX 570s! I might bump my GPU clocks up to their 850MHz profile and run some of the tests again if anyone is interested? It’d be interesting to see how much difference it makes – especially in Furmark!
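For anyone who wants to play with the numbers, here’s that back-of-envelope sum written out in Python. The 88% efficiency figure is purely my assumption for a unit like the HX750 under heavy load, not something I’ve measured:

    # Rough estimate of what the PSU itself is being asked to deliver
    peak_at_wall = 674    # w, OCCT GPU (Furmark-style) test, whole power strip
    external_loop = 37    # w, measured with the PC off (Test 1)
    monitor = 60          # w, estimated from the Test 2 vs Test 3 difference

    pc_wall_draw = peak_at_wall - external_loop - monitor
    print(pc_wall_draw)   # ~577w drawn from the wall by the PC alone
    # (speakers and the charging laptop are also on the strip, so the real
    # figure is a touch lower still)

    # The meter reads AC power at the wall; the DC load on the PSU is lower.
    assumed_efficiency = 0.88  # ASSUMED, illustrative only
    print(round(pc_wall_draw * assumed_efficiency))  # ~508w of actual DC load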
I found this all quite interesting – I wonder if anyone else will? Lol.
Cheers,
Scoob.