VRAM Usage Tests

I like how when you max out all the settings in UT2004 it says "Insane". 1680x1050 isn't insane any more!
 
Just tried Battlefield 3 out for myself: settings all on low or off, resolution at 5760x1080 with my SLI GTX 460 1GB.

Highest I saw it go was about 953MB.

So:

Battlefield 3 (Frostbite 2), 5760x1080, Settings all on low or off, (SLI GTX460 1GB), 953MB+
 
Just played a bunch of games with my 6850 and got a variety of results. I ran all of the games on Ultra settings, everything maxed out completely, with V-Sync turned off in all games too.

First game is APB:Reloaded

7060841617_a6d8244a4b_b.jpg


Nice parking on a ticket booth...

7060841409_f9cc2a9290_b.jpg


APB is very demanding, especially for a DX9 game. I get FPS spikes sometimes, but my 6850 keeps it at the 60FPS cap most of the time.

Counter-Strike: Global Offensive Beta

6914759844_47edd1d0df_b.jpg


VICTORY!

7060840773_8800686137_b.jpg


CS:GO is not very demanding but it does use up a fair bit of VRAM.

Deus Ex: Human Revolution

6914759298_b24d78eae2_b.jpg


SELF DESTRUCTION

6914759210_45819da910_b.jpg


Strangely, it seems DX:HR doesn't use much VRAM at all, only around half of my available 1GB.

One of my favourites next; Killing Floor

7060840265_7ed0934e3e_b.jpg


90FPS cap.
As you can see it's not a demanding game (it is DX9, after all) but it does use more VRAM than Deus Ex.

7060839993_d6b543f2ab_b.jpg


The boss guy...

6914758514_fcce93f1d3_b.jpg


58% GPU usage in the image above. Damn, the fan speed is higher than the GPU usage!


Saints Row: The Third, a game my 6850 struggles with at Ultra settings, mostly sticking around 25-35FPS. Turned down to High it reaches 40-45FPS.

6914758272_86ac56ffcf_b.jpg


7060839331_ce38fe26e6_b.jpg


As you can see this crazy game uses as much VRAM as it can get its hands on.

Now, errr, Terraria?

6914757824_a92c8592c0_b.jpg


7060838851_5cc6783fb5_b.jpg


Yeah... Not much to say here... Imagine if there was no FPS cap...


Next is Test Drive Unlimited 2, which my 6850 only just copes with even though it's a DX9 game; my guess is it's a poor console port.

7060838759_8302f3c453_b.jpg


On the move!

7060838455_f40f9cc8a2_b.jpg


FPS jumps right up once you're out of the driving seat.

7060838333_4ee2435d1d_b.jpg


TDU2 uses all of my VRAM too.

Finally we have Skyrim, which, thanks to the latest AMD drivers, is messed up when AA is switched on. You have to leave it off, or else you get this...

6914756932_596cafddd3_b.jpg


And this...

7060837829_79dd7cde00_b.jpg


And this...

7060837663_112bbca50a_b.jpg


Damn it, AMD!
Apart from the crappy driver issues, Skyrim also uses a fair bit of VRAM. FPS is very consistent too and stays at the 60FPS cap.
 
Deep Black, Highest, 1920x1200, 480 1.5G, 320MB+



Recurring Evil, Highest, 1024x768, 480 1.5G, 443MB+

l_recurringevil_2012_04_06_16_16_51_923.jpg


Total War: Shogun 2, Highest, 1920x1200, 480 1.5G, 947MB+



Ridge Racer Unbounded, Highest, 1920x1200, 480 1.5G, 439MB+

 
If I have time I might play around with the onboard 6310 and see whether having 256/512/1024MB of system memory allocated in the BIOS makes a difference.

E350 @ 1.68GHz + 7900GS @ 570/720MHz, tests run with F@H SMP in the background -----------------------------------------------

GEOMETRY WARS . AA ENABLED . 1024x768

Never exceeded 77MB

30-40% GPU usage

GRID . LOWEST DETAIL . 4xMSAA . 1024x768

254MB constant, max 256MB - 180MB in menus

97-98% GPU usage

--- FRAMERATES SMOOTH/ PLAYABLE ---
 
Hard Reset, All maxed, 1920x1200, (GTX480 1.5G), 779MB+

s_hardreset_2012_04_10_12_26_57_855.jpg


Rage, All Maxed, 1920x1200, (GTX480 1.5G), 1242MB+

s_rage_2012_04_10_00_06_06_860.jpg


World of Warcraft, All Maxed - Dx11, 1920x1200, (GTX480 1.5G), 761MB+

s_wow_2012_04_10_12_18_14_686.jpg
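The one-line summaries in this thread follow a fairly consistent "Game, Settings, Resolution, (Card), VRAM+" shape, so they are easy to tabulate. Here is a minimal sketch of doing that; the regex and `parse_result` helper are my own, not anything from the thread:

```python
import re

# Parse the thread's one-line summaries of the form
#   "Game, Settings, Resolution, (Card), NNNMB+"
# into fields so the results can be compared side by side.
LINE_RE = re.compile(
    r"^(?P<game>[^,]+),\s*(?P<settings>[^,]+),\s*"
    r"(?P<res>\d+x\d+),\s*\(?(?P<card>[^,)]+)\)?,\s*(?P<vram>\d+)\s*MB\+?$",
    re.IGNORECASE,
)

def parse_result(line):
    """Return a dict of fields, or None if the line doesn't match."""
    m = LINE_RE.match(line.strip())
    if m is None:
        return None
    d = m.groupdict()
    d["vram"] = int(d["vram"])  # megabytes observed
    return d

for line in [
    "Hard Reset, All maxed, 1920x1200, (GTX480 1.5G), 779MB+",
    "Rage, All Maxed, 1920x1200, (GTX480 1.5G), 1242MB+",
    "World of Warcraft, All Maxed - Dx11, 1920x1200, (GTX480 1.5G), 761MB+",
]:
    print(parse_result(line))
```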
 
I have been reading through a thread on Kepler over at Aria, and there's a guy on there who has a 680 and is running a triple-screen setup @ 5760x1080.

Even on High settings in BF3 he is only using 1.6GB of VRAM; he can't run Ultra because the card just isn't powerful enough. He also mentions in another post that 99.9% of other games barely use 1.5GB.

Here is the thread; the guy's name is Seb.F. Read pages 147-149:

http://forums.aria.c...er-Chat/page147

Pretty interesting, and it shows that VRAM isn't as big a deal as people make out; so far it seems only BF3 is a VRAM-demanding game.

There's a thought around that games pick up on how much memory a graphics card has and adapt accordingly. Some game engines very probably do, but I'd hesitantly say those checks were designed for older games, where dropping into system DDR via Vista-esque expansion of graphics memory was a bad idea.

I've seen posts, although not many on here, where newer game engines make use of more memory than the graphics card has onboard. Chances are the drop into DDR2 or DDR3 isn't that much of a big deal. Again, they're possibly using it strategically.

That's speculation; only the game developers could verify it, as opposed to word on the street.

Having enough grunt to process it all, though, is another thing.
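To illustrate the "engines adapt to available memory" idea, here is a hypothetical sketch of mip-level streaming, where an engine drops texture detail until the working set fits a reported VRAM budget. Every name and number here is made up for illustration; no real engine's behaviour is being quoted.

```python
# Hypothetical illustration: drop mip levels (each level quarters the byte
# size, because halving width and height quarters the pixel count) until
# the incoming texture set fits the remaining VRAM budget.

def choose_mip_level(texture_bytes, vram_budget_bytes, used_bytes):
    """Return (mip_level, resident_bytes) after shrinking to fit."""
    level, size = 0, texture_bytes
    while used_bytes + size > vram_budget_bytes and size > 1:
        size //= 4   # one mip level down
        level += 1
    return level, size

# Example: 1GB card, 800MB already in use, a 512MB texture set arriving.
level, size = choose_mip_level(512 * 2**20, 1024 * 2**20, 800 * 2**20)
print(level, size // 2**20)   # drops one mip level -> 128MB resident
```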
 
Another interesting thing to note: my friend, who is running 2x GTX 295s, which have 1GB of RAM per GPU, can run BF3 at max settings, when I somehow need 1.6GB to run it that high.

Let's have some of his screenies.


There's word around that Afterburner, depending on the version maybe, gets confused by CrossFire/SLI memory counts. However, I've only heard of it reporting too much rather than not enough. Weird.
 
Sorry, I don't know how much VRAM he's using, as he doesn't use Afterburner.

Turns out he only uses one 295, but he can still run the game at max with constant FPS, which shows he's not having massive FPS drops due to a lack of VRAM.
 

I think it might be reporting 2x 800MB with the cards in SLI.
 
As promised, here are some screenshots of NVIDIA Surround at 8120x1600. Due to the screenshots' sizes I will not embed them directly; you can click on them to view them larger.

BF3 @ 2560x1600: http://dl.dropbox.com/u/7299662/Surround_Screenshots/bf3_2012_04_12_23_21_04_300.jpg

BF3 @ 8120x1600: http://dl.dropbox.com/u/7299662/Surround_Screenshots/bf3_2012_04_12_23_21_32_328.jpg

At 8120x1600 I was getting about 2 FPS and everything was appearing and disappearing, including my gun, which as you can see has disappeared in the second screenshot. It was flashing on and off about every other second. The same thing happens even if I set everything to High or Medium; I have to set almost every setting to Low for it to actually be playable.

TF2 @ 8120x1600 (Settings): http://dl.dropbox.com/u/7299662/Surround_Screenshots/hl2_2012_04_12_23_16_17_713.jpg

TF2 @ 8120x1600 (Gameplay): http://dl.dropbox.com/u/7299662/Surround_Screenshots/hl2_2012_04_12_23_16_52_121.jpg

This fared a bit better. Memory usage is obviously still at the maximum, but I got 60FPS. It was still not playable, though, because of a kind of stuttering effect; it felt like walking in custard. If I turn the AA off entirely, however, it is completely playable.

If there are any other games you want to see that I have, let me know, although I think pretty much everything I play at this resolution will use the full 1.5GB on my cards (their maximum memory). Even TF2, a six-year-old game, needs tons of memory at this res. When I play it at 2560x1600 using the same settings I get a smooth 300FPS (the game's maximum if you don't remove the FPS limiter) and it plays great.
 
I'll do it when I get home tonight! 1920x1080, Ultra settings, in the most graphically demanding encounter I know, "Ultraxion 25-man". Hehe, my GTX 580 only pulls roughly 30-45 FPS on that boss, so it will be interesting to see how much VRAM it uses! I'll also do a normal-case scenario: questing in the wilderness and doing laps of SW.

I'm afraid I cannot do this: World of Warcraft sees the OSD as a cheat/hack, and as soon as I enable it the game crashes to desktop with a critical error. The only way I can think of doing it is setting up my own offline private server.
 

.. How weird; it worked for me, as you can see in the list in the OP.
 

The info you're giving here is vitally important to this comparison test.

The fps (I think personally) is another study.

The extreme resolutions, 8000+ pixels wide (can't believe I just typed 8000 for pixel width), are right around the top resolution we could plausibly theorise as the highest for memory use in games. Being able to compare the same game at 1920x1080 and 8120x1600 gives us an idea of whether VRAM use scales as we increase resolution. My personal theory was that all the textures etc. will be loaded no matter what resolution you use, and the only extra would be for frame reproduction, which "shouldn't" be that much. More results will test the theory.
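The "only extra is frame reproduction" theory can be sanity-checked with back-of-envelope numbers. A rough sketch, assuming 4 bytes per pixel, a triple-buffered swap chain, and one 32-bit depth/stencil surface (a simplification; real games add resolution-sized render targets for AA, shadows and post-processing, so this is a floor, not a measurement):

```python
# Back-of-envelope framebuffer cost: colour buffers plus one depth/stencil
# surface, all at 4 bytes per pixel, reported in MB.

def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
    colour = width * height * bytes_per_pixel * buffers
    depth = width * height * 4
    return (colour + depth) / 2**20

print(framebuffer_mb(1920, 1080))   # ~31.6 MB
print(framebuffer_mb(8120, 1600))   # ~198.2 MB
```

Even at 8120x1600 that's under 200MB, so raw framebuffer cost alone can't explain a 1.5GB card filling up; the rest generally comes from textures, geometry and the extra intermediate render targets.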

.. Still can't get over an 8000+ resolution, lol. I dare anyone with Eyefinity and six screens to post their info.

The FPS, as far as I'm concerned, is about the processing grunt to turn the textures into a playable environment at the resolution. With that in mind, we could theoretically have 2x 470s (with 480-sized memory) at the same resolution be able to show the screen and use the same amount of memory, but sit at 0.5FPS where 480s do 2FPS. Point being: the memory usage is the same, but the grunt to process the resolution is not enough.

Great stuff, will process the results in a bit.
 
Even with an Eyefinity 6-screen setup you won't get a much higher resolution than this, as you can't use 30" displays in that setup. 6-way Eyefinity is limited to six displays at 1920x1200 each, or three displays at 2560x1600.

6x 1920x1200 = 5760x2400 = 13,824,000 Pixels

3x 2560x1600 = 7680x1600 = 12,288,000 Pixels

3x 2560x1600 = 8120x1600 = 12,992,000 Pixels <- With Bezel Compensation

So it's almost the same, I have wider but they have taller.

EDIT:// Actually, I just looked it up and it seems you can do 30" panels with 6-way Eyefinity, but it requires multiple cards due to all the bandwidth (each display uses a dual-link DVI connector). That would be about 24 million pixels.
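The pixel counts above are easy to double-check. In the sketch below, the 6x 2560x1600 row assumes a 3x2 landscape grid, which is my reading of the "24 million pixels" remark:

```python
# Double-checking the pixel counts for the Eyefinity layouts quoted above.
layouts = {
    "6x 1920x1200": (5760, 2400),
    "3x 2560x1600": (7680, 1600),
    "3x 2560x1600 + bezel comp.": (8120, 1600),
    "6x 2560x1600 (multi-card)": (7680, 3200),
}
for name, (w, h) in layouts.items():
    print(f"{name}: {w * h:,} pixels")
```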
 

That's how I believe it works. Legit Reviews just did a series of articles on the 680, 680 SLI, 680 in Surround, and 680 SLI in Surround, and the scaling was the same no matter what. The textures are already loaded into VRAM; you just have to access it more often, as each screen adds to VRAM access time. So running out of VRAM doesn't seem to be as big an issue as running out of VRAM bandwidth (or VRAM that's just too slow compared to the processing cores).

I was still hoping to see a little more scaling with two cards. With one card and three screens you only have one VRAM system serving all three screens, so you expect the FPS to tank a bit; with two cards and three screens I was expecting two VRAM systems serving three monitors to pull ahead a bit more than it did.
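Some rough numbers on the bandwidth side of this. Scanning the finished frames out to the displays is cheap; the sketch below (my own arithmetic, assuming 4 bytes per pixel at 60Hz) shows even triple-wide surround needs only a few GB/s, a sliver of a GTX 480's roughly 177GB/s of GDDR5 bandwidth. So the FPS hit comes from rendering three times the pixels each frame, not from feeding the monitors.

```python
# Bandwidth needed just to scan finished frames out to the displays,
# assuming 4 bytes per pixel at 60Hz. Rendering traffic (texture fetches,
# render-target writes) is ignored entirely here.

def scanout_gbps(width, height, hz=60, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * hz / 1e9

print(scanout_gbps(1920, 1080))   # ~0.5 GB/s for one 1080p screen
print(scanout_gbps(8120, 1600))   # ~3.1 GB/s for triple-wide surround
```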
 