Shadow of Mordor requires 6GB of VRAM?

WYP

News Guru
It has been revealed that Middle-Earth: Shadow of Mordor will require 6GB of VRAM for Ultra settings at 1080p.

Read more on Middle-Earth: Shadow of Mordor's PC requirements here.
 
I prefer the wording "can use 6GB", because it certainly doesn't need it, that's for sure.
 
I pre-ordered this last week, and watched loads of footage over the weekend. While it looks pretty... 6GB of VRAM sounds daft! lol
 
Oh! My 980 is obsolete, and it hasn't even been delivered yet!

I'm assuming it needs the VRAM because the textures aren't optimized!
 
No, they are giving you the option to use high resolution textures. It doesn't NEED that much VRAM... It runs on a console! :lol:
 
Damn, 6GB of VRAM for perfect Ultra settings at 1080p! Gotta lower the settings a bit to play at 1080p decently with 4GB of VRAM.
 
I find it hard to believe that this game is ever going to be displaying a scene at 1080p that has 6GB of textures in it at any one time - even caching ALL the textures in a level, it seems extreme. 100% lazy console port. Just because consoles work a little differently memory-wise, they're being lazy by not making the port work the way PCs work.
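A quick back-of-the-envelope sum - all sizes purely illustrative, I've no idea what assets they actually ship:

    # Rough texture memory maths (made-up sizes, not SoM's real assets).
    def texture_mib(width, height, bytes_per_texel, mipmapped=True):
        base = width * height * bytes_per_texel
        return base * (4 / 3 if mipmapped else 1) / 2**20  # +~33% for mips

    compressed = texture_mib(4096, 4096, 1)    # BC3/DXT5-style, ~21 MiB each
    uncompressed = texture_mib(4096, 4096, 4)  # raw RGBA8, ~85 MiB each
    print(6 * 1024 / compressed)    # ~288 unique 4K textures to fill 6GB
    print(6 * 1024 / uncompressed)  # ~72 even with zero compression

Even with zero compression, that's seventy-odd unique 4K textures resident at once, which is why I struggle to see a single 1080p scene genuinely needing it.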

This is a shame, as I'd hoped to see BETTER ports now that both the Xbox One and PS4 are basically just entry-level gaming PCs in effect. I hope this doesn't start a trend of ports having artificially high system requirements. Plus it doesn't bode well for ports in general - what other corners are they going to cut?

Scoob.
 
Welcome back Scoob, long time no see.

I'd probably advise waiting and seeing before claiming it's a lazy console port, though ;)
 
Hopefully it will be something like having really long draw distances, so you can see an enemy army from miles away, all "rendered up" with none of the texture drop-ins :)
 
Hey SPS, thanks.

Well, I said "lazy", but equally we know things work a little differently now with console memory architecture, despite their firm "PC" roots. I fully expect "ultra" settings on newer games to be asking for 3 or 4GB of VRAM (likely leaving me at just "high" settings with my 2GB), but 6GB does seem a bit silly when we're only talking "TV" resolutions of 1080p, after all.


I suspect that at launch this title will be overly demanding of VRAM, so only a very small percentage of PC gamers will be able to view the game in its full glory. However, I'd equally expect future updates to reduce this high VRAM requirement, letting gamers with "only" 3-4GB of VRAM enjoy the higher settings.

On the face of it - having not pulled apart any code - it does look like a case of "let's just store everything here until we need it" (here being VRAM), rather than the more conventional PC scenario of drive > RAM > VRAM as needed.
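Something like this, in toy form (asset names and sizes invented, obviously):

    # "Console-style": park the whole level's assets in video memory up front.
    assets_mb = {"rock_diffuse": 21, "orc_armour": 21, "tower_normal": 21}
    vram_resident = dict(assets_mb)        # everything resident from the off

    # PC-conventional: drive -> RAM -> VRAM, uploaded only when needed.
    ram_cache, vram = {}, {}
    def need_texture(name):
        if name not in vram:                   # not resident yet
            ram_cache[name] = assets_mb[name]  # read from drive into RAM
            vram[name] = ram_cache[name]       # then upload RAM -> VRAM
        return vram[name]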

We know our GPUs cache data anyway, so I might see nearly 2GB of VRAM used in a particular area in, say, Skyrim after I've been playing for a while. But saving and reloading that very scene would see only 800MB in use, as 1.2GB of data had been cached "just in case" it was needed again.

Now, cached data can certainly increase the smoothness of a game, as no loading from RAM or drive is needed. So I do wonder if users with less than 6GB of VRAM who "force" the higher texture resolutions (if this is even possible) will see excessive stutter, for example, as texture data is swapped out by the GPU. Either way, hopefully things can be updated to be more PC-friendly. We shall see.
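The sort of swapping I mean, sketched as a little least-recently-used cache (budget and sizes made up):

    from collections import OrderedDict

    # When the working set outgrows the VRAM budget, something gets evicted
    # and must be re-uploaded on next use - that re-upload is the stutter.
    class VramCache:
        def __init__(self, budget_mb):
            self.budget, self.used, self.lru = budget_mb, 0, OrderedDict()
        def bind(self, tex, size_mb):
            if tex in self.lru:
                self.lru.move_to_end(tex)    # already resident: cheap bind
                return "hit"
            while self.used + size_mb > self.budget:
                victim, victim_size = self.lru.popitem(last=False)
                self.used -= victim_size     # evict least-recently-used
            self.lru[tex] = size_mb
            self.used += size_mb             # fresh upload: potential hitch
            return "miss"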

Final thought: I do wonder if, partly contrary to what I've said above, future console titles will be EPICALLY heavy on high-res textures? Think about it: good as they are, the current gen of consoles are still only entry-level gaming PCs CPU- and GPU-wise. What they DO have is this new memory architecture. This means it's "easy" to bump texture resolution up beyond what is currently seen on PCs, without being constrained by the lowest-common-denominator VRAM limits of the PC. Now, we know a powerful CPU is needed to prepare the detailed geometry in many modern games, because much of that part of the visuals is still CPU-bound. So, will future titles have less geometrically complex visuals (simpler 3D models) but use the higher texture-resolution potential to "hide" this? Basically catering for the "weaker" console CPU constraints, but the "ample" VRAM?

Oh, my next GPU's, needless to say, will have BIG VRAMs! lol.

Anyway, just my random pondering while I procrastinate prior to doing some actual work today lol.

Scoob.
 
Don't forget that just because it's 1080p doesn't mean they can't just put in fully high-res textures / original assets and stuff. While higher res does = more memory required... if you put in mental textures (which it sounds like Ultra is) then you're gonna need more VRAM.

That said, I've been mucking about on The Vanishing of Ethan Carter, and that looks like a friggin' photo/movie... an utterly, insanely beautiful game.
 
I have discussed how textures are managed, and the bandwidth bottleneck of RAM to VRAM, in another thread on here recently, but here are just a few points I want you to ponder (with a rough, made-up budget tallied after the list):

  • Is this title using a deferred renderer? If not, why does the client resolution dictate the texture sizes required?
  • What's to say that the game doesn't look amazing on High?
  • Does VRAM space refer specifically to just texture storage? What about large geometry buffers, or structured buffers used with DirectCompute?
  • How about indirect lighting quality for global illumination, using volume textures or volume-based rendering approaches? These are usually much more subtle.
  • Most PC rendering systems are built to be scalable to accommodate a variety of specifications; what's wrong with providing an option to scale this system to its highest capability, should someone have the capacity to run it?
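To put rough (entirely invented) numbers on that - none of these are from the game:

    # Hypothetical VRAM budget: "6GB" need not mean "6GB of textures".
    budget_mb = {
        "material textures":    3000,  # diffuse/normal/specular sets
        "geometry buffers":      600,  # vertex and index data
        "structured buffers":    300,  # e.g. DirectCompute scratch space
        "GI volume textures":    500,  # indirect lighting volumes
        "render targets":        700,  # G-buffer, shadow maps, etc.
        "driver/misc overhead":  400,
    }
    print(sum(budget_mb.values()), "MB")  # 5500 MB - 6GB territory already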
 
(FYI SPS works for a games company :P)

(just in case you hadn't noticed)

(oh and he does game engine writing wizard shit too)

(massive respect)
 
Now, we know a powerful CPU is needed to prepare the detailed geometry in many modern games, because much of that part of the visuals is still CPU-bound.
The data the CPU sends to the GPU is nowhere near the amount the latter ends up operating on.
And since when does tessellation, for example (a feature that can multiply polygon counts many times over), require THAT much CPU power?
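Rough amplification maths (factors plucked from thin air):

    # The CPU submits a few control patches; the GPU's tessellator expands
    # them. A quad patch at factor f yields roughly 2*f*f triangles.
    patches_from_cpu = 10000
    tess_factor = 16
    triangles_on_gpu = patches_from_cpu * 2 * tess_factor**2
    print(triangles_on_gpu)  # ~5.1 million triangles the CPU never touched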
 
Awesome to know that about you, SPS. I definitely don't understand any of that stuff, but it's hugely interesting, and as SuB says: "massive respect".
 
You know, I did actually wonder some of that. Possibly I/we are looking at things with too much (unintentional) bias because we don't fully understand what's going on here.

I've seen things rendered at much higher resolutions but only displayed at a more common 1080p - it may sound silly, but it really does make quite a difference to the display quality.

The key is, as you state, exactly what are they putting in the VRAM in this instance? We're used to the whole "big texture = big VRAM" thing, but there's doubtless much more to it.

For example: playing around with Skyrim using 8K textures (and enhanced model geometry) on THREE 1920x1080 screens in 3D, well, 2GB of VRAM seems to cope OK. Not saying more wouldn't help, though.

At the end of the day, if more VRAM - as in a large jump this gen - is where we're going, then so be it. It's a shame if it makes otherwise very powerful hardware "obsolete" (at least for extreme settings), but that's the way of things. Maybe in the future, VRAM matching system RAM in quantity will be the norm. I remember the days when I had a GPU which came with expansion slots for additional VRAM lol; can't see that happening again though :)

I think your final point is in many ways the most interesting. Any console title developer knows they have a fixed hardware base for maybe the next seven years, so they're going to leverage whatever they can from that to make the best-looking games possible. Equally, I imagine they'd resist doing too much re-work when porting to the (often) less profitable PC platform. However, if they ARE spending the time, then why not put something extra in for those with epic systems? I love it when games have settings that I know a single GPU wouldn't be able to handle on higher settings; however, now that it's VRAM rather than pure GPU grunt that's needed, I'm left out lol.

So, the key question really remains: is this a "lazy" port, with things coded to work in a "console" way, meaning the title is an unnecessary resource hog? OR is this the developer genuinely offering something more for high-end system owners?

Interesting times regardless, if this does indeed become the new trend.

Oh yeah, I want to add that I'm not an "everything must be MAXXXED" visuals whore; rather, I traditionally tune to get a steady 60fps - though I have been spoilt since first going SLI a few years ago, I admit :)

Scoob.
 
I've seen things rendered at much higher resolutions but only displayed at a more common 1080p - it may sound silly, but it really does make quite a difference to the display quality.

Are you referring to some form of supersampling? If we're talking about rendering to a larger buffer, then that has performance costs as well as memory costs.
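For a feel of the memory side alone (numbers illustrative):

    # Rendering at 2x per axis above a 1080p target means 4x the pixels.
    w, h, bytes_pp = 1920, 1080, 4                    # RGBA8 render target
    native_mb = w * h * bytes_pp / 2**20              # ~7.9 MB per target
    super_mb = (2 * w) * (2 * h) * bytes_pp / 2**20   # ~31.6 MB per target
    print(5 * native_mb, 5 * super_mb)  # 5-target G-buffer: ~40 vs ~158 MB
    # ...and every pixel is shaded 4x over, so the frame cost scales too.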

The key is, as you state, exactly what are they putting in the VRAM in this instance? We're used to the whole "big texture = big VRAM" thing, but there's doubtless much more to it.

I usually prefer to give developers the benefit of the doubt, as opposed to saying "they've just used large uncompressed textures because they can't be arsed for PC", because in my experience it's simply not true of every dev.

For example: playing around with Skyrim using 8K textures (and enhanced model geometry) on THREE 1920x1080 screens in 3D, well, 2GB of VRAM seems to cope OK. Not saying more wouldn't help, though.

Again, we don't know the internals of the engine, so we can't assume this data is not being streamed.

However, if they ARE spending the time, then why not put something extra in for those with epic systems? I love it when games have settings that I know a single GPU wouldn't be able to handle on higher settings; however, now that it's VRAM rather than pure GPU grunt that's needed, I'm left out lol.

Well, if the engine doesn't accommodate it, it's usually not very cost-effective for the number of people that do have those systems available. I honestly think this new VRAM thing is simply because it's very cheap to add a setting that loads in higher-res textures/buffers - it's easily scalable.
 
I was talking about render resolution being higher than display resolution for increased visual quality - I saw some of this demonstrated recently and was surprised by the difference. It needed a chunk more GPU power and VRAM, of course.

There are a few too many unknowns currently, so I'm going to wait and see how things turn out - it'll be interesting regardless.

These new "need 6gb VRAM for best visual quality" games might be game changers, or it may just be a case of 10% "better" visuals for 50% more VRAM. Like with AA every additional increase makes things better, but returns are diminishing and performance can tank.

Happy to wait and see how SoM reviews, and maybe how it performs/looks first-hand if I decide to buy.

Scoob.
 