I've discussed how textures are managed and the RAM-to-VRAM bandwidth bottleneck in another thread here recently, but here are a few points I'd like you to ponder:
- Is this title using a deferred renderer? If not, why does the client resolution dictate the texture sizes required?
- What's to say that the game doesn't look amazing on high?
- Does VRAM space refer specifically to texture storage? What about large geometry buffers, or structured buffers used with DirectCompute?
- What about indirect-lighting quality for global illumination using volume textures or volume-based rendering approaches? These effects are usually much more subtle.
- Most PC rendering systems are built to scale across a variety of hardware specifications. What's wrong with providing an option to push such a system to its highest capability for those with the capacity to run it?
You know, I did actually wonder some of that. Possibly I/we are looking at things with too much (unintentional) bias because we don't fully understand what's going on here.
I've seen things rendered at much higher resolutions but only displayed at a more common 1080p - it may sound silly, but it really does make quite a difference to the display quality.
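For anyone curious about the VRAM side of rendering above display resolution, here's a quick back-of-envelope sketch. The bytes-per-pixel figures and the single colour+depth pair are my own illustrative assumptions; real engines use more (and more varied) render targets, especially deferred ones:

```python
# Rough VRAM cost of one colour + depth render-target pair at various
# internal resolutions, all downsampled for display on a 1920x1080 screen.
# Assumes RGBA8 colour (4 bytes/px) plus a 4-byte depth/stencil buffer.

def render_target_mb(width, height, bytes_per_pixel=4 + 4):
    """Approximate size in MB of a colour + depth render-target pair."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for scale in (1, 2, 4):  # 1x = native 1080p; 2x, 4x = supersampled
    w, h = 1920 * scale, 1080 * scale
    print(f"{w}x{h}: ~{render_target_mb(w, h):.0f} MB")
```

So the render targets alone go from roughly 16 MB at native 1080p to around 253 MB at 4x supersampling - still small next to textures, but it adds up fast once an engine stacks several full-resolution buffers.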
The key is, as you state, exactly what they're putting in the VRAM in this instance. We're used to the whole "big texture = big VRAM" thing, but there's doubtless much more to it.
For example: playing around with Skyrim using 8K textures (and enhanced model geometry) on THREE 1920x1080 screens in 3D, 2GB of VRAM seems to cope OK. Not saying more wouldn't help, though.
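To put that 8K-texture example in perspective, here's a rough single-texture cost sketch. The formats and the one-third mipmap overhead are my assumptions, not measured figures - Skyrim texture mods typically ship block-compressed DDS files, which is a big part of why 2GB copes better than the raw numbers suggest:

```python
# Rough single-texture VRAM cost (illustrative assumptions, not measured).
# RGBA8 = 4 bytes/pixel uncompressed; BC3/DXT5 = 1 byte/pixel block-compressed.

def texture_mb(size, bytes_per_pixel, mipmaps=True):
    """Approximate VRAM footprint in MB of one square texture."""
    total = size * size * bytes_per_pixel
    if mipmaps:
        total *= 4 / 3  # a full mip chain adds roughly one third on top
    return total / (1024 ** 2)

print(f"8192x8192 RGBA8 + mips: ~{texture_mb(8192, 4):.0f} MB")  # ~341 MB
print(f"8192x8192 BC3   + mips: ~{texture_mb(8192, 1):.0f} MB")  # ~85 MB
```

That 4:1 difference is why compression format matters at least as much as raw resolution when budgeting VRAM.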
At the end of the day, if more VRAM - as in a large jump this generation - is where we're going, then so be it. It's a shame if it makes otherwise very powerful hardware "obsolete" (at least for extreme settings), but that's the way of things. Maybe in the future VRAM matching system RAM in quantity will be the norm. I remember the days when I had a GPU that came with expansion slots for additional VRAM lol - can't see that happening again though.
I think your final point is in many ways the most interesting. Any console title developer knows they have a fixed hardware base for maybe the next seven years, so they're going to leverage whatever they can from it to make the best-looking games possible. Equally, I imagine they'd resist doing too much re-work porting to the (often less profitable) PC platform. However, if they ARE spending the time, then why not put something extra in for those with epic systems? I love it when games have settings I know a single GPU wouldn't be able to handle - however, now that it's VRAM rather than pure GPU grunt that's needed, I'm left out lol.
So the key question remains: is this a "lazy" port, coded to work in a "console" way and therefore an unnecessary resource hog? OR is the developer genuinely offering something more for high-end system owners?
Interesting times regardless, if this does indeed become the new trend.
Oh yeah, I want to add that I'm not an "everything must be MAXXXED" visuals whore; rather, I traditionally tune for a steady 60fps - though I have been spoilt since first going SLI a few years ago, I admit.
Scoob.