Gigabyte product codenames confirmed 20GB RTX 3080 and 16GB RTX 3070 plans

What?!?

Will future games need that much RAM?!? Games today need this much RAM!!

Play The Division 2 and it'll eat up more than 8GB on my 5700 XT. That is, if people knew what settings are and that they go past the coveted standard Ultra settings.

I'm finally getting a 24GB graphics card! And it's going to be epic!

People. Stop making click bait and being tards!!
 
Will future games need that much RAM?!? Games today need this much RAM!!

Play The Division 2 and it'll eat up more than 8GB on my 5700 XT. That is, if people knew what settings are and that they go past the coveted standard Ultra settings.

I'm finally getting a 24GB graphics card! And it's going to be epic!

People. Stop making click bait and being tards!!

On what settings? Mine doesn't go above 6GB in Div 2.

Every sentence in your post seems like you are just trolling. How old are you?
 
I think some people are confusing "how much memory your graphics card claims to use" with how much graphics memory is actually being used to practically accelerate performance. Software tools cannot accurately tell you how much memory a game really requires at any given point. Empty RAM is pointless RAM, so most game engines will opportunistically fill any free VRAM with anything potentially useful in the future, and many engines can fill huge amounts of VRAM in a somewhat useful way, but that's not to say the game would tank without it. Essentially my point is just that software tools for reporting VRAM are inherently inaccurate and shouldn't be used to drive your GPU purchasing decisions (a rough sketch of what those tools can actually see is at the end of this post).

Of course, VRAM needs are still going to grow quickly for a while with the new consoles shifting the baseline, and the jump will be particularly large for those who want native 4K experiences (though on both consoles and PC, native rendering is likely to fall out of favour in place of ML upscaling approaches that produce, eventually, essentially visually indistinguishable "fake 4K"). But to say 20GB is required by any game today is, I think, an overstatement; most PC games will still have their max settings optimised for the 11GB RTX 2080 Ti, and with Ampere's compression gains that's probably more or less matched by the current 3080 in effective VRAM capacity.

Essentially, I don't think it's a particularly underhand move for Nvidia to release cards with the technology that's possible today (cards built around 1GB chips) rather than delaying the whole series until Micron had 2GB chips ready. There's definitely a market for both configurations, and the larger sizes are still going to be somewhat niche on PC (4K gaming seems to have fallen by the wayside in favour of 1440p at high refresh rates).
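Purely as an illustration of that first point, here's a minimal sketch of what a monitoring tool can actually query. It assumes the pynvml Python bindings for NVIDIA's NVML library; NVML reports device-level allocation, which is why the number it returns tracks whatever the engine has chosen to cache, not what the game strictly needs.

```python
# Minimal sketch, assuming the pynvml bindings for NVIDIA's NVML library
# are installed (pip install pynvml). NVML can only report how much VRAM
# is currently *allocated* on the device, not how much a game would need
# to avoid a performance cliff.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)       # values are in bytes

print(f"total VRAM:     {mem.total / 2**30:.1f} GiB")
print(f"allocated VRAM: {mem.used / 2**30:.1f} GiB")   # includes opportunistic caches
print(f"free VRAM:      {mem.free / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```

An engine that pre-caches assets will push the "allocated" figure towards the total whenever there's headroom, which is exactly why these readings overstate what a game actually requires.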
 
In addition, circumventing VRAM shortage is fairly trivial. Just drop texture quality - going from Ultra to High textures is barely noticeable in most titles. Though I'm not aware of any titles which would struggle with 10GB, even at 4K.
 
In addition, circumventing VRAM shortage is fairly trivial. Just drop texture quality - going from Ultra to High textures is barely noticeable in most titles. Though I'm not aware of any titles which would struggle with 10GB, even at 4K.

I know of three, which is nothing given the pool of titles we have that are demanding at 4K:

Resident Evil 2
Final Fantasy XV
Shadow of the Tomb Raider
 
On what settings? Mine doesn't go above 6GB in Div 2.

Every sentence in your post seems like you are just trolling. How old are you?

Yeah, I was thinking the same... especially their last sentence: were they talking about their own post? But I passed on saying anything. The person has posted similar, troll-looking posts before.
 
In addition, circumventing VRAM shortage is fairly trivial. Just drop texture quality - going from Ultra to High textures is barely noticeable in most titles. Though I'm not aware of any titles which would struggle with 10GB, even at 4K.

I agree with you. I do. But do you know how offensive "drop the texture quality" sounds to the PCMR?

I'm not daft, but I have to say that even though I don't agree with them, they do have a solid point. Why would you pay so much for a PC GPU if you can't max everything out?

That's the sole reason most people game on a PC. Ignoring esports, of course.
 
I agree with you. I do. But do you know how offensive "drop the texture quality" sounds to the PCMR?

I'm not daft, but I have to say that even though I don't agree with them, they do have a solid point. Why would you pay so much for a PC GPU if you can't max everything out?

That's the sole reason most people game on a PC. Ignoring esports, of course.

Graphical fidelity has always been third on my list. For me, frame smoothness will always be more enjoyable. The second reason is title availability: most of the titles I play are not available anywhere else.
 
Agreed with Warchild, with the addition of responsiveness, which is very much tied to smoothness.


Flexible input methods are also a big deal for me. I don't want to be restricted to a controller, nor do I want to be at developers' mercy if they make bone-headed decisions with control binds.


The "PCMR" who refuse to drop settings for decent frame rates are suckers.
 
Graphical fidelity has always been 3rd in my list. For me frame smoothness will always be more enjoyable. Second reason is the title availability. All titles I play mostly, are not available anywhere else.

Yeah, but we are all pretty much sensible adults here.

I can promise you that other forums I have been visiting don't see it from the same perspective. In fact, I got cussed out for proving my 2080 Ti was more than enough for anyone at 1440p, and got called salty and bitter.

Common sense does not apply to GPUs costing £1,300, it seems.

Whilst everyone was getting awfully excited, I rushed out and bought another 2080 Ti.
 
We could possibly see a 20GB variant before Micron's higher-capacity chips arrive. Nvidia and the AIBs already have 3090 PCBs developed; they just need to put a 3080 die on one and populate 20 of the 24 memory pads.
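As a rough back-of-the-envelope check (assuming the publicly known layout of 1GB GDDR6X packages on 32-bit channels, with the 3090 running two chips per channel in clamshell), the arithmetic works out like this:

```python
# Back-of-the-envelope memory-configuration arithmetic, assuming 1GB (8Gb)
# GDDR6X packages on 32-bit channels as on the reference 3080/3090 boards.
CHIP_CAPACITY_GB = 1     # per-package capacity of current GDDR6X
CHANNEL_WIDTH_BITS = 32  # bits per memory channel

def config(chips, chips_per_channel=1):
    capacity_gb = chips * CHIP_CAPACITY_GB
    bus_width_bits = (chips // chips_per_channel) * CHANNEL_WIDTH_BITS
    return capacity_gb, bus_width_bits

print(config(10))      # 3080 as launched: (10, 320)
print(config(24, 2))   # 3090, clamshell on both sides of the PCB: (24, 384)
print(config(20, 2))   # hypothetical 20GB 3080 on the 3090 board: (20, 320)
```

Populating 20 of the 24 pads in clamshell would keep the 3080's 320-bit bus while doubling its capacity, with no need to wait for 2GB chips.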
 
Whatever the approach, I doubt €200 will be enough for that extra 10GB - making those cards DOA in my opinion.
 
Yeah, but we are all pretty much sensible adults here.

I can promise you that other forums I have been visiting don't see it from the same perspective. In fact, I got cussed out for proving my 2080 Ti was more than enough for anyone at 1440p, and got called salty and bitter.

Common sense does not apply to GPUs costing £1,300, it seems.

Whilst everyone was getting awfully excited, I rushed out and bought another 2080 Ti.

But you are salty and bitter. You're British :harhar::rollinglaugh:
 
On what settings? Mine doesn't go above 6GB in Div 2.

Every sentence in your post seems like you are just trolling. How old are you?

The key here is different strokes for different folks. And I don't have to prove anything to you, GUY. I unfortunately do not have a picture of The Division 2, but I do have one of CryEngine's Neon Noir demo easily surpassing 8GB at 1080p. I do play with settings above Ultra, as there are additional settings to be had for reflections and shadows, if you must know.

And on my 2080 Ti system at 5120x1440, The Division 2 uses more than 10GB of VRAM, so yeah, these new GPUs are truly a joke without more than 12 or 16GB.

2080 Ti, The Division 2: https://i.imgur.com/Dc5XaIb.jpg

5700 XT, Neon Noir: https://i.imgur.com/lIiDDLx.jpg

Man, I'd love to be the "kid" with that setup... who's trolling whom?
 
Not sure where my original post went, but here's the short of it. I am glad your system does what it does. And your other messages explain why your system wouldn't use more than the amount of VRAM you mentioned: you do not max the settings out... wow, that was a tough one to figure out, but it is possible.

Here is Neon Noir at 1080p on the 5700 XT: https://i.imgur.com/lIiDDLx.jpg
The Division 2 on my 2080 Ti at 5120x1440: https://i.imgur.com/Dc5XaIb.jpg
HWiNFO: https://i.imgur.com/QjVEPQW.jpg

Be sure to zoom in and check the memory usage in Task Manager there, guy.

Unfortunately I do not have a picture of my 5700 XT running The Division 2, but I do run the game above Ultra settings, so that could help explain it as well...
 
Agreed with Warchild, with the addition of responsiveness, which is very much tied to smoothness.


Flexible input methods are also a big deal for me. I don't want to be restricted to a controller, nor do I want to be at developers' mercy if they make bone-headed decisions with control binds.


The "PCMR" who refuse to drop settings for decent frame rates are suckers.

I agree.

I consider myself a part of the "PCMR". Yet rarely have I been able to afford a system that could run everything at the highest settings, and I never really cared to. It's satisfying playing older games because you can do that and still hit 100 FPS. But even older games, with their one superfluous and unoptimised setting, crippled certain systems. I can think of a number of titles where the absolute highest settings gave 40-60 FPS, and one setting turned down a notch (with no discernible visual compromise) gave 90-100 FPS; GTA V, Splinter Cell and Assassin's Creed are some examples. I didn't sit there in anger because I got shafted by 'Ngreedia' or 'RTG-durrrp'. I just turned the setting down, played the bloody game and stopped taking myself so seriously.

The PCMR encompasses many different people. Modders, high-refresh-rate competitive gamers, streamers, hardware enthusiasts, AAA players: they're all part of the PCMR. I've played games at the lowest settings and loved every minute of it, even when I hated it. And I've played games at 20 FPS and loved every minute of it, even when I hated it. But nowadays, if I had to pick between 90 FPS with two settings turned down at a price of €500, 90 FPS with every setting maxed for €800, or 60 FPS with settings cranked for €500, I'd choose the first of the three. All three are "PCMR". The reason the PC is the "Master Race" is that you can choose, not that you have a zero-compromise system. It isn't just that one thing.
 
I agree.

I consider myself a part of the "PCMR". Yet rarely have I been able to afford a system that could run everything at the highest settings, and I never really cared to. It's satisfying playing older games because you can do that and still hit 100 FPS. But even older games, with their one superfluous and unoptimised setting, crippled certain systems. I can think of a number of titles where the absolute highest settings gave 40-60 FPS, and one setting turned down a notch (with no discernible visual compromise) gave 90-100 FPS; GTA V, Splinter Cell and Assassin's Creed are some examples. I didn't sit there in anger because I got shafted by 'Ngreedia' or 'RTG-durrrp'. I just turned the setting down, played the bloody game and stopped taking myself so seriously.

The PCMR encompasses many different people. Modders, high-refresh-rate competitive gamers, streamers, hardware enthusiasts, AAA players: they're all part of the PCMR. I've played games at the lowest settings and loved every minute of it, even when I hated it. And I've played games at 20 FPS and loved every minute of it, even when I hated it. But nowadays, if I had to pick between 90 FPS with two settings turned down at a price of €500, 90 FPS with every setting maxed for €800, or 60 FPS with settings cranked for €500, I'd choose the first of the three. All three are "PCMR". The reason the PC is the "Master Race" is that you can choose, not that you have a zero-compromise system. It isn't just that one thing.

Totally agree with you.
 