Micron calls GDDR7 memory capacity a "performance bottleneck" as Nvidia's RTX 50 SUPER series remains MIA

WYP

News Guru

Micron calls GPU memory a "performance bottleneck" after the Nvidia RTX 50 SUPER delay/cancellation.




Read more about Micron calling GPU memory a "performance bottleneck".
 
Memory manufacturer who makes immense profit from memory:

"More memory is better, so GPUs should have more memory. Buy more memory!"

:ROFLMAO:
 
TBH, I feel like this was written a long time ago as part of Nvidia's planned RTX 50 SUPER marketing push. Micron has been a very strong partner for Nvidia on the GeForce side; they have always been the fastest to get behind Nvidia's GDDR?X standards.

Sadly, the situation has changed. Micron is focusing its resources where the profits are. That means less GDDR7 production and no RTX 50 SUPER. That doesn't stop Micron from rubbing it in and telling us we need more memory when we can't have it.
 

Indeed, but memory can help in a lot of instances in terms of smoothness, since fewer assets need to be swapped out. I remember when the Star Wars Titan Xp came out with 12GB of memory (basically a full-fat 1080 Ti), and that single 1GB of extra memory made a pretty nice difference to certain games' smoothness. It wasn't immense, but you could "feel" it in-game.

Although Micron is going a little over the top by calling it a bottleneck. Sure, there are scenarios where 16GB isn't enough, but those are rare unless you run 4K maxed out with path tracing and so on.
 
Honestly, 8GB is still fine. I was still playing modern games at 1440p with 8GB; the 3070 Ti required me to lower texture sizes, but it's still very doable.

It's definitely not future-proof, but gamers who currently have those cards and can't upgrade should be fine. Then again, buying a card in 2026 with 8GB is just asinine.
 

You can make an 8GB GPU work, but it requires a lot of fiddling. Gamers will be fine if they know what they are doing. That said, higher VRAM GPUs give people a much easier time.
 

Yep. Going forward, with 3GB 30Gbps+ modules likely to become commonplace, 18GB should be the starting point IMO for the lower mid-range. 8GB needs to be phased out of everything but the lowest budget tier, as 3GB modules were only around $10 each for the average consumer as of 1-2 months ago. Even at a theoretical doubling of that price, 18GB would only cost $120, and vendors getting big bulk-order discounts would likely pay substantially less.
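The cost arithmetic in that post can be sketched out quickly. This is a rough illustration only: the ~$10/module consumer spot price and the "theoretical doubling" scenario come from the post above, and the six-module count simply follows from 18GB divided by 3GB per module.

```python
# Rough VRAM bill-of-materials estimate for a hypothetical 18GB card built
# from 3GB GDDR7 modules, using the prices quoted in the post above.
MODULE_CAPACITY_GB = 3
TARGET_CAPACITY_GB = 18

modules_needed = TARGET_CAPACITY_GB // MODULE_CAPACITY_GB  # 6 modules

for price_per_module in (10, 20):  # quoted spot price, and a theoretical doubling
    total = modules_needed * price_per_module
    print(f"{modules_needed} modules @ ${price_per_module} each = ${total}")
```

Even in the doubled-price case the memory itself comes to $120 at consumer pricing, which is the point being made: VRAM capacity is cheap relative to a mid-range card's sticker price.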
 
My rule of thumb is that you will have no issues if your GPU has as much RAM as the consoles of the time. 8GB GPUs had no problems for the entirety of the PS4 era, and 16GB GPUs will be fine for the full PS5/Series X generation, IMHO.

Yes, the consoles share that RAM between the CPU and GPU, but having that much on the GPU alone mitigates any inefficiencies the PC port may have, higher PC settings, and potential high-res texture packs, at least for the most part.

As we move further into this console generation, GPU memory capacity needs to increase. Otherwise, new GPUs will age poorly. 12GB needs to be the new minimum, with 16GB becoming mainstream.
 