WYP
It looks like 16GB RTX 3070 Ti GPUs are on the horizon.

Read more about Nvidia's planned RTX 3070 Ti 16GB models.

You missed the most important reason: miners love memory. Every single refreshed model has one and only one priority... mining. The 2060 12GB is hard proof of that.
And the claim that 4K gaming needs more memory... Pfff... the 3070 Ti is NOT a 4K-gaming-class card.
I mostly agree with you; that is probably the reason.
Or it's just an excuse to produce yet another card and reset the MSRP.
Though I need the memory not for gaming but for GPU rendering.
But I know I'm in a minority.
So rasterization is still very much a thing, and AMD beat Nvidia at every level with regard to VRAM. As we know now, that was a mistake on Nvidia's part, given that 8/10GB is not enough on a mid-high-end or high-end card.
What do you mean by this, really? How is 10GB not enough on a 3080, for example? Isn't it dependent on which resolution you play at?...
Nvidia dug their own grave on this. They advertise the 3080 for 4K gaming and production all over. 10GB is not enough for 4K gaming in some cases, and not enough for 4K+ video editing. Many renders will fail on a 3080 because it doesn't have enough memory. And most importantly, AMD's equivalents have 16GB.
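To see why resolution drives VRAM pressure, here's a minimal back-of-the-envelope sketch. The per-pixel byte counts below are illustrative assumptions (a hypothetical deferred renderer with an HDR colour target, depth, and a few G-buffer layers), not measurements from any real game; textures and geometry come on top of this.

```python
# Rough sketch: per-pixel render targets scale linearly with pixel count.
# All per-pixel costs below are assumptions for illustration only.

def framebuffer_mib(width, height, bytes_per_pixel):
    """Size of one full-resolution buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

# Hypothetical per-pixel costs: RGBA16F colour (8 B), depth (4 B),
# and three 4-byte G-buffer layers.
targets_bytes = [8, 4, 4, 4, 4]

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    total = sum(framebuffer_mib(w, h, b) for b in targets_bytes)
    print(f"{name}: ~{total:.0f} MiB just for render targets")
```

4K has 2.25x the pixels of 1440p, so every full-resolution buffer costs 2.25x as much; stack enough of them (plus higher-resolution textures) and a 10GB card runs out of headroom sooner than a 16GB one.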
Also, there is the secret MSRP that Nvidia doesn't disclose. They can grab a piece of that overpriced cake for themselves. You can bet the real MSRP is not far from the street prices of "new" GPUs.
And they want their marketing materials to look competitive when Intel launches their GPUs.
But seeing as 4K isn't that big yet, what about 1440p though? Aren't 10GB enough even by todays standards?
4K is big when Nvidia wants it to be big, i.e. when it suits them. As soon as the 980 Ti came out they started banging the war drum. However, it seems people do have a limit. That, or they went 4K, found out how hard it was to run and how much money they would have to keep spending every year just to keep their games running, and went back to 1440p (like I did). I went from a Fury X, to two Fury Xs (the less said about that the better), to a single Titan Maxwell, to a Titan XP, before realising it sucked and getting a 1440p monitor again.
Right now their war drum is clearly RT, DLSS and their "new" proprietary Gsync module. Again.
I.e. they will sell you any old s**t they can and think they did well. Only this sort of stuff usually never catches on; it silently goes away until they can use it again.
We've already been down the G-Sync module path. AMD came along with FreeSync and forced Nvidia to do their module-free version. Only now that AMD have gone quiet, it's back to modules again. Mostly, I would imagine, because RT and DLSS are not being raved about as much as they wanted. There goes your killer app; move on to something else.
One thing is clear about their behaviour: people may have been stupid, desperate, or needing to congratulate themselves so badly that they would pay current Ampere GPU prices, but it's also been made pretty damn clear that no one really cares about Nvidia's RT and DLSS. They just want faster cards that can run their games better, and rasterization is still more than enough for that.
And this is why I think Intel will do OK. Because it's already very clear that no matter how crap RTG has become they are still hanging in there and haven't gone away.
I'm a little confused why they just refreshed the 3080 with 12GB, only to bring out a 3070 Ti with 16?! Is it a cost thing, or why did they leave 4 extra GB on the table for the 3080?
It's tied to the memory bus width. The 3080 could theoretically have 16GB of VRAM, but it would then need a weird, possibly unmanufacturable memory bus width. For instance, a 256-bit bus gives 4, 8, or 16GB (a 6800 XT, for instance, could be sold as an 8GB card, but not as a 12GB card unless the 256-bit bus was chopped up). A 384-bit bus gives 6, 12, or 24GB. You can cut that up as the 3080 did, but it's not always practical, seemingly.
The more you know! Thank you!
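The arithmetic behind those capacity options can be sketched like this. It assumes each GDDR6/6X chip sits on its own 32-bit channel and comes in 1GB or 2GB densities (the 4GB option on a 256-bit bus came from older, smaller chips; clamshell mode, which pairs two chips per channel, is ignored here for simplicity):

```python
# Capacity options fall out of bus width: (bus_width / 32) chips x chip size.
# Assumes one memory chip per 32-bit channel and 1GB or 2GB chip densities.

def capacity_options_gb(bus_width_bits, chip_sizes_gb=(1, 2)):
    channels = bus_width_bits // 32   # one chip per 32-bit channel
    return [channels * size for size in chip_sizes_gb]

print(capacity_options_gb(256))  # 256-bit: 8 or 16GB (e.g. 3070 Ti, 6800 XT)
print(capacity_options_gb(320))  # 320-bit: 10 or 20GB (the original 3080)
print(capacity_options_gb(384))  # 384-bit: 12 or 24GB (3080 12GB, 3090)
```

This is why a 16GB 3080 is awkward: its 320-bit bus naturally yields 10 or 20GB, so hitting 16GB would mean a non-standard bus or mixed chip densities.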
You need a Quadro for rendering, or whatever they are called now.