Early Nvidia RTX 40 series Specifications Leak - Expect lots of VRAM!

Man, I'm so torn on this generation... I mean, I need a new GPU now, so I'll finally get my first chance to buy an RTX 30 card and will definitely skip RTX 4000, but it looks to be both the best NVIDIA generation in the last 10 years and, at the same time, the worst...

These cards will probably be extremely unreliable, with cold solder joint issues from the extreme heat-up and cool-down cycles given how much power is being pulled through those circuits... At the same time I fully expect cards bending, crashing, and probably catching fire again from trying to deal with this amount of power. Not to mention the electricity bills, and exploding PSUs paired with a card like this...

I mean, you can easily end up with a 1000W computer if you put a 600W GPU in it... That's like a small sandwich grill, but instead of using it for 10 minutes and turning it off, you'll run it for hours and hours. NVIDIA will need to fix this power demand next generation or I'll just have to ditch GPUs and play all my games from the cloud.
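Just to put a rough number on the electricity bill point, here's a quick back-of-envelope; the system draw, daily hours and price per kWh below are purely assumptions for illustration:

```python
# Rough electricity-cost estimate for a high-power gaming PC.
# All figures are assumptions for illustration, not measurements.

system_draw_w = 1000   # assumed whole-system draw while gaming (W)
hours_per_day = 4      # assumed daily gaming time
price_per_kwh = 0.30   # assumed electricity price, e.g. ~£0.30/kWh

kwh_per_day = system_draw_w / 1000 * hours_per_day
cost_per_month = kwh_per_day * price_per_kwh * 30

print(f"{kwh_per_day:.1f} kWh/day, roughly {cost_per_month:.2f} per month")
# 4 kWh/day -> about 36 per month at these assumed numbers.
```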

Yet... it seems this is the first generation, period, where NVIDIA isn't skimping on VRAM... And that's great; mid-range and especially lower-end NVIDIA GPUs always tend to lose relevance a few years down the line, mostly because they don't have enough VRAM for newer games even though they still have the power to run them... I still feel so bad for all the people who bought a GTX 970 or a 3GB 1060; a 3GB 1060 can end up less capable in some newer games than a 4GB 1050 Ti just because it runs out of VRAM... Not to mention those who bought a 1GB 750 Ti, poor people. And even though high-end NVIDIA cards usually lose driver support before they run into VRAM issues, they would still play modern games way better with more VRAM, even without proper driver support.
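And if you ever want to see how close a game actually gets to your card's VRAM limit, something like this should do it on an NVIDIA card; it just shells out to nvidia-smi, so it assumes the NVIDIA driver is installed:

```python
# Quick check of current VRAM usage on an NVIDIA card.
# Assumes the NVIDIA driver (and therefore nvidia-smi) is available.
import subprocess

result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,memory.used,memory.total",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
# e.g. "GeForce GTX 1060 3GB, 2843 MiB, 3072 MiB" while a game is running
```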
 
**Value Simulator 2.0**
/User_Alien_ALX
/Password **********
/OK
/10 calculate "Nvidia"+"Value"

**Your computer has crashed. Please try inserting more than 3TB of RAM for this calculation, as the memory overflowed when trying to compute this equation**
 
The "lots of VRAM" in the title made me expect more. It's more like NVIDIA will match AMD's current gen RAM.
 
Lots of VRAM... their justification to raise the bar even further on the already outrageous pricing.

Nvidia always have a pitch, i.e. unless you go top end they screw you and make you come back for more next round.

If Ampere had had 12/16GB, like on the 3070 and so on, no one would bother upgrading. I bet they timed it perfectly with new console *ports* coming out that will need more.

That's why the £1000 3080 12GB exists.
 
It's a better amount, but I'm expecting that while it's enough, it'll be surpassed by the other two brands. Going to make for an interesting end to the year to see who lands where.

Games aren't pushing it all that much just yet, but now that loads of UE5 games are on their way it'll be needed. Still, it's better than the 3GB and lower cards of a few gens back.
 
It will change fast as soon as we get the first true next-gen console games. It usually takes about a year after a console's release before we start seeing them, as the dev cycle is usually about 3 years behind. So they will slowly start increasing the specs needed on PC.

Which usually means you look at what the console has to play with and then add at least 40%, due to PCs being different and less efficient.

It won't bode well for 8GB users. That could soon become the new minimum, given we blew past 4GB quite a while ago (before the new consoles even launched).
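To put some very rough numbers on that +40% rule: the per-console figures below are approximate public numbers from memory, and the idea that roughly half of the total ends up as VRAM pressure is just an assumption for the sake of the sketch.

```python
# Back-of-envelope: console memory budget -> rough PC expectation.
# Console figures are approximate public numbers; the 40% factor is the
# rule of thumb from the post, not a measurement.

consoles = {
    "PS5":           12.5,  # GB usable by games out of 16GB (approx.)
    "Xbox Series X": 13.5,  # GB usable by games out of 16GB (approx.)
}

pc_overhead = 1.40  # "add at least 40%" rule of thumb

for name, usable_gb in consoles.items():
    pc_total = usable_gb * pc_overhead
    print(f"{name}: ~{usable_gb}GB for games -> ~{pc_total:.0f}GB "
          f"of combined VRAM+RAM pressure on PC")
# Even if only roughly half of that ~17-19GB lands in VRAM, 8GB cards look tight.
```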

They rely on high end sales by creating doubt at every other price point. I have talked about the psychology of this before. It's all mind games.
 
Tbh I've not seen anything truly next-gen yet; it all looks more like the best of the last gen winding down. Sure, specs are going to rise, and sharply, and I feel people with lower-end cards, even current-gen ones, will get left behind somewhat. I'm going to try to skip a gen or two and see how far I can make this last, but I just hope I won't need another CPU anytime soon.

Hopefully once I get the BIOS I need I should see some big jumps in performance. I checked on 3DMark and I should get 11k+ on Port Royal, but with my card maybe I can eke out 12k. Kind of expecting higher frequencies as well, not just more CPU grunt; we'll see. I'll update my other post on the card prize page when I get my grubby mitts on the BIOS. I'm not alone in waiting, but I'm just glad I got a chip before they vanish, and at a good price :)

UE5 is going to be the next big thing; so many console games use it, and maybe we'll see some new MMOs use it as well, as they normally do. It'll be nice to have some fresh eye candy for all this new hardware to get a workout with :D
 
I just hope I won't need another CPU anytime soon.

I would advise you to expect a need to upgrade your CPU. I've been experimenting with UE5 to a very small degree, but I have definitely been participating in UE5 developer communities, and the thing likes CPUs... You know that city demo, from The Matrix Resurrections? It runs CPU-limited at about 40fps with a 3090 and a 10900K; apparently the card itself isn't even breaking a sweat, but the CPU can't handle it.
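Just to translate that CPU-limited 40fps into frame-time terms, since that's how engine people tend to think about it; this is plain arithmetic, nothing UE5-specific, and the target framerates are just examples:

```python
# Frame-time view of a CPU-limited scenario.
# 40fps is the figure mentioned for the city demo; the targets are just examples.

def frame_time_ms(fps: float) -> float:
    """Milliseconds available to prepare one frame at a given framerate."""
    return 1000.0 / fps

current_fps = 40
for target_fps in (60, 120):
    speedup = frame_time_ms(current_fps) / frame_time_ms(target_fps)
    print(f"{current_fps}fps = {frame_time_ms(current_fps):.1f}ms/frame; "
          f"{target_fps}fps ({frame_time_ms(target_fps):.1f}ms/frame) "
          f"needs ~{speedup:.1f}x less CPU time per frame")
# 40fps is 25ms per frame: 60fps needs ~1.5x faster CPU-side work, 120fps ~3x.
```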

But really, only time will tell. Microsoft does have an answer for CPU bottlenecks with some newer DX12 features and DirectStorage. Though I really expect developers to only adopt DirectStorage quite a few years into the future, so as not to limit their audience to Win11 gamers exclusively, or to take some kind of hybrid approach where the games require way more CPU power on W10.
 
Well, if I can get a few gens out of the 5800X I picked up I'll be happy, at least until DDR5 prices come down; that's the only reason I didn't wait for AM5.

I heard that moving forward some of it would mean more RAM is required to keep up, meaning 16GB would become the bare minimum. I have 32GB of RAM atm.

I just got the 5800X as I'll gain a bump when they put out the 1206c/1207 BIOS for X370, which makes the board last a little longer. The 1700 has done me well, but since I won the 6800 XT it's really held the card back; even though it performs well enough to play anything, I've been missing out on full performance. So when they said X370 would get a 5000-series BIOS I was all over it like a dirty rash; it's been driving me nuts for weeks :D

UE5 is going to change a lot of things, though maybe a year before it's a fully known quantity. But I'm old, so PCs are a never-ending upgrade tbh.
 
Apparently Star Wars Jedi Fallen Order 2 will be the first game out next year that is solely PS5/XBSX/PC, no more cross-gen, as what they want to do the last-gen consoles simply cannot handle. It will also free up a lot of dev time, so they won't have to continually downgrade things to work on old hardware, or listen to people with base PS4s/XB1s complain about why a game from 2023 doesn't run perfectly on a nearly 10-year-old console that was severely underpowered out of the gate.
 
That generation went on way too long. The only issue now is that the new gen could end up being just as long, or maybe they can bring out newer versions and somehow scale the performance better, so that the PS5/XBSX are still able to play the new games but at lower fps or res while things still move forward. Though other than Stalker 2 and, erm, Atomic Heart I think, the BioShock-style game, not much comes to mind atm. Need E3-era news to know what's coming, but it all seems like end of next year atm.
 
Now is when all of the crap comes out on consoles, as in, new-gen consoles. Even the new-gen games will be awful compared to what they knock out 3 years from now. It's always the same with new consoles: the early games are terrible compared to what they put out once they start using the actual features it added over the last gen.

I'm getting an Xbox later in the year, mainly for that reason. There's no point buying one just to run Fallout 4 faster. I want the next-gen titles before I commit.

As for PCs? They can forget it. I really am done. If things get really cheap I might do small upgrades, but no more than £500. That is about what I spent on the 12700K rig and I am very happy with it.

No console is ever around for too long. Ever. The longer it exists, the better it is for everyone. Firstly, you don't have to continually keep buying expensive hardware, and secondly, as I pointed out, the older it gets the better the games on it become. Continually changing would just end us up with a load of crappy, buggy games that look terrible.

I have no hope that consoles will become like GPUs, binned after two years and replaced. It just gives devs an excuse not to optimise anything properly. Or, in other words, it would turn into PC gaming. The one thing we all hate is buggy games that run like doggy toffees. Well, that is what you would get.

Thankfully, making a console is *far* more involved than Nvidia pumping out GPUs left, right and centre. Or we would be headed for a complete nightmare.
 
Well, I want things to move forward faster tbh. At the end of the day tech will reach a natural limit at some stage, and even multi-chip can't change that, as at some point the cards would just become stupidly massive and power-hungry just to keep it all fed. So in my view, push forward, get to the natural limit and be happy; it'll be the longest generation ever at that stage, because we'd have to find some exotic off-world material to go any further, if that makes sense.

The thing is, PC is becoming more and more the main platform; even Capcom say it's their main one now, Sega even more so, and others too, and we're getting console games way more often since consoles are basically small PCs now. So there's less reason for the consoles to hold the PC back; in my view the PC could be returning to a golden era, not quite like the past, but more than in a long time.

The only real issue is that prices atm are, at least in the short term, killing it for loads of people; me somewhat, and also you and others, as you're less inclined to invest in new gear atm.

I just feel the sooner we reach the end game of hardware the better, then all the devs can focus on making the best games. Sure, they want to milk DXR and other tech, but nanometres only go so far; there are about 100,000 nanometres across a human hair, and the scale they are already working at on 4nm is mind-blowing to even try to understand, so at some stage we'll hit the brick wall and be stuck completely.

Of the three console makers, Microsoft is the smartest, as they are already doing what I expect will end up happening everywhere, which is subscription-based releasing. In the end, if it all goes how they would like, you'll have to have a sub to play anything, and it'll be the same for PCs; in the end they won't sell you the hardware but rent it to you. They will need to do this once tech has hit that wall. Yep, I don't like it either; it's just how I see it evolving over the coming years.
 
I ran that Matrix city demo and it ran like crap, and didn't look much better.

That said, I understand it's literally at the alpha stage and will get much better. I also understand that a PC is not going to be the ideal way to run it either. It never is.
 
I don't think Nvidia is going to release a £2000-3000 GPU that sags and bursts into flames. If they do, they'll lose vast quantities of money. Also, it's only the top-tier card that will draw that much power; the cards that are actually affordable for 'normies' will have 'normal' TDPs. If we saw 600W on a £600 GPU, then I'd be worried. You don't buy a 1500W PSU for £100, because you know the componentry could not handle the load; they're all £300+ out of necessity. If these rumours are true and the 4090 Ti, or whatever it's going to be called, pulls 600W, it'll cost £2000-3000 and have military-spec components or something like that to make it functional. The 3090 Ti already proves that companies are capable of it.
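For anyone wondering what a 600W card would actually mean for PSU sizing, a rough estimate looks something like this; the CPU and rest-of-system figures and the 30% headroom are assumptions, not measurements:

```python
# Rough PSU sizing for a hypothetical 600W-GPU build.
# All component figures and the headroom factor are assumptions for illustration.

gpu_w = 600     # rumoured top-end GPU draw
cpu_w = 250     # assumed high-end CPU under load
rest_w = 100    # assumed fans, drives, RAM, motherboard, peripherals
headroom = 1.3  # ~30% margin for transient spikes / efficiency sweet spot

total_draw = gpu_w + cpu_w + rest_w
recommended_psu = total_draw * headroom

print(f"Estimated sustained draw: ~{total_draw}W")
print(f"Suggested PSU rating:     ~{recommended_psu:.0f}W")
# ~950W sustained points at a 1200-1300W class unit, which is exactly the
# price territory where cheap PSUs don't exist.
```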
 