Nvidia's reportedly considering two different specifications for its RTX 4070

Is this their way of making the cut down version at the previous full price, and then the higher end for a premium when in reality the high end is nothing more than the standard version anyway?
 
Is this their way of making the cut down version at the previous full price, and then the higher end for a premium when in reality the high end is nothing more than the standard version anyway?

Nah, in other words they are waiting to see what AMD will come with :D

AMD turn up? 12GB. AMD don't? They ration you to 10GB.

Won't be a huge issue. Yet. Once the consoles start to reach the 8GB or whatever it is they have, and the PC pushes way past that, it will be a problem.
 


IMO anything in the xx70 range and up should not come with anything less than 16GB of memory. If the 4070 is on the level of a 3080-3090 then it will be 4K capable, and games at 4K are starting to eat VRAM up pretty quickly.
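To put some rough numbers on "4K eats VRAM", here's a back-of-envelope sketch. The render-target count and the 6 GiB texture budget are assumptions picked for illustration, not figures from any particular game:

```python
# Back-of-envelope VRAM estimate for 4K rendering (all figures are
# illustrative assumptions, not measurements from any specific game).
WIDTH, HEIGHT = 3840, 2160
BYTES_PER_PIXEL = 4          # RGBA8 render target

def buffer_mib(width, height, bpp):
    """Size of one screen-sized buffer in MiB."""
    return width * height * bpp / 2**20

# A modern renderer keeps several screen-sized targets around:
# back buffer, depth, and a handful of G-buffer/post-process targets.
NUM_TARGETS = 8              # assumed
targets = NUM_TARGETS * buffer_mib(WIDTH, HEIGHT, BYTES_PER_PIXEL)

# The texture pool dominates: assume a 6 GiB streaming budget at ultra.
TEXTURE_POOL_GIB = 6         # assumed
total_gib = targets / 1024 + TEXTURE_POOL_GIB

print(f"one 4K RGBA8 target: {buffer_mib(WIDTH, HEIGHT, BYTES_PER_PIXEL):.1f} MiB")
print(f"{NUM_TARGETS} targets + {TEXTURE_POOL_GIB} GiB textures: {total_gib:.1f} GiB")
```

Even with those conservative assumptions you're already past 6 GiB before the driver, OS, and any second app take their share, which is why 8GB starts looking tight at 4K.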
 

Yeah, but that isn't how Nvidia work, dude. They don't want to give you a card that will last too long without getting you to buy the higher end card.

3070 was less than half of the price of a 2080Ti. And arguably as good (I argue it isn't, but there is no denying it was indeed much cheaper, if you could ever get one from Nvidia without paying out the anus for a plastic fantastic card from a board partner).

It is pretty obvious where Nvidia made the cuts.

I know plenty of people would want to argue with me over this, but ask yourself this: we went from a 1070 and 1080 with 8GB, to a 2070 and 2080 with 8GB, to a 3070 with 8GB.

Now the first ones? Maybe that was a touch of overkill. At the time, however, they needed a way to convince you to spend what both of those cut down mid-tier cards cost. Because as good as they were, there was no denying *what* they were. The 1080 and 1070 were both cut down, small mid-tier dies, soon surpassed by the 1080Ti.

So OK, was 8GB enough on the 20 series? No. By then issues were starting to occur. If you look here at what happens to the 2080 vs the 1080Ti, you can see the issue.

[benchmark image: 2080 vs 1080Ti]


Look at how the 1080Ti gets pasted by the 2080, right up until you get to 4K. The cause? The 2080 doesn't have enough VRAM for Ultra Nightmare settings at 4K. Now, if it were a stupid test I could say yeah, why are you bothering to run 4K on that game with a card that isn't capable? But both cards are clearly capable of running that game at 4K Ultra Nightmare with more than acceptable framerates; the 2080's lead has fallen off a cliff because it is now pulling textures from system RAM and the paging file on your HDD.
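That cliff is easy to model: on-card memory bandwidth is measured in hundreds of GB/s, PCIe in tens, so even a small spill over VRAM dominates the frame time. A toy sketch, where all the bandwidth and per-frame traffic numbers are rough assumptions rather than measurements:

```python
# Toy model of the VRAM-overflow cliff: once the working set exceeds
# VRAM, the overflow is served over PCIe at a fraction of the speed.
# All bandwidth/traffic figures below are rough assumptions.
VRAM_GB = 8                   # card's memory (2080-class)
VRAM_BW = 448                 # GB/s, on-card memory bandwidth
PCIE_BW = 16                  # GB/s, PCIe 3.0 x16, one direction

def frame_time_ms(working_set_gb, traffic_gb_per_frame=2.0):
    """Estimated frame time if per-frame memory traffic is split
    proportionally between VRAM and PCIe (assumed traffic figure)."""
    in_vram = min(working_set_gb, VRAM_GB) / working_set_gb
    spilled = 1.0 - in_vram
    seconds = traffic_gb_per_frame * (in_vram / VRAM_BW + spilled / PCIE_BW)
    return seconds * 1000

for ws in (6, 8, 10):
    print(f"{ws} GB working set -> {frame_time_ms(ws):.1f} ms/frame")
```

In this toy model a working set that fits (6 or 8 GB) stays under 5 ms a frame, while spilling just 2 GB past VRAM balloons it to nearly 30 ms. The shape matches what the chart shows: fine at lower resolutions, cratering the moment 4K textures overflow the card.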

When I posted that the other day, the guy accused me of lying and said it must have been for another reason. Thing is, I watched that whole video, and Doom Eternal shows you how much VRAM the card is going to use before you even try it. The only reason they didn't lower the settings? Well, that would have been cheating.

So, it will come as no surprise that I am still very firmly in the "8GB is not enough" category (because the 4070 will smash 4K gaming with all current games TBH), but at the same time I know what Nvidia are like, spoon-feeding you crumbs, so I won't hold my breath.

It also seems I massively got my wires crossed on Navi 3, and it could well be AMD's Maxwell moment. Not only that, but they have found a way to reduce the die sizes massively with the same density, so when it comes to cost Navi 3 should be pretty cheap in comparison.

And will RT performance really matter by then? No, no I don't think it will.
 