Yes, Nvidia's RTX 4070 is still available for MSRP

After watching Hardware Unboxed's video on VRAM usage at 4K, my view of this card shifted from "this is a bad card" to "this is a pathetic card". I had only watched the reviews and skipped the specs, assuming it was a 16GB card; now I know that even the 4070 Ti is a 12GB card...

The statement "this is a solid card" gives NVIDIA too much credit, I feel. This is NOT a solid product or a good card; it's just less bad than most of NVIDIA's other cards.

People buying GPUs should still be buying used. I'd buy used if I were in the market. I'm far more inclined to buy a used 3080 12GB from a miner than a new 4070 that has the same performance but costs much more, while being no better at handling newer games that demand ever-increasing amounts of VRAM.

And keep in mind I'm ignoring the fact that memory bandwidth is considerably lower on the 4070 than on the 3080; I'd expect more issues with 1% lows and/or textures failing to load on the 4070 in demanding games.
 
It's a solid card for the majority of gamers not gaming at 4K.

Regardless, most monitoring software shows the allocation requests from the GPU driver; the actual amount used is very often not the reported quantity, whatever software you use. This scales with memory capacity.

If a game on a 12GB card requests all 12GB, you'll see that reflected, but for all you know it's only actively using 6GB.

I'm not saying NO games use all the available VRAM, but most people these days blow it out of proportion.
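The allocation-versus-use distinction above can be sketched as a toy model (all names hypothetical, not a real driver API): monitoring overlays typically report what the driver has granted, while the working set a frame actually touches can be much smaller.

```python
# Toy model of VRAM reporting: 'allocated' stands in for the figure
# monitoring overlays show; 'resident' stands in for the working set
# a frame actually touches. Purely illustrative numbers.
class VramPool:
    def __init__(self, capacity_gb):
        self.capacity = capacity_gb
        self.allocated = 0.0   # granted to the game; the reported figure
        self.resident = 0.0    # actively used per frame

    def request(self, ask_gb, actively_used_gb):
        # Engines often request as much as the card offers, caching
        # assets speculatively; only part of that is touched per frame.
        grant = min(ask_gb, self.capacity - self.allocated)
        self.allocated += grant
        self.resident += min(actively_used_gb, grant)
        return grant

pool = VramPool(12.0)
pool.request(12.0, 6.0)    # the game asks for everything it can get
print(pool.allocated)      # 12.0 -> what the monitoring overlay reports
print(pool.resident)       # 6.0  -> what is actually in active use
```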
 
The majority of gamers are not spending £600 on a graphics card.
 
Well, first off, this is a $600 GPU! I did not spend this much on my GPU and I am playing at 4K. Your argument that most gamers aren't playing at 4K goes both ways: most are also not spending this much on a GPU. And if you are spending this much on a GPU, then even if you're playing at 1440p you're probably considering upgrading to 4K at some point; it would be nice to have a GPU capable of that.

Secondly, yes, they ARE using over 11GB of VRAM, and one of them almost 16GB. That's the literal reason Hogwarts Legacy performed badly for many gamers at launch: the stuttering was caused by the game loading textures. Only after an update made the game load low-res textures first, then try to load the high-res versions over the following seconds, did the stuttering disappear. (It also has to do with memory bandwidth: cards with faster GDDR memory or a wider memory bus had fewer issues.) Other recent games with real performance issues caused by a lack of VRAM and/or low memory bandwidth include RE4 and The Last of Us Part 1.
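The low-res-first fix described above is essentially mip-fallback texture streaming. A minimal sketch of the idea (hypothetical, not the game's actual code):

```python
# Sketch of mip-fallback streaming: sample a low-res mip immediately
# and swap in the high-res texture once background streaming finishes,
# instead of stalling the frame (the stall is what causes the stutter).
class StreamedTexture:
    def __init__(self, name):
        self.name = name
        self.high_ready = False   # high-res mip not streamed in yet

    def stream_tick(self):
        # one unit of background streaming completes the high-res mip
        self.high_ready = True

    def sample(self):
        # never block the frame: fall back to the low-res mip
        return "high" if self.high_ready else "low"

tex = StreamedTexture("castle_wall")
print(tex.sample())   # "low": early frames look blurry but don't stutter
tex.stream_tick()
print(tex.sample())   # "high": swapped in once streaming completes
```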

I can even tell you why this is happening: devs are building the memory management systems in their games to take full advantage of the large unified RAM/VRAM capacity of consoles, and when porting to PC they fail to account for the many gamers on low-VRAM hardware, and for the fact that VRAM is actually separate from RAM on PC. So they ship the game expecting 16GB and you only have 8GB... Since they didn't build the management system to handle lower amounts of VRAM, instead of rendering low-res textures the game stutters while loading. Then, after the launch backlash, they go back and rework the entire memory system.

Will this get better over time? I wouldn't bet on it. Devs will simply state that you need a 12GB GPU for 1440p, and if you don't have one, that's just sad for you. Of course you can always lower texture quality, but then again, US$600 on a GPU to play at lower graphical settings?

Also, look at the size of the 4070 die: it's WAY smaller than a 2060's, and its power consumption is rather low for a 70-class card in 2023. This is literally the chip that would have been a 60-class card in any other generation, but they're selling it as 70-class at an 80 Ti-class price, and you're still defending them? Oh, silicon prices are higher? Yes, they're maybe 30% higher, while NVIDIA is charging 400% of the price per square millimetre of die area!
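As a rough sanity check on the price-per-area point, here is a back-of-the-envelope calculation using commonly cited die sizes and launch MSRPs (these figures are assumptions taken from public spec listings, not measurements):

```python
# Price per square millimetre of GPU die at launch MSRP.
# Assumed figures: RTX 4070 uses AD104 (~295 mm^2, $599 launch);
# RTX 2060 uses TU106 (~445 mm^2, $349 launch).
die_mm2  = {"RTX 4070": 295.0, "RTX 2060": 445.0}
msrp_usd = {"RTX 4070": 599.0, "RTX 2060": 349.0}

for card in die_mm2:
    usd_per_mm2 = msrp_usd[card] / die_mm2[card]
    print(f"{card}: {usd_per_mm2:.2f} USD/mm^2")
# ~2.03 USD/mm^2 for the 4070 vs ~0.78 for the 2060, i.e. roughly
# 2.6x more per unit of die area at the card level (wafer cost per
# mm^2 on newer nodes is a separate question).
```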
 
I think you may have misunderstood NBD. I don't think he's defending Nvidia; I think he's suggesting that some are blowing this a bit out of proportion. Yes, it's a problem. Yes, Nvidia is a greedy, self-centred company whose only goal is to milk as much from consumers as possible. But the 4070 is not the worst card they have released. It's bad, but not the worst. I agree that this should have been the 4060, should have cost $350-500 depending on the variant, and should have had 10GB or more of VRAM if possible. If the bus width had to remain the same, a 16GB variant should have been made available for an extra $50 for those who need it. But I've honestly seen worse from Nvidia and AMD before. That's not defending them; that's being realistic.
 
Well, he is being realistic in a way. Hardly anyone spends this much on a GPU, and if they do, they don't use it on Steam.

It's not selling very well. I don't know if I'm surprised by this news or not. All the models OCUK has in, even the sub-£600 ones, are still in stock.

One thing I have never got my head around is the maths here. When PC gaming is not popular, it costs half the price. I've never understood that, as it means fewer sales at lower prices *scratches head*, yet when it's more popular and sells in higher volumes, the prices go up *scratches head again*. It's the only industry where that happens. When something new, like LED lighting, is invented, it's expensive at first; yet once it gets accepted and adopted, competition rises and prices fall, to the point where anyone can have it. It's the same with TVs. I bought one of the first 4K 65" TVs to hit the market. It was literally just a monitor and only had one 4K input; no tuner, nothing. It cost me £1,399.

Last year I bought a 75” Ambilight 4k with Atmos, Dolby vision, all 4k android and gawd knows what else for £750. 8 years later.

Problem they face now? They had a very peed market. Covid was a thing (it still is; I had a dose of it myself recently), but people were in lockdown, so material crap was a real thing. It isn't now. Younger people learned a valid lesson during lockdown, and that was: don't sit the F at home on a computer. Get out there.

As such, these things are waaay less in demand and thus way less valuable. Nvidia and AMD need to realise this now. And they will, trust me.
 
The majority of gamers are not spending £600 on a graphics card.

In addition to what AngryGoldFish said, this only further proves my point. Very few gamers are spending this much money on a card, just as few are playing modern AAA titles at 4K.

If you play at 4K you need more power anyway. If you don't, you're probably in the majority still at 1080p, and this card will do fine, but then you probably aren't spending this much money anyway. See my point? For the majority it's a good card for what it offers; that doesn't mean its pricing is good or its placement in the lineup is smart.
 
I can agree with all the viewpoints on this, but I'll add to it by saying that the crypto market means Nvidia simply wants to charge you more to offset the cost: they sell fewer cards but make more money over the longer term. They aren't going to change it, imho. Even if AMD and Intel started taking market share, Nvidia wouldn't care; they'd reduce the number of cards they make and re-aim it all at the AI sector, which is where they make more money.

It's why they make statements like "crypto is bad for the world and doesn't add anything useful", or about how much they care about gamers. I do not believe it. They cut the bus width heavily to save on silicon and sell you a cut-down version of what you should have had.

Alien and I butt heads a lot at times, but he's right about younger people wanting to get out of the house. As much as I think Covid was a load of orchestrated BS, people sure want out of the house this time of year.

I can say this much: expect the 5000 series to cost the same or more, and you'll never see a £500 xx70-series card again. Even if it's a better option than most of the stack, it's not what they could have made.

At some point you'll get more VRAM on their cards, but you won't get a wider bus. They simply want to make cheap, smaller dies, outside of their halo products, which they want bank for. It's simply how they've decided to go about things.

The issue is that they lead the market, and as long as people keep following them, it won't change. I do feel Intel and AMD will take market share in time, and the thing is, Nvidia will not care at all. They are so heavily into all the things I hate (cloud gaming, AI and the metaverse) that as far as I'm concerned they can keep it, while the rest of the mass market is begging for a return to sane prices and cheaper Nvidia cards.

In the end, those that want a GPU (and an Nvidia one) will buy one in time, and Nvidia knows it, so they do not care at all.
 
One thing I have never got my head around is the maths here. When PC gaming is not popular, it costs half the price. I've never understood that, as it means fewer sales at lower prices *scratches head*, yet when it's more popular and sells in higher volumes, the prices go up *scratches head again*. It's the only industry where that happens. When something new, like LED lighting, is invented, it's expensive at first; yet once it gets accepted and adopted, competition rises and prices fall, to the point where anyone can have it. It's the same with TVs. I bought one of the first 4K 65" TVs to hit the market. It was literally just a monitor and only had one 4K input; no tuner, nothing. It cost me £1,399.

Last year I bought a 75” Ambilight 4k with Atmos, Dolby vision, all 4k android and gawd knows what else for £750. 8 years later.

Haha, funny how I'm not the only one scratching my head at these thoughts :lol:
 
Well, first off, this is a $600 GPU! I did not spend this much on my GPU and I am playing at 4K. Your argument that most gamers aren't playing at 4K goes both ways: most are also not spending this much on a GPU. And if you are spending this much on a GPU, then even if you're playing at 1440p you're probably considering upgrading to 4K at some point; it would be nice to have a GPU capable of that.

I'm on 1440p and have literally no desire to go up to 4K. 1440p is the sweet spot and plenty for most people's needs. 4K is overrated; it takes a lot more to actually run games at that resolution, and the quality advantage just isn't there, in my opinion.

Apologies for the double posts, folks... For some reason, at times, I can neither quote nor edit said quote in Firefox on OC3D. Which is strange, since it only happens on this forum, and it's quite annoying at times.
Needed to quote Alien's post on my phone and just couldn't be arsed editing it again with this quote.
 
I'm eating an ice-cream lolly as I type; six months ago they were three for a pound, now they're three for £1.40. Do the maths and stop whinging. In yesterday's money the 4070 comes out at about the price you think it should be, if not better.
 
I used a similar example the other day with Hula Hoops: they used to be £1.10-1.25, now they're £2.25. But apparently the tech industry is different, they said, as margins are 10-15%. I laughed and told them it's more like 65%.
 
It's greed. Not inflation. End. Of.

None of these prices from these companies are in line with inflation. That is absolute horse apples. They are just taking the pee, end of.

Let's use Uncle Ben's microwave rice as an example. I used to buy this; note, used to. It was about a quid a pack, and they do various flavours, my fave being the Mexican one, as I eat A LOT of meheecan food.

It went to £1.25 a pack, then to £1.50 a pack. The irony is what they printed on the front: "Many people are going to go hungry, donate to this charity". But you don't mind contributing to that, huh?

The fix? Easy: get myself to Aldi, where I can get the same thing for about 40p. And it has not gone up 1p, not even during Covid.

This is all the brand names being greedy and capitalising, as usual, on others' misfortune. The F-ed up part? Last week I got some coffee for £3. It's usually about £4, but they put it up to SIX QUID. I would surmise no one actually bought it, and thus they had to reduce it below their new RRP just to get it the F out of there.

[image: label claiming an 11% price increase]


11% huh? LMFAO, right. Let's try 90%.

That is just plain disgusting, no matter how you try and shake it. I no longer buy it at all, and I won't even if they drop their prices.

Like I said in another thread (and got laughed at), inflation is a made-up word, and it just means bend over.
 
Inflation is a real thing: when governments print money, the value drops. Quantitative easing, lol. They know what they are doing, i.e. you're paying for it. The same goes for any sector, but of course companies are going to take the pee.

Still, while we disagree on some things, we agree on others.

As for the price of the 4070, it could have been worse, but it could have been better. Like I said, the 6950 XT at this price point is a better option in my view.

I know my weekly shop has gone up a good £20+, depending on what we buy.

It's not going to change. Too many are hoping it will, but it just won't.
 
It's greed. Your crisps have doubled in price (so 100%) and that rice has gone up 90%. Inflation is 11%. I'm not saying CORE inflation isn't real, but the distorted greed it causes is not inflation; it's just an excuse for greed.
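For what it's worth, the grocery examples quoted in this thread can be checked with simple percentage arithmetic (prices as stated by the posters, taken at face value):

```python
# Percentage rise for the anecdotal prices mentioned in the thread,
# to compare against the ~11% headline inflation figure cited above.
def pct_rise(old, new):
    return round((new - old) / old * 100, 1)

print(pct_rise(1.00, 1.40))   # ice-cream lollies, 3 for £1 -> £1.40: 40.0
print(pct_rise(1.10, 2.25))   # Hula Hoops, £1.10 -> £2.25: 104.5
print(pct_rise(4.00, 6.00))   # coffee, £4 -> £6: 50.0
```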
 