Nvidia RTX 3070 Ti 16GB graphics cards have been spotted on the EEC website

You missed the most important reason. Miners loooove memory. Every single refreshed model has one and only one priority... Mining. The 2060 12GB is hard proof of that.

And the idea that 4K gaming needs more memory... Pfff... The 3070 Ti is NOT a 4K gaming class card.


Edit: Also, people who are buying 4K 144Hz+ gaming monitors for north of $1,000 do not pair them with a 3070 Ti.
 
You missed the most important reason. Miners loooove memory. Every single refreshed model has one and only one priority... Mining. The 2060 12GB is hard proof of that.

And the idea that 4K gaming needs more memory... Pfff... The 3070 Ti is NOT a 4K gaming class card.




i mostly agree with you. probably that is the reason.
or just a reason to produce yet another card and reset MSRP.


though i need memory not for gaming but for GPU rendering.
but i know i am in a minority.
 
i mostly agree with you. probably that is the reason.
or just a reason to produce yet another card and reset MSRP.


though i need memory not for gaming but for GPU rendering.
but i know i am in a minority.


Yes. Rendering is a valid reason. It will allow you to render bigger projects without crashing. But Nvidia doesn't want you to use a 3070 Ti for rendering. It is a GAMING card. You need Quadro for rendering, or whatever they are called now.
 
Most of this is just a contest of who can pee the highest. They want to eradicate any reason for AMD to have a "one up". Even though AMD cards handle hardware-accelerated ray tracing and the like differently, Nvidia still see it as peeing on their patch.

I doubt many will get a 3090 Ti, even if they want one and would pay the bonkers price. Same goes for this. It's simply to one-up AMD on everything throughout their product stack. RT and DLSS and so on have simply not been the trump card Nvidia thought they would be, and many reviewers refused to focus solely on them as Nvidia had asked them to.

So rasterization is still very much a thing, and AMD beat Nvidia at every level with regards to VRAM. Which, as we know now, was a mistake on Nvidia's part, given that 8/10 GB is not enough on a mid-high-end or high-end card.

This all takes me back to the Titan Black. It was incredibly short-lived, and only really done because the 780 Ti was 3 GB and AMD's 290X was 4 GB. So Nvidia released the 6 GB Titan Black and it was available for about 3 months before they moved to the 970 and 980 and then on to the 980 Ti.
 
So rasterization is still very much a thing, and AMD beat Nvidia at every level with regards to VRAM. Which, as we know now, was a mistake on Nvidia's part, given that 8/10 GB is not enough on a mid-high-end or high-end card.


What do you mean by this really? How is 10GB not enough on a 3080 for example? Isn't it dependent on which resolution you play on?...
 
What do you mean by this really? How is 10GB not enough on a 3080 for example? Isn't it dependent on which resolution you play on?...

It's dependent on resolution and how you run the card.

As much as I love DLSS, there aren't many games I play that actually support it. As such you are running the full native resolution at all times, which uses a lot of VRAM. The last case of this for me was Far Cry 6. Even with my 2080 Ti and its 11 GB I got 5 crashes where it pre-warned me it was running out. IDK if it has DLSS now but it did not when I played through it.

Even when the 3070 launched and was hailed as the half-price 2080 Ti it had glaring issues. Even Borderlands 3 ran out of VRAM and made the card way slower than the Ti. This wasn't a good sign, as BL3 was hardly pushing boundaries on graphics and VRAM use.

As such the same would apply to the 3080. If the 2080 Ti does not have enough VRAM for 1440p then the 3080 certainly does not have enough for native 4K. And it doesn't. This issue can be worked around by using DLSS, but many enthusiasts I know refuse to use it as it is not native, and as such at points it will look worse.

Which TBH? is a bit stubborn but I do see it from their side. Me? I would happily use DLSS all day. However, I can see why expectations would be high if you really cared.
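
To put some very rough numbers on the native-resolution point above, here's a back-of-envelope sketch. The bytes-per-pixel figure is an assumption for illustration only; real engines vary wildly, and the texture/streaming pools that get scaled up alongside the render targets are what actually eat most of the VRAM.

```python
# Back-of-envelope: how the per-pixel render targets alone scale with internal
# resolution. The 48 bytes/pixel total (G-buffer + depth + HDR colour) is an
# assumed figure, not a measurement from any particular game.

ASSUMED_BYTES_PER_PIXEL = 48

def render_target_mb(width: int, height: int) -> float:
    """Rough size in MB of the per-pixel buffers at a given internal resolution."""
    return width * height * ASSUMED_BYTES_PER_PIXEL / (1024 ** 2)

resolutions = {
    "native 4K (3840x2160)": (3840, 2160),
    "DLSS Quality internal for 4K (2560x1440)": (2560, 1440),
    "native 1440p (2560x1440)": (2560, 1440),
}

for label, (w, h) in resolutions.items():
    print(f"{label}: ~{render_target_mb(w, h):.0f} MB")

# Native 4K works out to roughly 2.25x the per-pixel memory of a 1440p internal
# resolution, before counting the higher-detail assets 4K settings tend to pull in.
```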

Nvidia always do this. Always. They always give you just enough to scrape by on unless you go up the range and spend far more money. It is a tactic we were trained on when I worked in a sales call centre. The lower end products were simply a diversion to make you buy the "Super bargainous" top end product as apparently it was much better value and lasted much longer.

The same technique is now used in motherboards. All over, by every company. This is how they have been getting away with charging up to £1000 for a motherboard. What you do is make your product stack bloated, confuse people, and then only send out the top end stuff for review. This then influences people who don't trust in products without glowing "reviews" into buying more expensive boards with higher mark ups.

It's almost solely based on psychology. Tricking people who think they are clever. It's also used on cars, and pretty much anything else you can buy today where you would need the advice of a salesperson before buying.

And of course "upselling". What do I mean by that? Getting someone to buy or take something they don't want or need by convincing them they need it.

You can read more about that here. Also note the use of the term "cross-selling", which means selling someone something else they probably had no intention whatsoever of buying.

https://corpgov.law.harvard.edu/2016/12/19/the-wells-fargo-cross-selling-scandal/

There is also a documentary about it, but having been trained in it myself? I know how it works already.

It's all about creating a halo product for your brand, and then getting people to spend as much money as you can (by extracting it from them).

I wouldn't mind if I were stupid enough to think I am very clever. That is usually how they catch people out. Appeal to their ego, etc etc.

Most of these cards Nvidia "release" will never be available. Or, will be available in such low numbers that even if you had the entry price you would still not be able to get one. And it all comes back to psychology. Right now Nvidia want to look as good as they possibly can, and obviously far better than their competition. So you release a few "daft cards" and then watch people buy any Nvidia card they can get because obviously they are the best.

And you would be absolutely amazed at just how well this works, and why companies base their entire sales ethos upon it.

Sadly there just aren't enough people like me who have used it, thought it disgusting and sworn off of it so badly that nothing they try will work.

Nvidia don't want you having this card. Why would they? It's all but assured that if you got a 16 GB, pretty quick card with no flaws you would totally avoid what they have lined up for you to buy next. If you didn't? lol, Nvidia would not be charging what they are, nor what they did on Turing. But it all just goes to show how incredibly well these methods work and how glaringly stupid people can be. And you can't stop stupidity. It's like a virus.

The general consensus among pretty much all of the "influencers" right now (GOD DAMN IT I HATE THAT TERM) is not to buy an Nvidia card at these prices. Every one of them is telling you the same thing.

Sadly no one is listening. At all. Which is why they have once again broken sales records and will keep upping their prices until they find the cap like Apple did a couple of years ago. As long as they can keep fooling people? They'll just sail on ahead.
 
What do you mean by this really? How is 10GB not enough on a 3080 for example? Isn't it dependent on which resolution you play on?...
Nvidia dug their grave on this. They advertise the 3080 for 4K gaming and production all over. 10 GB is not enough for 4K gaming in some cases, and not enough for 4K+ video editing. Many renders will fail on a 3080 because it doesn't have enough memory. And most importantly, the AMD equivalents have 16 GB.

Also there is the secret MSRP that Nvidia doesn't disclose. They can grab a piece of that overpriced cake for themselves. You can bet that the real MSRP is not that far from the street prices of "new" GPUs.


And they want their marketing materials to look competitive when Intel launches their GPUs.
 
Sure, if you play at 1080p the extra memory on these won't matter, but then you're just wasting a 3080. Unless you play on a 360 Hz monitor. But few people are doing that compared to 1440p/4K.
 
Nvidia dug their grave on this. They advertise the 3080 for 4K gaming and production all over. 10 GB is not enough for 4K gaming in some cases, and not enough for 4K+ video editing. Many renders will fail on a 3080 because it doesn't have enough memory. And most importantly, the AMD equivalents have 16 GB.

Also there is the secret MSRP that Nvidia doesn't disclose. They can grab a piece of that overpriced cake for themselves. You can bet that the real MSRP is not that far from the street prices of "new" GPUs.

And they want their marketing materials to look competitive when Intel launches their GPUs.

The way they see it, that's OK. As soon as people realise it's not enough, they will buy a more expensive card next time.

That is how much contempt you can treat someone with now. And get away with it.

I would love to see people actually knuckle down and resist, but as we can clearly see that has not been happening. No matter what they do.

It's all a part of leaving you with regret. IE, maybe you should have bought that 3090 after all. That is how everything down the stack works.

This is all a part of getting people to live with compromise when their ego will not allow it. It's all a part of that side of psychology. And sales and money are big business and thus you would be astounded at the filthy tactics and even psychology being used in selling things.

Like I said in my last post, I would love to see them get their comeuppance. But they won't because they appeal to people's egos. Many celebrated the death of companies like 3DFX. Mostly because they just didn't care. Now? people created a monster, so don't be surprised when that monster commits more atrocities. That is what happens when you allow a monster.

We're all slowly being bent into realising how fruitless it is to complain about anything. Most companies now don't have a phone number for you to call when you need service, because that service no longer exists. You are just expected to swallow things and "smile along" when you get screwed over. Because complaining and being unhappy? well that's depressing and miserable! don't do that, just take it in the a$$ with a smile on your face.

Maybe now I am erring on the super negative side. There is but one positive I can see. Intel. Problem is? we will just create another monster, as we always do. And I have seen first hand what a monster they can be when they are geared up with the best products.

I'd love to take their recent changes positively. IE, listening more and sorting out the way they approach things (rather than with deaf ears). They are listening now. That is clear. However, how much of that listening is because their CPU department has been hammered? I have no idea. They did so many "unfriendly" things in the past it was unreal. In case you don't remember? CPUs they claimed they could not solder because it was impossible, as the node made the dies too small. Which has effectively been shown to be utter bull crap because they soldered other CPUs on the same node. Alder Lake? soldered. Even at their smallest node yet.

It was all about saving 1p per CPU sold, or whatever it cost. Skimping, basically. Just to make maximum profit.

However. That has now changed. A reality slap perhaps? IDK. Some in the industry say they do care about looking bad, which is a start I guess. They also seem to have good enthusiasm, and have been listening. Which is why their GPUs all have stupid names, because they have been allowing gamers to get in on the creation of these products. Which again? is a very good sign.

However. Yet again, will they keep this up? Or will they just get back in the lead and go back to being the a-hole monster they always were, putting profit above absolutely everything else, no matter how bad it made them look?

Right now? I don't care really. I suppose like many others I am now finally beginning to come out of the Stockholm syndrome that Nvidia have basically imposed on everyone. IE, they are the enemy, but they have been holding us in their grip and treating us like crap for so long that people are still very fond of them.

Even if Intel manage to stop that a bit? then that is a positive.
 
Nvidia dug their grave on this. They advertise the 3080 for 4K gaming and production all over. 10 GB is not enough for 4K gaming in some cases, and not enough for 4K+ video editing. Many renders will fail on a 3080 because it doesn't have enough memory. And most importantly, the AMD equivalents have 16 GB.

But seeing as 4K isn't that big yet, what about 1440p though? Isn't 10 GB enough even by today's standards?
 
But seeing as 4K isn't that big yet, what about 1440p though? Isn't 10 GB enough even by today's standards?

4K is big when Nvidia want it to be big. Like when it suits them. As soon as the 980 Ti came out they started banging the war drum. However, it seems people do have a limit. That, or they went 4K, found out how hard it was to run and how much money you would continually have to keep spending every year just to keep your games running, and went back to 1440p (like I did). I went from a Fury X, to two Fury Xs (less said about that the better), to a single Maxwell Titan, to a Titan XP, before realising it sucked and getting a 1440p monitor again.

Right now their war drum is clearly RT, DLSS and their "new" proprietary G-Sync module. Again.

IE they will sell you any old s**t they can and think they did well. Only this sort of stuff never usually catches on and silently goes away until they can use it again.

We've already been down the G-Sync module path. AMD came along with FreeSync and forced Nvidia to do a non-module version. Only now that AMD have gone quiet it's back to modules again. Mostly because, I would imagine, RT and DLSS are not being raved about as much as they wanted them to be. There goes your killer app, move on to something else.

One thing is clear about their behaviour: whilst people may have been stupid, desperate or so badly in need of congratulating themselves that they would pay a current Ampere GPU price, it's also been made pretty damn clear that no one really cares about Nvidia's RT and DLSS. They just want faster cards that can run their games better, and rasterization is still more than enough for that.

And this is why I think Intel will do OK. Because it's already very clear that no matter how crap RTG has become they are still hanging in there and haven't gone away.
 
4K is big when Nvidia want it to be big. Like when it suits them. As soon as the 980 Ti came out they started banging the war drum. However, it seems people do have a limit. That, or they went 4K, found out how hard it was to run and how much money you would continually have to keep spending every year just to keep your games running, and went back to 1440p (like I did). I went from a Fury X, to two Fury Xs (less said about that the better), to a single Maxwell Titan, to a Titan XP, before realising it sucked and getting a 1440p monitor again.

Right now their war drum is clearly RT, DLSS and their "new" proprietary G-Sync module. Again.

IE they will sell you any old s**t they can and think they did well. Only this sort of stuff never usually catches on and silently goes away until they can use it again.

We've already been down the G-Sync module path. AMD came along with FreeSync and forced Nvidia to do a non-module version. Only now that AMD have gone quiet it's back to modules again. Mostly because, I would imagine, RT and DLSS are not being raved about as much as they wanted them to be. There goes your killer app, move on to something else.

One thing is clear about their behaviour: whilst people may have been stupid, desperate or so badly in need of congratulating themselves that they would pay a current Ampere GPU price, it's also been made pretty damn clear that no one really cares about Nvidia's RT and DLSS. They just want faster cards that can run their games better, and rasterization is still more than enough for that.

And this is why I think Intel will do OK. Because it's already very clear that no matter how crap RTG has become they are still hanging in there and haven't gone away.

Fair enough, but people keep talking about RT and DLSS. Yet no one talks about Nvidia's hardware NVENC encoder and hardware-based streaming features. Which, to my knowledge, AMD doesn't have?...
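
For anyone who hasn't poked at it outside of a streaming app, this is roughly what handing an encode to NVENC looks like via ffmpeg; a minimal sketch assuming an ffmpeg build with NVENC support and an Nvidia driver present (the filenames are just placeholders).

```python
# Re-encode a recording on the GPU's NVENC block instead of the CPU.
# Assumes ffmpeg was built with NVENC support (look for h264_nvenc in
# `ffmpeg -encoders`) and that an Nvidia GPU and driver are available.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "gameplay.mkv",    # placeholder input recording
        "-c:v", "h264_nvenc",    # hardware H.264 encoder on the GPU
        "-b:v", "8M",            # target video bitrate
        "-c:a", "copy",          # pass the audio track through untouched
        "gameplay_nvenc.mp4",    # placeholder output
    ],
    check=True,
)
```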
 
I'm a little confused why they just refreshed the 3080 with 12 GB only to bring out a 3070 Ti with 16?! Is it a cost thing, or why did they leave 4 extra GB on the table for the 3080?
 
I'm a little confused why they just refreshed the 3080 with 12 GB only to bring out a 3070 Ti with 16?! Is it a cost thing, or why did they leave 4 extra GB on the table for the 3080?

It's tied to the memory bus width. The 3080 could theoretically have 16 GB of VRAM, but it would then have to have a weird, possibly unmanufacturable memory bus width. For instance, a 256-bit bus gives 4, 8, or 16 GB (the 6800 XT, for instance, could be sold as an 8 GB card, but not as a 12 GB card unless they chopped the 256-bit bus up). A 384-bit bus gives 6, 12, or 24 GB. You can cut that down, as the 10 GB 3080 did with its 320-bit bus, but it's not always practical, seemingly.
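
A quick way to see the arithmetic, under the simplifying assumptions that each GDDR6/6X chip sits on its own 32-bit slice of the bus and that chips only come in 1 GB or 2 GB densities (ignoring clamshell/double-sided mounting):

```python
# Why VRAM capacity follows bus width: one memory chip per 32 bits of bus,
# and each chip only comes in a couple of densities. Assumes GDDR6/6X-era
# parts (1 GB or 2 GB per chip) and no clamshell mounting.

CHIP_DENSITIES_GB = (1, 2)

def possible_capacities_gb(bus_width_bits: int) -> list[int]:
    """VRAM sizes buildable on a given bus without mixing chip densities."""
    chips = bus_width_bits // 32
    return [chips * density for density in CHIP_DENSITIES_GB]

for bus in (256, 320, 384):
    print(f"{bus}-bit bus -> {possible_capacities_gb(bus)} GB")

# 256-bit -> [8, 16]   (3070 Ti today, or a 16 GB version with 2 GB chips)
# 320-bit -> [10, 20]  (the 10 GB 3080's cut-down bus)
# 384-bit -> [12, 24]  (3080 12 GB and 3090-class cards)
```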
 
It's tied to the memory bus width. The 3080 could theoretically have 16 GB of VRAM, but it would then have to have a weird, possibly unmanufacturable memory bus width. For instance, a 256-bit bus gives 4, 8, or 16 GB (the 6800 XT, for instance, could be sold as an 8 GB card, but not as a 12 GB card unless they chopped the 256-bit bus up). A 384-bit bus gives 6, 12, or 24 GB. You can cut that down, as the 10 GB 3080 did with its 320-bit bus, but it's not always practical, seemingly.
The more you know! Thank you!
 
It's tied to the memory bus width. The 3080 could theoretically have 16 GB of VRAM, but it would then have to have a weird, possibly unmanufacturable memory bus width. For instance, a 256-bit bus gives 4, 8, or 16 GB (the 6800 XT, for instance, could be sold as an 8 GB card, but not as a 12 GB card unless they chopped the 256-bit bus up). A 384-bit bus gives 6, 12, or 24 GB. You can cut that down, as the 10 GB 3080 did with its 320-bit bus, but it's not always practical, seemingly.

Interesting, thanks for the lesson!
 
You need Quadro for rendering, or whatever they are called now.


no it is not needed.

but nvidia provides better drivers for quadro cards.
better as in "optimized for cad / 3d modelling apps".


the GPU renderer i use (mostly vray) works just fine with normal gaming cards.
i have 2 workstations and 6 render slaves with cards ranging from GTX 1080 to RTX 3090.
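
for what it's worth, here is a minimal sketch of checking free VRAM on each card in a render box before dispatching a job; it uses the NVML python bindings (pynvml), and the 2 GB margin is an arbitrary assumption, not anything V-Ray requires.

```python
# check free VRAM on every card in the box before dispatching a render to it.
# uses the NVML bindings (pip install nvidia-ml-py); the 2 GB safety margin
# is an arbitrary assumption, not a V-Ray requirement.
import pynvml

SAFETY_MARGIN_GB = 2

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        free_gb = mem.free / 1024 ** 3
        total_gb = mem.total / 1024 ** 3
        verdict = "ok to dispatch" if free_gb > SAFETY_MARGIN_GB else "skip"
        print(f"GPU {i} ({name}): {free_gb:.1f}/{total_gb:.1f} GB free -> {verdict}")
finally:
    pynvml.nvmlShutdown()
```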


https://www.youtube.com/watch?v=94mMUV0nlps


i would have never bought a 3090, but one of my render slaves' 1080s broke. :huh:
needing a new card, i bought a new one for the main workstation.


i really hate nvidia as a company but their cards give me the least headaches for what i do.
if i could, i would not have bought a 3xxx series card and supported their shenanigans.
 