Small talk & Chit chat

I knew someone would post it, so I didn't need to, because let's face it, I've not really been a fan of the new cards. You can't disagree with what he says in the video though.

With Nvidia covering their ass over the 3070 launch now, yeah, they are worried; otherwise why not put it out when they said they would? As if they really care about any of the guff in the statement they made.

Roll on Navi 2x; I really hope they shake things up.
 
Their biggest leap ever was the 7800 GTX to the 8800 GTX. It was enormous.

At least with Turing you can give it a pass because it introduced RTX and other features for the first time, but there is really no excusing the prices.

So you would expect them, having had a "bad round", to fix it and have a good one. It was barely any better, and neither are the prices. In fact, in the case of the 3090, they are worse.
 
That graph compares the wrong tiers though.

The 3090 is the Titan level card.

The 2080 Ti should be compared to the 3080, which is half the price...
 
That graph compares the wrong tiers though.

The 3090 is the Titan level card.

The 2080 Ti should be compared to the 3080, which is half the price...

Sorry man, but that is rubbish.

There is no Titan card, and until they can get that quite frankly ludicrous TGP under control there won't be. Who would pay Titan money for a card that guzzles power so badly you can't even put it in a server?

It's nothing but marketing nonsense. The 3090 is the new Ti card. There is no Titan. If it were a Titan level card then it would be called a Titan. Period.

If it were, it would be even worse, because it would be even more expensive. The fact is, it is more expensive than the 2080 Ti, and its performance is totally "meh" when you look at their past.

The 3090 is just a play on model numbers, and the BS about it being a "BFGPU and the Titan replacement" is nothing but marketing. If Nvidia released it as a Titan to professionals they would get laughed off the planet.

As for the 3080 being "half the price"? I see that marketing tactic worked perfectly on you. It's not half the price. It's the same price as every other card in its tier has been for years.

It's a simple case of playing "good cop, bad cop". Bad cop = Turing: force anyone who wants the top-end card to spend £1,200 with no cheaper option. So no more Titan at £1,300 with a Ti at £700 or so. That was bad cop.

Then you come along with your worse-than-Fermi arch, talk a load of crap, switch the cups with the ball under them and say, "Hey, look how CHEAP this card is!"

Um, it's not cheap. It's the same price as cards from previous gens, and you have inflated the 3090 price to a new high, which makes your 3080 look cheaper. Yeah, it's cheapER than the 3090. That doesn't mean it is cheap.
 
The thing I absolutely do not agree with on Ampere is the insane power draw Nvidia decided to settle on. If they had stuck to a 250 W maximum, the performance increase wouldn't be as big as it is and we'd be laughing hard at Nvidia for failing so hard by going with Samsung and their inefficient GPU dies, but as it stands this is kinda pathetic.


The 3090, while a beast of a card, is just a huge flop. It doesn't even have features that were on the other Titans, so it cannot and will not ever be a Titan-level card.
I'm hoping AMD can bring the hammer down. They don't need to beat performance here; they just need to beat power and price with a good driver package, and they can outsell Nvidia quickly this round.
 
Um, it's not cheap. It's the same price as cards from previous gens, and you have inflated the 3090 price to a new high, which makes your 3080 look cheaper. Yeah, it's cheapER than the 3090. That doesn't mean it is cheap.

Not including the mess that was Turing, the 3080 is actually £50 cheaper than a 1080 Ti was at launch.
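For reference, a quick sanity check of that £50 figure, assuming the commonly quoted UK Founders Edition launch prices (my numbers, not from this thread, so double-check against the original listings):

```python
# Launch-price comparison sketch. The prices below are assumed UK Founders
# Edition launch prices, used purely to illustrate the £50 claim above.
launch_prices_gbp = {
    "GTX 1080 Ti (2017)": 699,
    "RTX 2080 Ti (2018)": 1099,  # the "mess that was Turing"
    "RTX 3080 (2020)": 649,
}

diff = launch_prices_gbp["GTX 1080 Ti (2017)"] - launch_prices_gbp["RTX 3080 (2020)"]
print(f"The 3080 launched £{diff} cheaper than the 1080 Ti did.")  # -> £50
```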
 
Why does that graph skip the 680?

Not including the mess that was Turing, the 3080 is actually £50 cheaper than a 1080 Ti was at launch.


Not a fair comparison. Compare it to the price of a 1080 at launch instead. The 3080 is not the Ti of this generation, that's clear. Even the 20GB model that's coming will provide a performance bump.
 
Not including the mess that was Turing, the 3080 is actually £50 cheaper than a 1080 Ti was at launch.

And look at the performance leap on Pascal. It was the best they had done in a decade.

Warchild - it wasn't the top-end flagship, just a diet impostor.
 
That apology email with 20% off was not from OCUK. It was posted there, but it seems to be from somewhere else. Amazon maybe, but people can't say because it's a competitor.

Just setting the record straight.
 
Just ripped out my NZXT smart device, since I really dislike its habit of forgetting its settings when the PC is taken off the mains - I generally don't run background software unless absolutely necessary and rely on the internal memory of peripherals.
Well, as it turns out, the default NZXT case fans are DC, and all I've got is a splitter and a single CASE_FAN header to work with. :D Ordered some more Arctic PWM fans to accompany the current lonely 140mm fan; I've been looking for an excuse to get rid of those NZXT ones anyway.

The current crop of Arctic fans is just so, so good; performance, good acoustics and value rolled into one product is hard to beat :)
 
Just posted this in the video chat:

The 3090 is the Titan, not the Ti. And when comparing performance, you are forgetting the RT and tensor cores, so the comparison between the older cards and the newer ones isn't as straightforward as you are trying to make it. Try running a game with ray tracing and DLSS on a 1080 Ti vs a 2080 Ti vs a 3080, and then see how many percent better it is.
Agreed, looking across a suite of games where most don't support the new features, it doesn't matter. But the 3080 is still WAAAAAY cheaper than a 2080 Ti, and it is a lot better in AAA games now and in the future.
Let the Nvidia fanboy rant begin :D

So I totally agree with trawetSluaP.
Just because there have been Titans/Tis in the past, it doesn't mean it will continue that way.
Because of the power draw, I don't think it would make sense to make an enthusiast Ti model (and the gaming performance isn't that much higher with the 3090).

The 3080 is capable of 4K with RTX and DLSS, so why make an even more power-hungry card for 15% more performance you don't need?

Well, I have back-ordered a 3080 TUF and an EKWB block for it :D
 
There probably won't be a 3080 Ti, and no Titan. So just one card = the 3090.

But I don't get the 20GB 3080 that is rumoured. It will just steal from 3090 sales.
But the yields are probably not good enough on the 3090s; otherwise it doesn't make sense to me.
 
There probably won't be a 3080 Ti, and no Titan. So just one card = the 3090.

But I don't get the 20GB 3080 that is rumoured. It will just steal from 3090 sales.
But the yields are probably not good enough on the 3090s; otherwise it doesn't make sense to me.

There will be a Ti. There will also be models with more VRAM. At some point, if they can get that wild TGP down, there will be a Titan also. But don't for one minute think the 3090 is the Titan. That was just a way to get £200 more for a Ti model. You know? £200 more than your last 2080 Ti simply by stepping up to a 90.

The 3090 is not a gaming card. However, it's not a Titan either, because of the absolutely ridiculous TGP. Had Nvidia released that to professionals, they would have been laughed at.

A 20GB 3080 will not steal 3090 sales. It will be priced in between, and will probably end up the go-to Nvidia card to have. That means more sales. It's smaller, cheaper silicon too.

From what I am beginning to hear, Nvidia should be quite worried, especially as they are about to get absolutely spanked on the TGP side of their cards.

If the rumours are true, the 10GB 3080 is about to take a hefty pounding too, so they will desperately need the 20GB model, given AMD's top-end cards come with 16GB.
 
There will be a Ti. There will also be models with more VRAM. At some point, if they can get that wild TGP down, there will be a Titan also. But don't for one minute think the 3090 is the Titan. That was just a way to get £200 more for a Ti model. You know? £200 more than your last 2080 Ti simply by stepping up to a 90.

The 3090 is not a gaming card. However, it's not a Titan either, because of the absolutely ridiculous TGP. Had Nvidia released that to professionals, they would have been laughed at.

A 20GB 3080 will not steal 3090 sales. It will be priced in between, and will probably end up the go-to Nvidia card to have. That means more sales. It's smaller, cheaper silicon too.

The 3080 and 3090 use the same chip, both the top chip, but the 3080 is cut down to a point where they could get reasonable yields. The 3090 is the top chip with as much enabled as they can manage while still having seemingly any yields, ergo a Titan imo. Could they eventually release a version with those 3% more cores enabled and a lower TDP? Maybe one day if yields vastly improve, but surely Nvidia can't be planning on sticking with Samsung 8N for more than a year. Personally I don't see Nvidia messing with the top of Ampere's stack much going forward, besides VRAM configs. There's just not really room for a "Titan" or a "Ti" model; they'd end up selling 6+ SKUs around a single piece of low-yield silicon.

Fact is, the driver, ECC memory, etc. situation meant the Titan's practical use cases for professional workloads were incredibly niche anyway; the vast majority likely went to gamers with a big wallet, so I wouldn't be surprised if Nvidia have just dropped the "prosumer gamer" thing.
 
But do you need more than 10GB of VRAM?
Most gamers use 1440p rather than 4K, and I think it will stay like that for a while.
I'm having a debate with myself whether or not I should go for a 3840x1600 monitor.
But that is still a lot fewer pixels than a 4K monitor.
 
But do you need more than 10GB of VRAM?
Most gamers use 1440p rather than 4K, and I think it will stay like that for a while.
I'm having a debate with myself whether or not I should go for a 3840x1600 monitor.
But that is still a lot fewer pixels than a 4K monitor.


It will likely depend on the game. I just tested Horizon Zero Dawn, which has some nice crispy graphics; at max settings at 3440x1440 I saw an average VRAM usage of 8,400MB, so I can see people hitting the 10GB limit fairly quickly if they run new games above 1080p. I know VRAM allocation is also a thing, but I cannot see 10GB lasting very long for anyone who plays at any res over 1080p.
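If anyone wants to log this themselves rather than eyeballing an overlay, here's a rough sketch using the pynvml bindings (pip install nvidia-ml-py); note that NVML reports memory allocated on the GPU rather than what the game strictly needs, and the device index 0 is an assumption for illustration:

```python
# Rough VRAM logger sketch -- assumes the nvidia-ml-py package is installed
# and that the GPU you game on is device index 0.
# NVML reports memory *allocated*, not what the game strictly requires.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**2:.0f} MiB of {mem.total / 1024**2:.0f} MiB")
        time.sleep(5)  # sample every 5 seconds while the game runs
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```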
 