Nvidia RTX 3090 GPU leak points towards an eye-watering price tag - 24GB of GDDR6X?

Unless they had a way to change that....
There's only so much Nvidia can do; games also need to support it. They are trying with CFR, which is a waste of time since it's such a niche setup that's complex to write code for. The idea should be canned from PC gaming.
 
I'd expect them to have plans for plenty of Supers. Once they start to not sell as many datacenter cards, they will just turn them into a Titan and charge you 3k, and the worst part is plenty will lap it up.
 
I'd expect them to have plans for plenty of Supers. Once they start to not sell as many datacenter cards, they will just turn them into a Titan and charge you 3k, and the worst part is plenty will lap it up.

I think RTX Titan was a push too far. So was Volta.

In both cases I think I knew like one guy who bought them and that was Kaap and he collects them.

All of the old fanbois seemed to vanish when RTX came out. Well, actually no, many of them got peed off because the Titan XP was so much more expensive than the 1080Ti, and they were finding it hard to go without for months at a time just to show off.

A couple of the forums I used to frequent used to be absolutely buzzing with posers showing off. Now? They are a shell of their former selves. Mostly because the 2080Ti was simply too expensive for many of them. And how can you really show off if you are limited to a 2070 or 2070 Super? There was just this massive hole between those and the 2080Ti.
 
Haha these things are going to be savage power pigs.

Just watched Adored's latest video. Why has no one else mentioned the 12-pin? Or did I miss it? Apparently EVGA are going to be using it.

400w GPUs lmfao.

https://www.techpowerup.com/269957/...r-its-real-and-coming-with-nvidia-ampere-gpus

Right, so from my understanding of the article in your link, today's PSUs don't work with the new graphics cards? Due to none having a 12-pin connector, as far as I know? And dual 6-pins don't fit either.

Or am I mistaken here? :huh:...
 
Right, so from my understanding of the article in your link, today's PSUs don't work with the new graphics cards? Due to none having a 12-pin connector, as far as I know? And dual 6-pins don't fit either.

Or am I mistaken here? :huh:...

You need two 8-pins to make a 12-pin. There are too many live wires for two 6-pins from the look of it, or not enough grounds.

EVGA will be using these. Whether it will be just one? IDK. From what we have seen of the 3090 PCB, it has 3x 8-pin.
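To put rough numbers on the wiring claim (the pin counts and wattage ratings below are the commonly quoted PCIe figures, and the 12-pin layout is what the leaks describe, so treat this as an illustrative sketch rather than gospel):

```python
# Rough wire-count sketch for the rumoured 12-pin adapter.
# Pin counts and ratings are the commonly quoted PCIe/ATX figures
# (an 8-pin is 3x +12V plus 5 ground/sense; a 6-pin only guarantees
# 2x +12V plus 3 ground/sense) - treat them as assumptions.

CONNECTORS = {
    "6-pin":  {"12v": 2, "gnd": 3, "rated_w": 75},
    "8-pin":  {"12v": 3, "gnd": 5, "rated_w": 150},
    "12-pin": {"12v": 6, "gnd": 6, "rated_w": None},  # rating not public
}

def combine(*names):
    """Total up the live/ground wires and rated power of a cable combo."""
    total = {"12v": 0, "gnd": 0, "rated_w": 0}
    for name in names:
        c = CONNECTORS[name]
        total["12v"] += c["12v"]
        total["gnd"] += c["gnd"]
        total["rated_w"] += c["rated_w"] or 0
    return total

print("2x 8-pin ->", combine("8-pin", "8-pin"))   # 6x 12V, 10 gnd, 300 W
print("2x 6-pin ->", combine("6-pin", "6-pin"))   # only 4x 12V, 150 W
print("12-pin needs:", CONNECTORS["12-pin"])      # 6x 12V, 6 gnd
```

Two 8-pins cover the six +12V lines the 12-pin apparently wants, with grounds and rated wattage to spare; two 6-pins fall short on live wires and on rated power, which would explain why the adapters are 2x 8-pin.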

https://www.youtube.com/watch?v=NTGkW9cRUKI

That explains why. Basically they wanted to save money by getting a cheap deal from Samsung (which is also why it has been delayed and taken two years), so desktop cards are going to eat power like it's going out of fashion.

They are supposed to be very fast though, so there is that. However, power consumption always comes with heat. Always.
 
Yep, the 12-pin is a new standard. Not sure if it's a must or if an adaptor will do. Heard about this a few weeks back, but if it means a new PSU then I'm out.
 
Yep, the 12-pin is a new standard. Not sure if it's a must or if an adaptor will do. Heard about this a few weeks back, but if it means a new PSU then I'm out.

It's not a new standard. That is why Nvidia are not using it.

Do, however, expect the 3090 to have 3x 8-pin and consume 400W+. If your PSU can handle that you should be good. I would expect the 3080 to have two 8-pin then.

I would also expect 6 pins to die out too. Mostly because, IIRC, they don't have the ground cables to handle that sort of power without melting wires.
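For a rough board power budget (using the standard PCIe figures of 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin; the 400 W+ draw itself is still rumour):

```python
# Back-of-the-envelope connector power budget, using the standard
# PCIe ratings: 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin.
# The 400 W+ draw for the 3090 itself is rumour at this point.

SLOT_W = 75
PER_CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Maximum in-spec power for a card with the given aux connectors."""
    return SLOT_W + sum(PER_CONNECTOR_W[c] for c in connectors)

print("3x 8-pin:", board_power_budget(["8-pin"] * 3), "W")  # 525 W
print("2x 8-pin:", board_power_budget(["8-pin"] * 2), "W")  # 375 W
```

So a 3x 8-pin card has 525 W of headroom on paper, which comfortably covers a 400 W+ part, and a 2x 8-pin 3080 tops out at 375 W by the spec.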

Edit: that all of a sudden explains the cooler though, and why it costs $100. It's nearly all metal.
 
Even if I won the lotto I'd still not buy a 3090 :D

TBH it has been looking like I was going AMD this time for a long time, so Nvidia can keep the power-hungry cards. Real doubtful that they can change my mind to even buy a 3070 at this stage.

I'll watch on the 1st, mainly because I want to see if he'll say "it just works" or "the more you buy, the more you save", and if he has a new meme :D

Do we need a new egg cooker? Maybe Nvidia is in the wrong sector.

I'm not in a hurry as such. My 970s are old now but they still do what I need, though I would like ray tracing as it'll be needed. But I'm just as happy to give AMD a go, so I'll wait for them to release and see which is the better deal. The third option of the three is to just buy a second-hand 2080Ti; they will be flying off eBay at a lot cheaper prices.

It might be cheaper for them to make, but we won't see a penny of it. It just works, the more you buy the more you save :D
 
Even if I won the lotto I'd still not buy a 3090 :D

TBH it has been looking like I was going AMD this time for a long time, so Nvidia can keep the power-hungry cards. Real doubtful that they can change my mind to even buy a 3070 at this stage.

As of right now we don't need all of that power for anything. At all. I mean gaming power, BTW, not melting the national grid.

I just don't see that changing any time soon either. Maybe, as I said, with 4K, RT and so on you will "need" it, but that is the exact reason why I dumped my 4K monitor bloody years ago and have been on 1440p for about six years.

Even with RT on max I am getting over 100 FPS in Metro Exodus and so on. Like, the RT games (even COD MW).

I could easily play them at 4k.

Which kinda begs the question: WTF? I mean, Nvidia have been working on DLSS to make games faster, and the second revision, like, TOTALLY works. It's really, really good. So why would you need to have all of that power for games and then use DLSS? I just don't get that at all.
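To put numbers on why DLSS buys so much (the internal render scales below are the commonly quoted DLSS 2.x figures, so treat them as assumptions): shading cost scales roughly with the pixels you actually render, not the pixels you display.

```python
# Rough pixel-count arithmetic behind DLSS-style upscaling: the game
# shades a lower internal resolution and the upscaler reconstructs the
# rest. The per-axis render scales below are the commonly quoted ones
# for DLSS 2.x quality modes - assumptions, not gospel.

def pixels(width, height):
    return width * height

OUTPUT = (3840, 2160)  # 4K output
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in MODES.items():
    internal = (int(OUTPUT[0] * scale), int(OUTPUT[1] * scale))
    ratio = pixels(*OUTPUT) / pixels(*internal)
    print(f"{mode}: renders {internal[0]}x{internal[1]}, "
          f"~{ratio:.1f}x fewer pixels shaded than native 4K")
```

Rendering at roughly 1440p and letting the upscaler rebuild 4K is where most of the "free" performance comes from, which is exactly why piling huge raw power on top of it seems like overkill to me.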

I mean, whoopee doo, right? You could play Metro Exodus at 300 FPS? That would make it so much better... not.

Edit. Also note: "short supply, high prices". On a tech that they got cheap. I mean, Jensen even came out and said it himself that they will be much cheaper to produce than at TSMC.
 
As of right now we don't need all of that power for anything. At all. I mean gaming power, BTW, not melting the national grid.

I just don't see that changing any time soon either. Maybe, as I said, with 4K, RT and so on you will "need" it, but that is the exact reason why I dumped my 4K monitor bloody years ago and have been on 1440p for about six years.

Even with RT on max I am getting over 100 FPS in Metro Exodus and so on. Like, the RT games (even COD MW).

I could easily play them at 4k.

Which kinda begs the question: WTF? I mean, Nvidia have been working on DLSS to make games faster, and the second revision, like, TOTALLY works. It's really, really good. So why would you need to have all of that power for games and then use DLSS? I just don't get that at all.

What GPU are you using when you're talking about these 1440p game resolutions, exactly?... And what do you mean by the section regarding DLSS?
 
What GPU are you using when you're talking about these 1440p game resolutions, exactly?... And what do you mean by the section regarding DLSS?

I am talking about how much DLSS improves performance, especially when using RT. It doesn't even look like poo any more.

I am using a 2080Ti, as you know. Or should, if you read the forum, dude.

Before that I had a Titan XP, and it too is still enormously capable of 1440p.

One thing I know for sure? Forget all about shoving one of these sausages in your ITX rig. S**t just ain't happening.
 
I am talking about how much DLSS improves performance, especially when using RT. It doesn't even look like poo any more.

Ah I see.

I am using a 2080Ti, as you know. Or should, if you read the forum, dude.

I do read the forums, I just don't keep tabs on everyone's PC specs and their changes. Which I highly doubt you do either.

Not sure what's up with the attitude?...

One thing I know for sure? Forget all about shoving one of these sausages in your ITX rig. S**t just ain't happening.

The SFF community will find a way to do it anyway.
 
I don't have an attitude?

Maybe I just have a good memory is all.

As for finding a way to do it? TBH, putting a 400W GPU in an ITX rig just isn't going to happen unless it's a wind tunnel.
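Nearly every watt the card draws ends up as heat inside the case, so you can put a very rough number on the airflow needed (standard air properties; the 400 W draw and the 10 °C rise target are just assumptions for illustration):

```python
# Rough airflow estimate for exhausting a 400 W card's heat, assuming
# essentially all electrical power ends up as heat in the case air.
# Air properties are standard textbook values; the 400 W figure and
# the 10 C temperature-rise target are assumptions for illustration.

HEAT_W = 400            # heat to remove, watts
DELTA_T = 10.0          # allowed case air temperature rise, kelvin
CP_AIR = 1005.0         # specific heat of air, J/(kg*K)
RHO_AIR = 1.2           # air density, kg/m^3
M3S_PER_CFM = 0.000472  # cubic metres per second in one CFM

mass_flow = HEAT_W / (CP_AIR * DELTA_T)  # kg/s of air needed
volume_flow = mass_flow / RHO_AIR        # m^3/s
cfm = volume_flow / M3S_PER_CFM

print(f"~{cfm:.0f} CFM of case airflow for a {DELTA_T:.0f} C rise")  # ~70 CFM
```

Roughly 70 CFM of case airflow just to hold a ~10 °C rise is a big ask of a small ITX enclosure, hence the wind-tunnel comment.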
 
The 12-pin connector is a PCI-SIG standard, I originally read, but not one expected to be used on this generation (apparently it's an Nvidia piece now). It was posted about in Quick News a couple of months ago, I think.
 
Ah I see.



I do read the forums, I just don't keep tabs on everyone's PC specs and their changes. Which I highly doubt you do either.

Not sure what's up with the attitude?...



The SFF community will find a way to do it anyway.

I don't think Alien meant it condescendingly. He calls everyone dude. I know it kinda read that way, but he meant well.

If older rumours are to be believed, Nvidia are worried about RDNA 2 and plan to construct a larger Ampere card, creating a potential RTX 3090 Ti.

It's hard to know what Nvidia are doing. TBH, if they do a "SUPER" series again, removing the Ti would be a good move. No more "is the RTX 2080 Super better than the RTX 2080 Ti?" nonsense.

A 3090Ti? WAH?!
 
I have this really sneaky feeling that those selling their 2080Ti for £800 in order to get more performance for their money are going to be sorely disappointed.

The way this launch is looking? You will pay for every extra frame you get. That has to leave the 3080 at what? A grand? £1,100?

Nope. $699 :-)
 
I don't think Alien meant it condescendingly. He calls everyone dude. I know it kinda read that way, but he meant well.

I missed this. Yeah, I call near on everyone dude. That's what you get from living in the States for the best part of 9 years :D
 