Nvidia RTX 3090 and RTX 3080 Specifications Leak

Well, I'll wait for benchmarks before passing final judgement but those specs on the 3080 are very underwhelming.



My 2080 has 512 GB/s of bandwidth, so 768 GB/s on the 3080 is only a 50% uptick, which suggests actual in-game performance will only be 20% or so better than the 2080Ti.
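For anyone who wants to sanity-check that uplift figure, here's the arithmetic as a trivial Python snippet (the 512 and 768 GB/s numbers are just the ones quoted above, i.e. the leak, not anything official):

```python
# Bandwidth uplift implied by the figures quoted above.
rtx_2080_bw = 512   # GB/s, figure quoted above for the 2080
rtx_3080_bw = 768   # GB/s, leaked figure for the 3080

uplift_pct = (rtx_3080_bw / rtx_2080_bw - 1) * 100
print(f"Bandwidth uplift: {uplift_pct:.0f}%")   # -> 50%
```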


I really can't see moving from a 2080 to a 3080 being a worthwhile upgrade, TBH.


Only moving to a 3090 would make any kind of sense in performance terms, but absolutely no sense whatsoever in terms of the financial outlay needed to get it.


As I suspected all along, waiting for the 4080 in 2022/23 is the only sensible upgrade path from my 2080.


Oh well, I guess that shiny new OLED TV will be where I spend my money this year.
 
I cannot understand the 24 GB.

This obviously will cost a lot to produce - so why this massive increase? Is it needed to get the bandwidth/speed increase? Or are there games that need this to hold textures? Seeing how people get along nicely on 8/11 GB now, it seems like a larger increase than normal 'evolution' would demand. Even 16GB on a top card seems generous.

Are there other uses / benefits for more RAM on the cards?
 
I cannot understand the 24 GB.

This obviously will cost a lot to produce - so why this massive increase?


It's more about giving Nvidia the option of releasing a more powerful 3080 Super/Ti in 2021 with more than 10GB of VRAM to counter any move from AMD and their RDNA2 cards.

24GB serves no useful purpose; it's a completely unnecessary amount of VRAM. It also maybe gives Nvidia a bogus reason to justify the absurd price the 3090 will no doubt cost.
 
I cannot understand the 24 GB.

This obviously will cost a lot to produce - so why this massive increase? Is it needed to get the bandwidth/speed increase? Or are there games that need this to hold textures? Seeing how people get along nicely on 8/11 GB now, it seems like a larger increase than normal 'evolution' would demand. Even 16GB on a top card seems generous.

Are there other uses / benefits for more RAM on the cards?

The higher you go in resolution, the higher the memory bandwidth needs to be, hence the multiples. You can't cut the memory amount without derping the memory controller and decreasing the bandwidth, and then you have problems.

As an example, the 1080Ti (11GB):

Memory bus width: 352-bit
Memory bandwidth: 484 GB/s

vs. the Titan Xp (12GB):

Memory bus width: 384-bit
Memory bandwidth: 547.58 GB/s

So you can see the decrease you get by derping the memory controller and reducing the memory amount by just 1GB.
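If you want to see where those bandwidth numbers actually come from, they fall straight out of bus width times per-pin data rate. A minimal sketch (11 and 11.4 Gbps are the published GDDR5X data rates for those two cards):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(f"1080Ti:   {bandwidth_gbs(352, 11.0):.1f} GB/s")   # 484.0 GB/s
print(f"Titan Xp: {bandwidth_gbs(384, 11.4):.1f} GB/s")   # 547.2 GB/s (~547.6 with the exact clock)
```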

Now, memory bandwidth was kinda important on the Xp and the Ti, but nowhere near as important as it will be now. 4K is incredibly demanding on memory bandwidth, and the next-gen games (especially with RT) will be very demanding on VRAM. So if they had cut the VRAM down to 10GB, all of that power would be hobbled by the low memory bandwidth.
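To put a very rough number on how much harder 4K hits bandwidth (pure back-of-envelope; the bytes-per-pixel and touches-per-pixel values are made up purely for illustration):

```python
# Crude illustration: bandwidth demand scales linearly with pixel count,
# so 4K needs ~2.25x what 1440p needs for the same per-pixel work.
def framebuffer_traffic_gbs(width, height, bytes_per_pixel=4,
                            touches_per_pixel=10, fps=60):
    """Very rough GB/s of framebuffer traffic; touches_per_pixel is an assumed figure."""
    return width * height * bytes_per_pixel * touches_per_pixel * fps / 1e9

print(f"1440p: ~{framebuffer_traffic_gbs(2560, 1440):.1f} GB/s")   # ~8.8 GB/s
print(f"4K:    ~{framebuffer_traffic_gbs(3840, 2160):.1f} GB/s")   # ~19.9 GB/s
```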

The bigger the textures get, the higher that bandwidth needs to be. This is why AMD were so stupid to use HBM, because at the time all that bandwidth did squat. However, with PCIe 4.0, and the fact that NVMe drives on PCIe 4.0 are ludicrously fast, they can finally utilise the storage speed too.

You know how for ages it wasn't worth storing a game on an SSD because the load times barely improved, apart from a small handful of titles?

Well you can totally expect that to change once these new consoles launch.

Why wasn't it improved before? Because the consoles did not come with SSDs.

You also need to bear in mind that even though the PC has seen a massive uptick in users (because of games like Fortnite and PUBG and their popularity among younger gamers), games are still coded primarily with the consoles in mind. Hence, any improvements for the PC (i.e. much faster storage, multiple GPUs, remember those?) need to be added in later, and developers usually do as little as they need to.

Apart from Rockstar, who genuinely do put a lot of effort into their, ahem, "ports".

It's more about giving Nvidia the option of releasing a 3080 Super/Ti in 2021 with more than 10GB of VRAM.


24GB serves no useful purpose other than that, and maybe a bogus reason to justify the absurd price the 3090 will no doubt be priced at.

That is not true. See the above, and what happens when you derp the memory by just 1GB.

There are many falsehoods doing the rounds at the moment as to why Nvidia are "ripping us off innit" and "their margins are higher than evarrr!", both of which are BS.

Apparently Nvidia work to a 60% margin. Always have, always will. The reason GPUs have gotten expensive? Because we keep demanding more and more performance, so they deliver it.

Also, like I did elsewhere, I will try to explain why the 2080Ti cost so much. Again, these are facts; ignore them if you like, but it's very narrow-minded to do so!

1. Nvidia were not going to release Turing; it was going to be Ampere. Samsung's node failed and it was delayed, and Nvidia had to wait for Samsung to revise the node to make it even usable. It started as 8nm; now it is 7. So they needed a whole node revision just to get it to work.

2. The 2080Ti die was absolutely frigging ENORMOUS: 754mm². Compare that to the 1080Ti at around 471mm², so Turing is roughly 60% larger than Pascal.

3. Nvidia did not want to use TSMC, which is why they got involved with Samsung. Why? Because TSMC are really expensive. Turing was basically a slightly shrunken Pascal on TSMC with the tensor cores bolted on, hence the massive die. Massive dies are monolithic, cost a fortune, and failure rates soar (because one bad area means a dead core).

4. Nvidia had a deal with TSMC to provide only working dies, i.e. TSMC would swallow some of the dead ones. This means TSMC would have added at least some of that cost back onto the 2080Ti die; there's no way they would have swallowed it all.

5. TSMC *are* expensive. AMD are OK, because they go to them for Ryzen with its chiplet design, meaning lots of working dies per wafer. However, as already explained, the 2080Ti die was bloody huge, meaning huge cost (see the rough yield sketch below).
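To put some numbers on point 5, here is a rough sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The die areas are the well-known ones (roughly 754mm² for TU102, roughly 74mm² for a Zen 2 chiplet); the defect density is an assumed figure purely for illustration:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation for square-ish dies on a round wafer."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Fraction of dies with zero defects (simple Poisson yield model, assumed defect density)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area_mm2 in [("TU102 (2080Ti), ~754mm2", 754), ("Zen 2 chiplet, ~74mm2", 74)]:
    gross = gross_dies_per_wafer(area_mm2)
    good = int(gross * poisson_yield(area_mm2))
    print(f"{name}: ~{gross} gross dies per wafer, ~{good} good")
```

With the same assumed defect density the huge monolithic die loses roughly half its candidates per wafer, while the little chiplet barely notices, which is exactly why a big die costs so much per working part.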


And that is quite probably why Turing should get spanked by Ampere: Ampere is the ground-up design, not Turing. As I explained, Turing was a slightly shrunken Pascal on a massive die with tensor cores bolted on.

Hence the supposed enormous uplift in RT performance.

To add: these Samsung dies are not as good as TSMC's, BTW. The enormous 2080Ti used a 250W TGP; the 3090 uses 350W.

The 3090 is a failed Quadro, but not in the usual sense. It is not a complete failure; it just uses way too much power. So it can't be used as a Quadro, because those cards need to go in rack servers etc. and need to behave perfectly when it comes to thermals and power. You cannot shove a 500W+ card (overclocked) into a server.

With us, the home users? They will leave that to us: water blocks, loads of airflow, big-ass coolers and so on.
 
[meme image]


haha best meme yet :D

$2k on a Chinese store.

https://www.hardwaretimes.com/nvidia-geforce-rtx-3090-listed-for-2000-on-chinese-store/

Remember that fake 3090 PCB pic I posted? Well, that was the Colorful.
 
Well, I for one am quite disappointed. I will just stick with my 2080Tis and NVLink for now. I had hoped to see a bigger tock, not just a small tick compared to our current-gen cards. As always, though, let's see some real-world benchmark results before we get too carried away.
 
Well, I for one am quite disappointed. I will just stick with my 2080Tis and NVLink for now. I had hoped to see a bigger tock, not just a small tick compared to our current-gen cards. As always, though, let's see some real-world benchmark results before we get too carried away.

From what I am hearing it's 40% on the 3090. That's hardly a small tick, mate. That is huge.

However, if you don't need anything faster then FFS don't buy it! It's amazing how many people overspend on GPUs they don't need. I bought a Titan Xp initially for 4K. It was great, but I changed my monitor to 1440p shortly after, so the Titan Xp was overpowered for my use case.

Same again this time: bought a 2080Ti and a brand new 1440p monitor. Overkill to the extreme. Do I want a 3090? Sure I do. Do I want to lose a grand and buy one? No, because I don't need it. I am absolutely certain the 2080Ti, as slow as the 3090 may make it look, is more than adequate for my needs for a very long time to come. Just like my Titan Xp, which lasted 3.5 years and was still hammering out brand new games at over 100 FPS on ultra at 1440p. I didn't even need to change it, TBH, and maybe I shouldn't have, but I am done spending now for at least 3 years, by which time I will probably be out of the race completely because the consoles will be awesome.

BTW, a source in the industry (someone I chat to on another forum) confirmed the specs today, so they are apparently 100% legit.
 
24GB

GPU offline rendering/lightmapping needs as much RAM as possible, so this will be better for game makers than game players.
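To put a rough number on that: texture and lightmap memory adds up very quickly. A quick sketch of an uncompressed footprint (the resolutions and counts are made-up illustration values, and the 4/3 factor approximates a full mip chain):

```python
# Rough VRAM footprint for a set of uncompressed textures/lightmaps.
def texture_mb(width, height, bytes_per_texel=4, mip_factor=4 / 3):
    """Approximate size in MB of one texture including its mip chain."""
    return width * height * bytes_per_texel * mip_factor / 1024 ** 2

# Hypothetical scene: 200 4K lightmaps plus 500 2K material textures.
total_mb = 200 * texture_mb(4096, 4096) + 500 * texture_mb(2048, 2048)
print(f"~{total_mb / 1024:.1f} GB")   # ~27.1 GB, so 24GB fills up fast for offline work
```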
 