RTX 3090 and RTX 3080 memory capacities confirmed - Expect BIG frame buffers

It's less than 24 hours until we get all the buzzword marketing BS. As for some of the slides coming out, they look far from right, so I wouldn't base anything on them.

DXR is only just going mainstream in the next few months, so everything before these coming games you could pretty much class as beta software.

But despite Nvidia's stream tomorrow finally giving us something solid info-wise, I'm actually far more interested in AMD, because damn, they are doing a really good job of saying nothing. So good, in fact, that Nvidia should be, and no doubt are, really worried.
 
It is. Compared to the Titan RTX and Quadro RTX 6000 it is a no-brainer purchase for a production system.

There's the caveat. The RTX 3090 will likely be marketed as a gaming card, right? But like the Radeon VII, it will only be a "bargain" in production applications, not in what it was designed and marketed for. If everyone knew that, cool. But they don't.
 
Then surely they will have the common sense to check their bank account, dude?

Let's look at an X399 motherboard just briefly.

[Image: an X399 motherboard]


There. Not for gaming at all, way above many users' pay grade, yet smothered in logos and RGB.

It's like I said: the workstation-class market (i.e. computers people use for design etc.) and the gaming market all fell into one when HEDT was born. Before all that? I used to run workstation-class hardware (dual Xeon, etc.). And it was ugly and green, and no use for gaming or anything else trendy.

I guess it is just easier for companies to do it that way. A smaller product stack, and fewer names, means lower production costs.

The Titan was labelled as a gaming card the second they disabled the DP (FP64 double precision) units. However, it was still a very powerful tool for those looking at high-end rendering without paying for all of the Quadro features, drivers and support.

People stopped buying Titans at the launch of the RTX Titan (well, in fact no, that isn't true; they stopped when Volta came along). It's just too much money to show off with. Totally not worth it at all.

It's kind of a catch-22, I suppose. People demand massive high-end gaming cards, yet Nvidia know there isn't much they can do to stop professionals using them for their power in other ways. So the two markets just blended into one.

It just takes a little common sense is all. Something I have used for years when spending my own money. An RTX 2070 Super (for my mum's house) and a 2070 for home were my upgrades, at £418 and £355 respectively. Both are more than good enough for pretty much every 1440p game. People just don't realise they don't need anything more than that.

Edit: lest we forget, you can actually game on a Quadro card, or even a Tesla if you know what you are doing and pay the licences for passthrough support etc. However, what poser PC user is going to spend £5k on a GPU?

There are limits to this madness. People do have limits. The number of 2080 Ti owners on OCUK compared to 1080 Ti owners is laughable. Sales didn't go down; forum posers did.
 
People stopped buying Titans at the launch of the RTX Titan (well, in fact no, that isn't true; they stopped when Volta came along).

Still got my Titan V and RTX Titans. :p

The Titan V makes every card in the Turing family, including the pro cards, look really bad when it comes to DP work.
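
If you want to see that gap for yourself, a rough sketch like the one below will measure it (assuming PyTorch with CUDA; the matrix size and iteration count are arbitrary). On a Titan V (roughly 1:2 FP64:FP32) the FP64 number holds up; on Turing GeForce cards (1:32) it collapses.

[code]
# Rough FP32 vs FP64 matmul throughput check (assumes PyTorch with CUDA).
import time
import torch

def matmul_tflops(dtype, n=4096, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    torch.cuda.synchronize()          # make sure setup is done before timing
    start = time.perf_counter()
    for _ in range(iters):
        a @ b
    torch.cuda.synchronize()          # wait for the async kernels to finish
    elapsed = time.perf_counter() - start
    return (2 * n ** 3 * iters) / elapsed / 1e12  # matmul is ~2*n^3 FLOPs

print(torch.cuda.get_device_name(0))
print(f"FP32: {matmul_tflops(torch.float32):.2f} TFLOPS")
print(f"FP64: {matmul_tflops(torch.float64):.2f} TFLOPS")
[/code]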

As to your points about ray tracing, I totally agree: it has been an epic failure and a very bad example of "The Emperor's New Clothes".

I don't think Ray Tracing will really take off until we see the high end Hopper cards in a couple of years.
 
RT may have been hit and miss in games, but it was pretty much beta testing, bolted onto games that weren't even properly developed for DX12, let alone DXR.


Where RT and Tensor cores absolutely shine is Nvidia's OptiX renderer. It demolishes the CUDA backend, and AI denoising in renders is magical.
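
For anyone wanting to try that, here's a minimal sketch of flipping Cycles over to OptiX plus the AI denoiser (assuming a recent Blender and its bundled Python API, bpy; property names are from newer builds and can differ between versions):

[code]
# Minimal sketch: switch Blender's Cycles renderer to OptiX (assumes a
# recent Blender; run as: blender --background scene.blend --python this.py).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"        # RT cores instead of the CUDA backend
prefs.get_devices()                        # refresh the detected device list
for device in prefs.devices:
    device.use = (device.type == "OPTIX")  # enable only OptiX-capable GPUs

scene = bpy.context.scene
scene.cycles.device = "GPU"
scene.cycles.use_denoising = True
scene.cycles.denoiser = "OPTIX"            # Tensor-core AI denoising

bpy.ops.render.render(write_still=True)    # render and save the result
[/code]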


Proper RT support will come in new releases, but we wouldn't have those if it weren't for the so-called "failed" implementations.
 