RTX 3090 and RTX 3080 memory capacities confirmed - Expect BIG frame buffers

€1500 is a bargain?

For what you are getting? IMO yes.

You are getting an almost full blown Quadro.

Context. No one actually needs one of these for gaming. I would surmise the 3080 will be more than good enough for that.

Renderers? Streamers? People doing encoding, decoding and so on? I think it's a bargain, in context.

The 24GB serves nothing except the desire to have a halo product. A 16GB 3090 would've been a justifiable buy; now there are a few hundred bucks of air in the price for gaming.


Oh well, 3080 looks fairly good so far, hope it's going to undercut 2080 Ti significantly.

That is not true. The 24GB means the memory bus is not cut down, and that full bus is what gives it insane memory bandwidth.
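If anyone wants to sanity-check that, the back-of-the-envelope maths is simple enough. A rough Python sketch, assuming the rumoured 384-bit bus and ~19.5 Gbps GDDR6X (neither figure is confirmed yet):

# Rough peak memory bandwidth from bus width and per-pin speed (rumoured figures)
bus_width_bits = 384        # the full, un-cut-down bus
data_rate_gbps = 19.5       # GDDR6X speed per pin, in Gbit/s
bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8   # Gbit/s -> GB/s
print(f"~{bandwidth_gb_s:.0f} GB/s")                   # ~936 GB/s

Narrow the bus to cut the memory capacity down and the bandwidth drops in proportion, which is the point about it not being cut down.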

If you buy one of these for gaming you are quite literally bonkers, or you have too much money, or you have a huge ego.

Look at the facts. Dice is using what, a 5700 XT, right? I was using a 2070 and a 2070 Super for 1440p (ignore the 2080 Ti, I didn't pay for it!). That is and was more than enough, and mine cost £355 and £418 respectively.

Nvidia are worried about AMD, and so they are unleashing the beast. Like, totally letting the animal out of the cage. As usual, you.don't.have.to.buy.it. No one is forcing you, and there will be plenty of cards that will pump out games like a porn star for much less.

You are forgetting the people who will buy this for legitimate reasons: game devs, people who encode and decode, people who render, and so on. The workstation fell into the PC envelope years ago. You know, HEDT? Another thing that you simply do not need unless, well, you need it.
 
You only need so much bandwidth - especially with Nvidia's compression.


And I'd like that sort of processing power for strictly gaming, running Valve Index at 144Hz is fairly demanding, but of course dropping settings works fine.
 
This !

I refuse, again, to buy a GPU that is north of £800. The prices are getting ludicrous, and Nvidia can make all the excuses they want, but every 2080 Ti that sold at £1,200 was sold with a fairly large chunk of profit.

I know I'm quite likely piddling in the wind, but I really hope AMD and even Intel start coming out with powerful GPUs that bring the prices back down to sane levels, because the only reason Nvidia are selling GPUs at these prices is that there is no competition at that level.


They can do that because they are the best, the same way AMD is selling every single Threadripper CPU for quite a large sum. Until AMD offers some competition in the GPU market it will stay the same.
 
Then buy the 3080. It's meant for you.

The 3080 should be more than up to the task of that, IMO. If anything you are getting more than 2080 Ti power for £500 less.

It makes me LOL how many people get so mad over this. Really.

It's like stomping your feet and complaining that Ferrari don't make cars affordable for all. Or Rolls-Royce, etc.

From the sounds of it, the 3080 is going to be overkill. Total overkill, just like the 2080 Ti. Unless of course you absolutely insist on mega 4K gaming, in which case, get your wallet out.

Been there, seen it, learned it, got rid of it. I dumped thousands chasing the dragon. Not doing it any more.
 
No one is getting mad in the slightest. And I personally think we have re-entered the era of high refresh, not 4K. 4K is just a common thing now, and even still, the mainstream doesn't care for it; 1440p seems to be the common ground. I personally believe those wanting 4K as a priority are in the minority now. Most want the higher refresh rates.

Look at all the crazy high-refresh monitors now. Samsung is putting 240Hz panels on most new products, Acer/Asus have their 200Hz premium VA along with the world's first 360Hz 1080p panel, and other brands are following suit. It seems all new screens are 144Hz minimum, or 165Hz.

We just went full circle and returned to the refresh race. All I hope is that we don't move on to 3D Vision again :P Luckily we have VR to prevent that.
 
We are always going to have something pushed on us that apparently we can't do without. That is marketing. Had I listened to it, I would have bought a 4K 120Hz panel.

Why though? I don't need one, LOL. In fact, I went out and bought another 1440p panel with double the refresh rate of what I had, with much better features etc.

I chased the 4K dragon for years. It was really stupid. First I paid £550 for a crap 4K monitor (Acer, TN) and then I bought two Titan Blacks for £1,400 (sound familiar? Cool). If SLI worked it was enough. If it didn't, like in Wolfenstein, it was bad. 23 FPS. So I threw more money at the problem and bought another Titan Black for £430 used. Again, if SLI worked it was cool, but more often than not it didn't.

So I bought two Fury X for over a grand, which ran out of VRAM all of the time and crashed. When Crossfire worked, of course.

That was enough for me. I said uncle, wound the res down to 1440p and have been there ever since, regardless of what Linus or the rest of those prats tell me. I avoided high refresh for what felt like forever because, again, I didn't need that either. Main use case? Slow-paced RPGs. If you hack those to run at higher than 60 FPS they don't behave too well, and besides, it made absolutely zero difference to the game itself.

Doom was nice at 240Hz, but the cost of TN was simply too much. I gave that Alienware display to my brother two weeks ago and replaced it with my 70Hz 1440p HP at my mother's, with the new curved 144Hz monitor going at home.

I consider both to be suitable upgrades :) Doom is equally as fantastic at 144Hz, tbh.

4K is chasing the dragon. You become reliant on the big high-end GPUs, and I got priced out of that game ages ago. It didn't even bother me, because before I ran out of money for it I had downgraded to 1440p anyway and bought a Titan XP for £675, which was a lot of coin, but I had it for over three and a half years.

I could have sold it easily for £450, but I gave it to a friend of mine.
 
What about getting a 4K monitor, but using it at 1440p? Then you have the 4K option if you ever feel the need, or would like to try 4K down the line.

Like, for example, LG's new 27GN950 monitor, which is 4K at 144Hz. You can always go down in resolution on those monitors if you so wish.
Although, I'm not sure at all how that works in real life... Is this even possible, or will it just look like poop?
 
You took the words right out of my mouth; I was going to ask Alien the same thing.

I'm slowly wanting to level up my hardware, not right this minute but soon.


The monitor is what I have been pondering on: get a TV or a monitor? 32" is what I'm looking for in size. With these new GPUs I definitely want to upgrade my 1070, and if so I want to be able to get in on some 1440p action.
 
I've read that the 27GN950 doesn't have HDMI 2.1 or DP 2.0, so it can't actually do 4K/144Hz.
 
If that is true, then that is just so bad. Might even be false advertising? Or more likely a "technicality", such as with LG's 27GL850 monitor.

It's specced at 1440p with 144Hz and a 1ms response time, when in reality you need to use the "Fastest" preset on the monitor to achieve that 1ms, and doing so gives you ghosting etc. Dropping to the "Fast" preset, you get more like 5ms instead, but no issues.
 
Apparently the 27GN950 does 4K/120Hz without DSC, or 4K/144Hz with DSC.
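That lines up with the link budget, if you want to check the sums. A rough sketch, counting active pixels only at 8 bits per channel and ignoring blanking overhead (so purely illustrative):

# Can DisplayPort 1.4 carry 4K high refresh without DSC?
DP14_PAYLOAD_GBPS = 25.92    # HBR3 x4 lanes, after 8b/10b encoding overhead

def pixel_rate_gbps(width, height, hz, bits_per_channel=8):
    # raw RGB pixel data rate; blanking intervals are ignored here
    return width * height * hz * bits_per_channel * 3 / 1e9

for hz in (120, 144):
    need = pixel_rate_gbps(3840, 2160, hz)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "needs DSC"
    print(f"4K @ {hz} Hz: ~{need:.1f} Gbit/s -> {verdict}")
# 4K @ 120 Hz: ~23.9 Gbit/s -> fits
# 4K @ 144 Hz: ~28.7 Gbit/s -> needs DSC

In practice blanking and 10-bit HDR eat into that further, but it shows why a DP 1.4 monitor can only hit 4K/144 with compression.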
 
When I bought my 4K monitor it was TN; there were no VA panels or the like. It also came with severe caveats. Remember, this was still the Windows 7 and 8 days. Even though 8 had basic scaling, it was still awful. Many of my paid apps (I used to pay for Photoshop) refused to scale, and tbh I could not read the text, it was so small. Games like Fallout 3 did not scale the text, so you simply could not read it, and so on.

If anything it just taught me a very valuable lesson. Stop listening to s**t companies say.

The fact is that now I do not need a 4K monitor. Not for anything. I don't use the PC for work or my older hobbies (Photoshop and fitting loads of code on a screen), and thus dropping to 1440p is more than enough.

Your point is very valid though! Yes, running a lower res in gaming shouldn't be an issue at all. I just don't need a 4K desktop. 32" 1440p is the sweet spot for me, hence why I stayed there when I bought a new monitor (I just bought a curved one with double the refresh rate instead).

And yes, I could have run games at less than 4K on that horrid Acer, but the horrid Acer was horrid in absolutely everything I was doing at a desktop level.

I got off the 4K train then, and I am so glad I did. If my cards had the muscle, I would simply use DSR and run games at 4K. OK, so that is not quite full native 4K, but I tell you what, it looked better than 1440p.

I just think 1440p is still the perfect resolution and the absolute best all-rounder for pretty much everything. And it's cheaper when you come to buy a GPU too :D The 2070 was a drop from my Titan XP but I never noticed a thing. Maybe 10% tops in gaming performance, but with a 70Hz monitor you would never have even known.
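On the earlier question of whether a 4K panel run at 1440p looks like poop, the scaling arithmetic behind it is easy to check. Just an illustration, and whether the result bothers you is subjective:

# How does a 1440p signal map onto a 4K panel?
panel  = (3840, 2160)   # native 4K resolution
signal = (2560, 1440)   # 1440p input
scale = panel[0] / signal[0]
print(scale)            # 1.5 -> non-integer, so each source pixel covers 1.5
                        # panel pixels and the scaler has to interpolate;
                        # 1080p would be a clean 2.0x and map pixel-perfect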
 
The argument that $1500 is a bargain next to a Quadro I believe to be invalid. If you actually need a Quadro, unfortunately you just have to buy one. For companies that have large CAD departments, the IT department will most often go with an OEM that offers guaranteed support for the programs you intend to run, for instance the Autodesk suite. You buy or lease a workstation and a support package for a fixed term, which makes all the problems you might encounter someone else's, and the cost in lost productivity the problem of the supplier.

The Quadros exist in a very different market space to the consumer cards; I'm not sure how much the pricing of one affects the other. To use the price of a Quadro to justify the high price of a consumer card is like comparing a consumer truck/4WD with a prime mover.
 
Too one-sided: "IF you need a Quadro you unfortunately have to buy one".

You have missed "IF you don't need a Quadro you fortunately don't have to buy one".

What I mean is, if you are a workstation user and need as much power as you can throw at a problem (encode, transcode, streaming, rendering) then you don't need a Quadro. At all. The only differences are the driver and the support; that is why the high cost.

However, there are loads of uses for the fastest GPUs aside from gaming and everything else. Why do you think the Titan still sold in droves after they disabled DP?

I hate the use of the word "justify"; it's been proven wrong for years. When you are running a business it isn't about justification, it is about the ONE THING you cannot afford or buy, no matter how rich you are: TIME.

Read this next part very carefully and let it sink in.

There is no such thing as a dedicated workstation any more. It all fell into the HEDT envelope years ago, when they started pushing server and workstation gear onto wealthy gamers. As such, only the server market now remains independent.

Don't believe me? Why are there no Epyc CPUs in gaming machines? Why is there a Threadripper 3990X that goes into a board covered in dragons and crap? Why do they have RGB when they are for pro use?

They are not. They are for PC gamers and workstation HEDT owners. The people that NEED HEDT.

As such, we now have a good few GPUs that are only necessary for HEDT.

If you want to whine about Nvidia's prosumer cards then have a good whine about Threadripper whilst you are at it, and the eight-RAM-slot monster boards you can't afford either. You could use all of that for gaming, you know. Would you need it? No.
 
[Image: Sxjn83I.jpg - the rumoured performance chart discussed below]


And the first disappointment drops and derails the hype train.

"Up to four times the RT performance !" yeah, right. It can't even manage twice as much using RT AND DLSS.

Mooooooooooooooooooooooo !
 
TBF that was a rumour spread by someone on Twitter, IIRC.
 
That sounds pretty reasonable actually, since it seems to negate the performance hit from RTX.


However, the text seems a bit iffy - see how close Wolfenstein sits to the graphics card model names, and compare it to Control, where there's plenty of space.
 
Never trust anything about performance that comes straight from Nvidia; the reality is almost always very different.
 
They're always accurate, but they cover carefully picked best-case scenarios.

So they're fairly misleading with regard to real-world performance, but they can still help in figuring out the big picture.

However, I don't think that chart is from Nvidia, due to the text placement - a very easy mistake to make in Photoshop, but one that almost never happens with companies' slideshow templates and whatnot.
 
They don't help at all. Turing was crap at RT and it was never hidden: a game will totally collapse in performance with RT enabled. Had it stayed that way it would have had no future at all, especially when it looks weird. Why would you give with one hand just to have something taken away with the other?

So to me, twice the performance is nothing special. That means 60 FPS instead of 30, big whoopee, running a tech I won't run because I think it looks worse than without it.

Now, before anyone challenges me and I have to make another post, let me post this.

Ark

[Screenshot: azbJAd3.jpg - Ark with RT enabled]


Control

[Screenshots: ymUx0re.jpg, cl2N3Gf.jpg - Control with RT enabled]


At first I thought it was just me seeing these anomalies, so I tried it on my other PC. Same. Exactly the same. I figured it might be the loss of resolution when running RT, but then someone posted this today on OCUK, running 4K with no DLSS.

[Screenshot: xT9QPac.jpg - Control at 4K with RT enabled, no DLSS (posted on OCUK)]


Look around her head: it is gritty, as are the outline of her hair and the hair itself. There are gritty lines on both sides of the wall. There are two odd gritty lines in the green circle, radiating from her leg.

There are handles floating in the air.

That is what RT does to an image. That is without DLSS to boost the FPS.

I find that distracting. Especially when, before RT, 4K games with a tiny amount of anti-aliasing took ALL of that away. Now RT puts it back, for a few reflections and some lighting? You can keep it. It's all too distracting for me.

"All that glitters is not gold" comes to mind.

So they can shove their best case RT performance scenario, and if that chart is indeed real then they are hiding something.

Maybe they are hiding just why they are launching a 3090 and why they are so worried about AMD.
 