MSI GeForce RTX 2080 Ti and RTX 2080 Gaming X TRIO pictured

So SLI isn't dead, it's just been revamped. Interesting.

SLI is dead. The name, the way it worked, and so on. I would be amazed if Nvidia still call it SLI, but I suppose we will see.

What I find more interesting is the fact that there may be a 2080Ti already lined up. Are they going to launch the entire range all at once this time? Or get all of the "enthusiasts" to buy a 2080, then brainwash them into a 2080Ti at a later date?

Also, let me say this now, before they launch so much as a crumb of it... This is speculation mixed with opinion and some fact...

DO NOT fall for the Ray Tracing marketing (this is not aimed at you, Bart, but at everyone). From what I can tell there is one game coming out for it even remotely soon (Metro), and it won't make ANY difference to the gameplay. It will be years before we see proper, fully ray-traced games, and by the time we do your 2080/Ti will be woefully outdated.

Think of RT now like the PhysX PPU. Early games will be quite basic (like Mirror's Edge) but will have a cool factor. However, fast forward two years and that PPU could not even run the next PhysX title. Believe me, I tried.

By the time Mafia 2 came out it required much newer hardware, so those who paid £200 or so just to play Mirror's Edge did exactly that.

If you think a £500 price premium is worth it for Metro? Go right ahead. If not? Don't bother. The 10 series are perfectly capable of running any game on the market now and in the foreseeable future, because they are all cross-coded console games and will remain so. If Nvidia had gotten their GPUs into consoles? Then yes, we would likely see an RT version of the consoles, but we won't now.

Nvidia have nothing now. Nothing worth anything, so they are going to push RT as hard as they can. However, it is 99% a sales pitch designed to detach you from your cash, and nothing else. It will be years before we see properly ray-traced games.
 
SLI is dead. [...] DO NOT fall for the Ray Tracing marketing. [...] It will be years before we see properly ray-traced games.

I registered only to say that you won the internet for today. But unfortunately people will still buy it just to run those RT benchmarks... I have a 1060 6GB now, and I'm only waiting for 1080 prices to come down to upgrade.
 
I registered only to say that you won the internet for today. [...]

Just watching a video, and this guy reckons that the 2080Ti is cut down. So basically a Ti on mid-range silicon.

https://www.youtube.com/watch?v=ms7HQ7rckpA

If that is true, Nvidia are rinsing it for everything they have. It's also been pointed out in the comments that RT won't be a thing for a long while yet, but will anyone listen? I doubt it. Highly doubt it.

And the 2080 is 8% faster than a 1080Ti for more money than a 1080Ti, and that figure is probably based on a Founders 1080.

My Titan XP should be more than safe then, given it has always been at least 5% faster than any 1080Ti I have ever encountered (whether under air or water), and my TXP itself is under water.
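
Putting those two figures together (both anecdotal, so treat this as a back-of-the-envelope sketch, not a benchmark):

```python
# Back-of-the-envelope relative performance from the figures above.
# Both inputs are rumour/anecdote, not measured benchmarks.
gtx_1080_ti = 1.00                    # baseline
rtx_2080    = gtx_1080_ti * 1.08      # rumoured ~8% faster than a 1080Ti
titan_xp    = gtx_1080_ti * 1.05      # ~5% faster than any 1080Ti I've seen

print(f"2080 over a Titan XP: {rtx_2080 / titan_xp - 1:+.1%}")  # ~+2.9%
```

In other words, if both numbers hold, the 2080 would only be around 3% ahead of a Titan XP.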

So I need absolutely nothing for at least another two years then, as I suspected.
 
2080 and 2080 Ti? Seems odd they'd release the Ti model at the same time. Guess that means they'll be releasing something else new in around 6-9 months' time if we're going to have the Ti so soon.
 
Just watching a video, and this guy reckons that the 2080Ti is cut down. So basically a Ti on mid-range silicon. [...] So I need absolutely nothing for at least another two years then, as I suspected.

And here I am, gaming quite contentedly on my 4790k and Asus Strix 1070 at 1440p :P

Still not going to upgrade; there is just no point for me. Everything you stated is exactly what I believe on the ray tracing aspect of the cards, but the other main reason is that my CPU still has more than enough grunt to get me the FPS I want in any of the games I play. The only CPU that would be better is an 8700k (not remotely enough gain to care about swapping CPU/mobo/RAM) or an i9 series (ha! I'm not stupid; no way would I fork out that cash).
 
While RTRT will be very limited in its initial roll-out (every technology has to start with a single GPU series and a single game, though), it will become almost ubiquitous in much shorter order than technology like PhysX (if it should be compared to PhysX at all).

RTRT is a cross-vendor technology: AMD, NVidia and Intel will all almost certainly have baked-in hardware support for DXR and Vulkan's RT API, with the former two already having software support on their current mainstream GPUs and the latter famously having thrown billions at trying to get an RTRT GPU to market in the past. Used correctly, it will finally allow the removal of one of the biggest contributors to the uncanny valley (approximated subsurface scattering) as well as making many current rendering techniques obsolete (many are needlessly expensive on purely rasterisation-orientated hardware), while realtime refractions and many other elements of our world will finally be representable in realtime videogames for the first time ever.
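
To make the basic primitive concrete, here is a minimal ray-sphere intersection in Python (purely illustrative; real DXR/RTX hardware accelerates the equivalent ray-triangle tests and BVH traversal across millions of rays per frame):

```python
import math

def ray_sphere_t(origin, direction, centre, radius):
    """Distance along a normalised ray to the first sphere hit, or None."""
    oc = [o - c for o, c in zip(origin, centre)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c          # the 'a' term is 1 for a normalised direction
    if disc < 0.0:
        return None                 # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None   # hit must be in front of the origin

# A ray fired down +z from the origin hits a unit sphere centred at z=5 at t=4.
print(ray_sphere_t((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```

Everything from shadows to refractions is built out of queries like this one; the expensive part is the scale, not the maths.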

RTRT hardware is not meant exclusively for RTRT rendering itself. The mathematical processes used in RTRT and its denoising are useful for a wide variety of tasks (just as traditional GPUs are), and regardless of how long RT rendering takes to catch on, the hardware itself is incredibly versatile and will likely be used for many, many things we haven't yet considered.
 
SLI is dead. [...] DO NOT fall for the Ray Tracing marketing. [...] It will be years before we see properly ray-traced games.

Agreed with every letter, dude. :) I already see some morons (on THIS forum, sadly) falling for the RT garbage. Fools everywhere believe marketing and don't look at history. Nvidia are NOT innovators. All they've done is keep adding cores to the same old crap, dress it up in a new marketing dress, then add some "features" that they only PRAY developers are dumb enough to use. Add some VRAM with every gen and some power improvements, and hope people keep believing that the NEW stuff is SOOO much faster than the OLD stuff.
 
2080 and 2080 Ti? Seems odd they'd release the Ti model at the same time. [...]


More likely the 2080Ti is what the 2080 was originally going to be, but Nvidia wanted more money.
 
2080 and 2080 Ti? Seems odd they'd release the Ti model at the same time. [...]

From the sound of it they are set to release a whole range on Turing (cut-down Volta, I would assume) and *then* maybe another entire range based on the whole thing.

And there was me wondering how Nvidia could possibly keep people buying into the future, given the GPU market is completely stale. If that is true, then they will have more than enough products made from cutting up others to last them a good while yet.

As I said though, it's all RT marketing. They have dangled this carrot a few times now, and no one has bitten. Pretty brave of them to basically bet everything on RT, even though they are going to have to seriously brown-nose every single dev to get them to use it in their games.

LOL, all we really need to do is look at other tech they've tried this with over the years to see where it's heading... CUDA cores, yup. DP, erm, yeah. PhysX, dead and buried. SLI, dead. 4K, boring.

Now I can see why they pulled DP out of Maxwell and kept it hidden. It all suddenly makes perfect sense: rip something out of your cards, cut them back to mid-range, then hold onto it for the future.

Also, please read this. It's quite important!

Please do watch the Adored video. I know he goes on and on sometimes and it does get old, but there is a section of that video that basically tells the tale...

OK, so in it he shows this (actually very good, but sadly flawed as of typing this) technology where Nvidia can "de-noise" or "despeckle" RT very quickly by cutting corners. What I mean is, they can basically use some clever tricks to de-noise RT scenes quite quickly.** However, as he demonstrates, there is still lag even when you use this trick. The main RT scene without it takes ages to render. Nowhere near fast enough.
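
If it helps, here is a toy sketch of that de-noising idea (my own illustration, nothing like Nvidia's actual AI de-noiser, which is a trained neural network): trace far fewer rays per pixel, accept the speckle, then filter it out.

```python
# Toy "render cheap, then de-noise" demo. A Gaussian blur stands in
# for the real (learned) de-noiser, purely to show the principle.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
clean = np.tile(np.linspace(0.0, 1.0, 256), (256, 1))   # "ground truth" image

def render(samples_per_pixel):
    """Monte Carlo estimate: noise shrinks as 1/sqrt(samples)."""
    noise = rng.normal(0.0, 0.5 / np.sqrt(samples_per_pixel), clean.shape)
    return clean + noise

expensive = render(1024)                     # many rays/pixel: slow but clean
cheap     = render(4)                        # few rays/pixel: fast but speckled
denoised  = gaussian_filter(cheap, sigma=2)  # cheap filter recovers most of it

for name, img in [("expensive", expensive), ("cheap", cheap), ("denoised", denoised)]:
    print(f"{name:9s} RMSE vs truth: {np.sqrt(np.mean((img - clean) ** 2)):.4f}")
```

The catch, as the video shows, is that even the filtering step has a real cost per frame, which is exactly where the lag comes from.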

However, here is my analysis. Firstly, any demo Nvidia show you of RT will be using four Quadros linked together. I am beginning to suspect (though I could be wrong) that this new link tech they have is their scalable architecture, where you can basically connect GPUs together, but more like Infinity Fabric than any SLI. So theoretically you connect two together and you end up with double the CUDA cores, double the "Tensor" cores, and so on.

Now if that is how it works (and it should; they promised scalable right around this time), then think about this... If they are using four Quadros "linked" (however that linking works, and they are, because they have already demoed it on that system) and they cannot render the scene fast enough to keep up with someone moving it around, even with this trick, then exactly what hope does one cut-down GPU, nowhere near as quick as one of those Quadros, have of ever rendering a full RT scene fast enough?

Please do take some time to think about that.

** As I said, four Quadros, even using this trick, are not fast enough.

It will be many, many years before we see fully rendered RT scenes, even using this (quite amazing, TBF) trick.
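
If you want that argument as rough numbers (every figure below is made up for illustration; only the shape of the conclusion matters):

```python
# The scaling argument above as crude arithmetic. All inputs are
# hypothetical; the point is how far short a single card falls.
four_quadro_fps = 15.0                   # suppose 4 linked Quadros still miss 30 fps
one_quadro_fps  = four_quadro_fps / 4    # assume perfect 4-way scaling (generous)
consumer_fps    = one_quadro_fps * 0.7   # a cut-down card slower than one Quadro

print(f"One cut-down consumer card: ~{consumer_fps:.1f} fps")  # ~2.6 fps
```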

So RT, and indeed RTX? Just made to part people with their cash. To make others feel inferior ("Oh, you only have GTX?") and so on. However, if you do some pretty deep digging you soon realise that RT may as well stand for "Right Tw*t", because that is what you will be if you pay for it now.

I've seen the RT in Metro, and it was just a table lamp with a glow around it. They have a snowflake's chance in hell of making that game run fully in RT with the tech they have now, and lest we forget, they (the people making Metro) came out and said that it was basically a GPU cooker.

That doesn't exactly fill me with confidence.
 
I'm so glad I haven't ordered my new rig yet. I'll wait to see the release window and price for the Ti before I make a decision. All I know is I want two of them if Z370 boards support this new version of SLI.
 
LOL, all we really need to do is look at other tech they've tried this with over the years to see where it's heading... CUDA cores, yup. DP, erm, yeah. PhysX, dead and buried. SLI, dead. 4K, boring. [...]

You forgot their insane push for 3D GAMING; yeah, that was a thing too.


I kinda want to upgrade from my 970, but I still don't think the new series will be good enough in terms of price, and all this hype stuff doesn't exactly help pricing either; it will increase demand at launch, and knowing how crafty some shops are, they will keep that inflated launch price.


GPUs are so boring now.
 
GPUs are so boring now.

Yup, that is 100% correct. So what do Nvidia do? Try to make it exciting with all of their marketing by using RT, a tech still many years away. And they are going to try to sell you the next decade's worth of GPUs with that carrot.

A carrot Jen can, IMO, shove up his little arse.

If this were something universal (more chance of pigs flying) and applied to all gaming across the entire spectrum? Then fair play; even at this embryonic stage I would be quite excited. But it isn't, is it?

It's just a re-run of history. The first Metro game used PhysX and basically shamed any rig it was run on. Problem was, the original wasn't even a very good game. It was then used for many years as a benchmark, and it was many years before you could max it out.

All sound familiar? Good, because that is what is happening now. Another proprietary tech for half of the PC gaming market (not as in numbers, but choice) that only runs on a PC and needs full support to do so.

Now let's swish that around in our collective mouths for a minute... How many games have we seen since Crysis that are *only* for PC because consoles simply could not run them? *puts on Elvis Presley voice* "Thank you very much"... none.

Why? Because anyone that does so loses many millions in sales. It would take someone incredibly brave, and no one is brave enough, because they are greedy.

So here is what will happen. Over the next few years we will see about two games that use RT, and they will be barely any better than the game in its regular form, which, btw, will again hinder RT development, as devs have to do loads of stuff twice (because us lower-class poggers stuck on GTX and eating bogeys will not be able to render that lamp using RT).

Sound like a lot of work? Good. Do devs like more work? Lmao, no.

I will be amazed if this tech ever gets a foothold. Not because it isn't good (it is; genuinely amazing) but because it needs support and only works on one GPU type in one market.
 
The amount of ignorance in this thread is overwhelming. Referring to this technology as proprietary, NVidia-exclusive, an NVidia first, etc. is flat-out incorrect. We're discussing the oldest and arguably most widely used rendering technique in the book, and realtime applications of the technology are built on decades of public, cross-vendor research and development. NVidia's RTX technology is a hardware implementation supporting a cross-vendor API developed through collaboration between AMD, NVidia, Intel and Microsoft, and it will almost certainly be the major selling point of the next generation of game consoles (allowing a generational leap in graphics at a relatively low hardware cost). This technology WILL be ubiquitous across GPUs and game consoles by the next generation of either; the wheels for this have been in motion for a long time, and it is the result of rarely-seen collaboration between many parties in the industry.

RT techniques significantly reduce the developer workload for a wide variety of applications, and given the significant cost and time savings they enable, next-generation games will almost certainly rely on them entirely, reverting to a software implementation of the RT pipeline if no hardware acceleration is available.
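
As a sketch of what that fallback might look like in engine code (every name here is a hypothetical stand-in, not any real API):

```python
# Prefer the hardware RT path when the driver reports support,
# otherwise run the same ray queries in software. Hypothetical names.

def hardware_rt_supported() -> bool:
    # A real engine would query the graphics API here, e.g. the DXR
    # feature tier in D3D12 or the ray tracing extensions in Vulkan.
    return False  # stand-in value for this sketch

def trace_rays_hw(scene, rays):
    raise NotImplementedError("dispatched to RT cores on supported GPUs")

def trace_rays_sw(scene, rays):
    # Same ray/triangle queries on compute shaders or the CPU:
    # an identical image, just much slower frame times.
    return [scene.intersect(ray) for ray in rays]

trace_rays = trace_rays_hw if hardware_rt_supported() else trace_rays_sw
```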

It's worth noting that the RTRT hardware required for many of these effects is a tiny fraction of what's required to render a whole scene (which can be done with a single RTX Quadro, as opposed to the four Voltas required before), meaning even the smallest degree of RTRT acceleration would likely be valuable for next-gen gaming.
 
I don't think Nvidia are forcing RT down our throats with shiny new cards knowing it has no future, or a future that's years down the line and will only be adopted by force of hand against the grain of consoles, just to sell cards, when they could sell a Pascal refresh all the same. I know very little about ray tracing, and I don't want to be the sheep that follows the guy with the bigger words, but what tgrech says makes sense to me. AMD have their own RT technology, and it makes sense to use that for consoles, since it's a technology that could seriously push graphics to the next level. In many ways graphics have slowed down dramatically over the last ten years. Because of that, no matter how hard RT will be to implement across the board or how hard it will be to run, the market is clamouring for something new, and it looks like Nvidia want to be first to the punch. If it's also 'scalable', that makes sense as well.
 
Agreed with every letter, dude. :) I already see some morons (on THIS forum, sadly) falling for the RT garbage. Fools everywhere believe marketing and don't look at history. [...]

Calling us morons and fools. That's definitely the way to treat family.
 
From the sound of it they are set to release a whole range on Turing (cut-down Volta, I would assume) and *then* maybe another entire range based on the whole thing. [...] I've seen the RT in Metro, and it was just a table lamp with a glow around it. [...] That doesn't exactly fill me with confidence.

Well, whatever they bring out, I'll be steering clear of it. I have 21 games for my PS4 Pro and have so far only played a few hours of a few of them, all exclusives. And I'm working 6 days a week for at least 12 hours a day, so I have plenty to keep me busy, and in November I'm going to Texas for 3 months and will hopefully get an extra 3-month extension so I'll stay for 6 months total. So a new GPU is far far far far far far down my wants-and-needs list ^_^
 
I've seen the RT in Metro, and it was just a table lamp with a glow around it. They have a snowflake's chance in hell of making that game run fully in RT with the tech they have now, and lest we forget, they (the people making Metro) came out and said that it was basically a GPU cooker.

That doesn't exactly fill me with confidence.

This, exactly this, is what I referred to earlier. They've done it in the past with Lost Planet, with DX10 I believe. I'm extremely skeptical and watching from afar, to be honest.
 
The amount of ignorance in this thread is overwhelming.

It certainly is. If you read the posts on here, you would understand how incredibly jaded many of us are with games in general, and with "gaming PCs" even more.

So you can understand our pessimism, I assume.

Here we are, being sold a technology a year or two after some of us paid £1300 for a GPU, being told that our GTX cards are basically old hat and useless going into the future. Oh, but that's OK: all you need to do is spend another £1300 to fix that.

Yeah, right.
 