Nvidia officially reveals their RTX 2060 Graphics Card

"better performance than the 1080 Ti" he said lol!! I must have transfered universes in my sleep... Such marketing. I'm always iffy towards his keynotes.
 

Yeah, somebody needs to follow him with a speech bubble that says (in specific use cases).

I remember him comparing it to a GTX 1070 Ti, but I don't remember the GTX 1080 Ti comparison.
 


He compared it with the 1070 Ti for sure, but I think he also said that one of the announced 2060 laptops (the Acer one?) is faster than a desktop 1080. I'm not 100% sure, to be honest.
 
This actually isn't a bad card. I mean, it's still too expensive, but at least it isn't as bad as the rest of them, and it has DLSS and RT capabilities, however limited those may be. Why would you pay £500-600 compared to £350-450 for only slightly better performance? You can overclock the RTX 2060 to be, for all intents and purposes, as fast as an RTX 2070.
 
It'll be better than a 1080 a large portion of the time once DLSS starts getting used more. But banking on new technology to benefit other new technology isn't the smartest thing in the world, so it's probably still not worth upgrading.

I'll probably still wait for Turing 2.0 or whatever. Hopefully AMD releases something special that's at least 1080 Ti/2080-level performance.
 
It's priced that way only because AMD doesn't have a horse in the race. A 1440p card is not worth $350 in 2019.
 
About the only thing I hate about this card? No NVLink, which could have given it a fighting chance in RT-based games. I suspect the 1160 will be better value.
 

It does leave me confused why... unless performance vs value would make a 2060 SLI more worthwhile than a single 2070, or maybe even a 2080?

I would have thought Nvidia would like to entice people to double up on their cards.
 

Yeah, but they are assuming that you will buy a 2080 or better for 3X more money.

It's daft.
 
NVidia hasn't enabled SLI on lower-end cards for a while. I've had a few rants recently about the issues (and the illusion of better performance) with low-end cards/low single-card frame rates in dual configs and frame/input latency, so I won't go into that here, but that's the technical justification for keeping SLI to high-end products (it in turn improves consumer perception and reduces the amount of work NVidia has to do on support & drivers).

Secondly, to be fair, NVLink is a fairly expensive and wide protocol with expensive links meant mostly for enterprise environments, and adding that functionality to TU106 silicon likely would have driven the cost up further (if you look at Quadro cards based on TU106, as well as the 2070 of course, they also lack NVLink, suggesting it's an inherent limitation of the silicon).

But we all know there's more to it than that. If you pair two 2070s you get a CUDA core count and theoretical performance beyond that of a 2080 Ti, which costs significantly more; similarly, two 2060s would have 1,000 more cores than a single 2080. So keeping NVLink off their actual bread-and-butter devices reduces the risk of cannibalising their high-margin enthusiast products with their lower-margin mainstream products, which are already in higher demand. The markup on high-margin products generally rises steeply with their cost of production (in any industry) because of the increased risk, upfront costs, inventory value, etc., so that's a fairly important thing for them to want to avoid given their current risky situation with excessive inventory value.
 
Currently a single 2060 is plenty for 1080p/1440p; for 4K, 6 GB of VRAM isn't exactly ideal. 2060 SLI serves no real purpose in gaming outside high-refresh-rate 1440p, but then you'd ideally want a single powerful GPU for better frame pacing.
In the future the extra computational oomph might come in handy at lower resolutions, but at that point the next-gen cards are the better option.
I might be overly critical since my 760 SLI setup was... less than stellar. :D
 

Think from the POV of a Twitch gamer who lacks funds, though. 2060 SLI would have been a perfect solution for them, given that high fps + low input lag > everything.
 
I don't see how a single 2060 wouldn't work just fine for streaming any of the popular games outside 4K. Due to the decent encoder they could get away with a weakish CPU as well.
 
SLI setups don't improve input lag over single cards. An SLI setup has the same latency as a single-card setup (same card) running the same task (presumably the single card runs at around half the frame rate of the SLI setup, though), or double the input latency of a single-card setup (twice-as-powerful single card) at the same frame rate. The time to render a frame on each card remains constant, meaning a perfect SLI setup technically always has one frame of lag.

The time between frames obviously halves if you double the frame rate, but the time taken to render frames (the contributing factor to input lag) only halves if you double the frame rate by producing frames in half the time (obviously), which SLI doesn't do: it takes two cards producing frames at half the rate they're displayed, rendering them alternately.
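A toy model of that alternate-frame-rendering (AFR) argument, with an assumed render time picked purely for illustration: doubling the cards doubles fps, but the per-frame latency never moves.

```python
# Toy model of alternate-frame rendering (AFR), assuming an idealised
# pipeline: render time per frame is fixed, and cards stagger their starts.
# RENDER_TIME_MS is an illustrative number, not a measured figure.

RENDER_TIME_MS = 33.3  # time one card needs to render a frame (assumed)

def afr_stats(num_cards):
    """Return (fps, per-frame latency in ms) for an ideal AFR setup."""
    frame_interval = RENDER_TIME_MS / num_cards  # displayed-frame spacing
    fps = 1000.0 / frame_interval
    latency = RENDER_TIME_MS  # each frame still takes the full render time
    return fps, latency

fps1, lat1 = afr_stats(1)
fps2, lat2 = afr_stats(2)
print(f"1 card:  {fps1:.0f} fps, {lat1:.1f} ms per frame")  # ~30 fps, 33.3 ms
print(f"2 cards: {fps2:.0f} fps, {lat2:.1f} ms per frame")  # ~60 fps, 33.3 ms
```

The frame rate doubles while the render latency column stays identical, which is the one-frame-of-lag point in numbers.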
 
Huh, never thought about it that way but it's pretty obvious when you point it out.


Though there's a solution, split frame rendering! So you can solve input lag and bring in thousands of other problems. :lol:
 
I decided to (procrastinate) by drawing a diagram if it helps anyone.
[Attached diagram: frame timings for single-card vs SLI alternate-frame rendering]

[EDIT: Those calculations are missing a milli/10^-3 factor on the frame timings, i.e. 1/(16.7*10^-3) ≈ 60]
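A quick sanity check of that correction, just restating the arithmetic from the edit note: a frame time quoted in milliseconds has to be scaled by 10^-3 before inverting to get frames per second.

```python
# Converting a frame time in milliseconds to frames per second.
# Forgetting the 1e-3 factor gives a nonsense answer (1/16.7 ≈ 0.06 "fps").

def fps_from_frame_time(ms):
    return 1.0 / (ms * 1e-3)

print(round(fps_from_frame_time(16.7), 1))  # 59.9, i.e. ~60 fps
```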

You get a similar phenomenon whenever you attempt to gain throughput increases through parallelisation alone, regardless of the technology/application. If you had an Ethernet cable transmitting packets, say, 1,000 times per second, adding a second one would double the amount of packets (data) you can get per second, but the latency (time to destination) of each packet remains the same.
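The same point as a tiny sketch, with made-up link numbers: aggregating identical links scales throughput linearly while per-packet latency is untouched.

```python
# Throughput vs latency when parallelising identical links.
# Both constants are illustrative assumptions, not real Ethernet figures.

PACKETS_PER_SEC = 1000   # one link's send rate (assumed)
LINK_LATENCY_MS = 5.0    # time for a packet to reach the destination (assumed)

def aggregate(num_links):
    throughput = PACKETS_PER_SEC * num_links  # scales with link count
    latency = LINK_LATENCY_MS                 # unchanged per packet
    return throughput, latency

print(aggregate(1))  # (1000, 5.0)
print(aggregate(2))  # (2000, 5.0) -- double the data, same time to destination
```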
 
Well well... Seems GamersNexus benchmarked this card in Battlefield V at 1080p with RTX off/low/high, and the average FPS results were 104, 66, and 55 respectively.

For a 2060 card, that is a pretty good performer in my eyes.
 

The problem is the bottom line, which for all of the reviewers was "we would rather play BFV with RTX off for better visual quality and much higher FPS for competitive play" (which is what most people play BFV for).

BFV @ 1080p @ medium settings with RTX on = 60 FPS or worse really doesn't bake my cake.

It is a good performer with RTX off, but that begs the question: why pay £100 extra for RTX (which you are) if you don't use it?

The 1160 is a much better proposition, IMO. 1070-or-better performance for last gen's 1060 price (i.e. £249) would be corking. Not £350+ for this. The 1070 has been easily available second-hand lately for £250-£280.
 