Nvidia is reportedly working on a GTX 1660 Ti/GTX 1160

Or both sources are true and Nvidia is truly working on a 1160 AND a 1660 Ti. Definitely somewhat confusing, but I like the idea of not having to spend a huge amount of dosh on raytracing when I don't necessarily need it.

Is DLSS also done by the RT cores, or could that card (or cards) support DLSS as well? That would be a nice addition.
 
I think the naming scheme is fine, tbh. I just hope these aren't totally derped and overpriced, but I don't think my hope will carry any weight. I still reckon they'll be derped and expensive.
 
Nvidia has a huge supply of unsold GTX 1060 3GB and 6GB cards. They need to get rid of them before opting to create a 2060; launching a 2060 now would absolutely destroy their push to sell off the remaining GTX 1060 stock.

I would NEVER buy these cards unless they're a bargain (which, with Nvidia, will never happen).

Nvidia just wants to scam people.
 
DLSS is done with Tensor cores, not the RT cores.
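For anyone wondering what the Tensor cores actually do: each one performs a small fused matrix multiply-accumulate (D = A×B + C on 4×4 tiles, FP16 inputs with FP16/FP32 accumulation), which is exactly the kind of arithmetic a DLSS-style neural network is built from, while the RT cores instead accelerate BVH traversal and ray-triangle intersection. Very rough plain-C++ sketch of that per-core operation (floats here for readability; the real hardware works in half precision):

```cpp
#include <array>
#include <cstdio>

// A single Volta/Turing Tensor core computes D = A * B + C on 4x4 tiles,
// with FP16 inputs and FP16/FP32 accumulation. This is only a plain-float
// CPU sketch of that operation, to show the kind of math DLSS relies on.
using Tile = std::array<std::array<float, 4>, 4>;

Tile mma_4x4(const Tile& A, const Tile& B, const Tile& C) {
    Tile D{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float acc = C[i][j];              // start from the accumulator tile
            for (int k = 0; k < 4; ++k)
                acc += A[i][k] * B[k][j];     // multiply-accumulate
            D[i][j] = acc;
        }
    return D;
}

int main() {
    Tile A{}, B{}, C{};
    for (int i = 0; i < 4; ++i) { A[i][i] = 1.0f; B[i][i] = 2.0f; C[i][i] = 0.5f; }
    Tile D = mma_4x4(A, B, C);
    std::printf("D[0][0] = %.1f\n", D[0][0]);  // 1*2 + 0.5 = 2.5
    return 0;
}
```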
 
I wouldn't be surprised if the remaining 1060 inventory got a 1160 rebrand/reconfiguration, possibly like the 600 series -> 700 series changes. Assuming the RTX 2060 is the lower-binned version of TU106 (given the 2070 is the full/perfect implementation), I guess it's also possible they'd want to create an RT-less version of TU106 to get the same traditional shader performance with minimal die space (and therefore cost) and power use. That would presumably eventually also come in both "perfect" and "imperfect" bins, which could be the 1660 Ti with an eventual non-Ti (using the rumoured TU116 die).
 
Or maybe the GTX 1070 will become the GTX 1160? The GTX 770 was successful and was just a more refined GTX 680.
 
Afaik 1070s are at the "once they're gone, they're gone" stage, with no major build-up of old inventory unlike the 1060 parts; plus it'd probably compete too directly with TU106 and the possible TU116 cut-down bins.
 
AMD managed to develop a whole new series of cards by printing loads of labels with the number 3 on them; all Nvidia need to do is replace the second 0 with a 1.
 
It's a shame they didn't release GTX versions of the 2070, 2080 and 2080 Ti with no Tensor cores or RT cores for those of us who really don't want either and just wanted the raw horsepower of more cores and a newer architecture. That way they could still sell RT cards and non-RT cards, and double-dip the market, so to speak.
 
Not every 300 series card was a refresh, tbf (it was Tonga's proper desktop debut, unless you're counting the OEM 285).

The reason Nvidia aren't making RT/Tensor-less dies for the high-end 2000 parts is likely that the extra design work and cost of taping out and manufacturing a new line isn't worth it for what would essentially be slightly faster GTX 1070/1080/Ti's that won't support many upcoming industry standards. Even the TU116 die is rumoured to keep the Tensor cores, which does make a lot of sense for future performance. Plus, they're probably more concerned with their 7nm designs at the moment. Technically their largest and most expensive card is still RT-less (as is everything they sell under the £400 mark, i.e. pretty much all of the market in terms of user share), but that's not really a gaming card.
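To put "supporting the upcoming industry standards" in concrete terms: under DXR a game simply asks the D3D12 device which raytracing tier it exposes and falls back to the ordinary raster path when the answer is "none", which is exactly what an RT-less 11xx/16xx part would report. A minimal sketch (Windows/D3D12 only, error handling and device creation omitted):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Ask an existing ID3D12Device whether the hardware/driver exposes DXR
// (DirectX Raytracing). Cards without RT support simply report
// D3D12_RAYTRACING_TIER_NOT_SUPPORTED and the game takes its raster path.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;  // older runtime/driver: treat as no raytracing support

    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}

// Usage sketch: pick the lighting path once at startup.
void ChooseLightingPath(ID3D12Device* device)
{
    if (SupportsDXR(device))
        std::puts("Using the raytraced lighting path");
    else
        std::puts("Falling back to the pure shader/raster path");
}
```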
 
Yeah, I agree. I think Turing is more of a stop-gap, a way to introduce ray tracing and DLSS to the market. Less of a "this is the best of what we have" and more of a "this is just a taste of what's to come". And the prices are so high because there's no competition, the parts are massive and expensive to manufacture, and they wanted to offload Pascal stock.
 
The reason Nvidia aren't making RT/Tensor-less dies for the high end is that they're afraid those would sell in large volumes at the expense of the more expensive cards with the features enabled.
 
They wouldn't be able to just fuse off the RT cores as if they were CPU cores or something; they're already integrated within the SM, so you either design the die with them and have them, or design it without and don't. Spending more money creating a new design that directly competes with your old inventory clearance, can never match it on price, offers no tangible performance benefit, and competes with the cards you've put on the market specifically to get the new wave of industry-standard APIs into full swing isn't just a shot in the foot but likely damaging to the PC gaming ecosystem in the long term.
 
It would have been dead easy for Nvidia to have decided, when designing the 20XX-series chips, to produce dies both with and without RT/Tensor cores. Yes, it would mean having two separate designs, but if planned for it would not have been a big deal.

What Nvidia are doing, forcing people to buy these features whether they want them or not and charging a small fortune for it, is absolutely disgusting. People will remember this when AMD and Intel offer serious competition, as all Nvidia have done with their greed and stupidity is make enemies.
 
Nvidia isn't forcing anyone to do anything. If you choose to buy a high-end GPU but don't want to pay the premium for RT, then buy a 1000-series card; they're basically what you're asking for, given that outside of the addition of Tensor and RT units the shaders themselves haven't really changed meaningfully between Pascal, Volta and Turing.

People will remember this when AMD and Intel roll out their MI/RT-capable GPUs with their next generation: they'll remember Nvidia as leading the vanguard and laying the foundations for all GPU vendors and game companies when it comes to implementing neural-network-accelerated graphics pipelines and raytraced lighting. A mature but light RT implementation should in theory perform better than a pure shader-based one as developers find better ways to balance the workloads concurrently and make use of the whole die simultaneously without one portion bottlenecking the other (expect this next gen).

The request for high-end GPUs that tackle performance issues by just throwing more of the same cores/power at the problem might seem desirable in the short term, but the industry, like every other form of processor ever, can't continue on that path. At some point someone had to bite the bullet and start the shift towards smarter, more efficient, more accurate pipelines, even if that means a lack of speed-up in most traditional/legacy code (not really an issue for a high-end GPU today, which can generally max out most games) and the fruits of the labour not being realised for months to years. Whichever company had stepped out to make that move first would have encountered exactly the same chicken-and-egg problem that Turing is attempting to deal with.

If it weren't for the changes and extensions to x86 that added all sorts of new units and instructions to the architecture, which offered no benefit to legacy code and required recompiles or rewrites to gain anything, the CPU industry would have ground to a halt in the early 90s.
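Same pattern as those CPU extensions, really: old binaries never touch the new units, and new code has to explicitly opt in, usually with a runtime check and a legacy fallback. A small sketch of how that's commonly done today (GCC/Clang builtins, AVX2/FMA picked purely as an example of a "new" instruction set):

```cpp
#include <cstdio>

// Baseline path: runs on any x86-64 CPU and gains nothing from newer extensions.
float dot_scalar(const float* a, const float* b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += a[i] * b[i];
    return sum;
}

// AVX2/FMA path: the compiler is only allowed to use the wide units here,
// and we only call it when the CPU actually reports the feature at runtime.
__attribute__((target("avx2,fma")))
float dot_avx2(const float* a, const float* b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) sum += a[i] * b[i];  // compiler may auto-vectorise this with AVX2/FMA
    return sum;
}

float dot(const float* a, const float* b, int n) {
    // Legacy code never asks, so it never benefits; new code has to opt in.
    if (__builtin_cpu_supports("avx2"))
        return dot_avx2(a, b, n);
    return dot_scalar(a, b, n);
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {1, 1, 1, 1, 1, 1, 1, 1};
    std::printf("dot = %.1f\n", dot(a, b, 8));  // 36.0 on either path
    return 0;
}
```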
 
That would be all well and good if there were still a plentiful supply of 10-series cards. The 1080 Ti is now pretty much gone, as is the 1080.
 