WYP
News Guru
So basically the more RTX DICE cut out of BFV the better it runs. Who'da thunk it eh?
I would worry that in the end the difference will be indiscernible.
I feel having RT on this card is a false promise and thus a very bad idea.
1070ish performance for the same price the 1070 launched at.
Between 1070 Ti and 1080 doesn't sound like '1070ish performance' to me. A 1070 Ti is still considerably more expensive than $350 and will be slower. So not only will you be getting (assuming this leak is true) better performance than the 1070 Ti for a lower price, you're also getting new tech.
If the 1160 is a 2060 without RT and at a lower price, that would be even better value (assuming you don't want RT).
Going from the TFLOPs & clock speeds (within 5% of the 2070) of this FE against the RTX 2070 FE, it seems like this is coming in at ~1960 cores (around 15% less), so it'd make sense that RT performance (which shouldn't take much of a hit from the much larger percentage cut to memory + bandwidth) isn't too far from the RTX 2070. This makes a lot of sense given the difference between the other two Turing cards cut from the same die, the RTX 2080 Ti & Titan (yields can't be that bad on such a mature node, especially as we get down to more sensible die sizes), as well as keeping the Tensor & RT performance up while still differentiating in texture/shader performance via the bandwidth & memory reduction (which nowadays is probably worth a lot more in terms of total board BOM cost).
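The back-of-envelope estimate above can be sketched roughly as follows, assuming the RTX 2070 FE's published specs (2304 CUDA cores, 1710 MHz boost clock) and the leak's ~15% TFLOPs deficit at a similar clock:

```python
# Rough sketch of the core-count estimate: FP32 TFLOPs = cores * clock_GHz * 2 / 1000
# (the factor of 2 is one fused multiply-add per core per cycle).
cores_2070 = 2304      # RTX 2070 CUDA cores (published spec)
boost_2070 = 1.710     # RTX 2070 FE boost clock in GHz (published spec)

tflops_2070 = cores_2070 * boost_2070 * 2 / 1000   # ~7.88 TFLOPs

# If the leaked card lands ~15% lower in TFLOPs at a similar clock,
# the implied core count is ~15% lower too:
est_cores = round(cores_2070 * 0.85)               # 1958, i.e. "~1960 cores"
print(tflops_2070, est_cores)
```

This is only arithmetic on the poster's assumptions; the real cut could be split differently between cores and clocks.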
Not putting my money on AMD in the foreseeable future; I'm hoping Intel can poke Nvidia in both eyes and make them mend their ways!
A big part of the reason why AFR (the only multi-GPU technology you can easily and universally implement without active support from devs) is kinda dead (particularly for low-end cards) is that people started to realise that frame latency was just as important to smoothness as frame rate. Two cards using AFR to get up to ~1.9x frame rate boosts would generally have the same or worse frame latency as if only one of the cards was turned on. It's not really an issue if you can already get 60+ fps on one card and you're using it to get to ~120+ fps, but if you're using it to get a game running at 20fps on one card up to playable framerates, then you're looking at ~50ms frame latencies, which is more of a delay than many modern internet connections add (and almost impossible to compensate for).
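The latency point above comes straight from the arithmetic: AFR raises the frame *rate*, but each individual frame is still rendered by one GPU, so its latency is still one single-GPU frame time. A minimal sketch (the 20fps / ~1.9x figures are from the post above):

```python
# AFR raises throughput (fps) but not per-frame latency:
# each frame is still rendered start-to-finish by a single GPU.
def frame_time_ms(fps):
    return 1000.0 / fps

single_gpu_fps = 20          # base case from the post above
afr_scaling = 1.9            # typical best-case AFR frame-rate boost

afr_fps = single_gpu_fps * afr_scaling          # ~38 fps shown on screen
per_frame_latency = frame_time_ms(single_gpu_fps)  # still ~50 ms per frame

print(afr_fps, per_frame_latency)
```

So the counter shows ~38fps, but each frame still carries a ~50ms render delay, which is why low-end AFR feels worse than the frame rate suggests.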
From what I have seen expect Polaris from Intel. Maybe faster, as it may be shrunk.
That's the fear alongside my hope.
Yeah, from the sounds of it Intel's GPU is using similar concepts to Raja's AMD GPUs; it's looking like a primarily compute-focussed card with features to match (large amounts of ECC HBM, AI/ML instructions, single-slot cooler (from the silhouette they unveiled), likely sub-150W, possibly closer to 75W for the first models), so Polaris-tier gaming performance probably isn't crazy for their first GPU. Presumably it's going to launch in the high-memory configurations with all the database/server features first to selected partners/customers, with some prosumer & then consumer versions (but still probably sold mostly for compute/CAD kinda stuff first) following.
They are mid range dude. Definitely not high end.
They will also be at massively inflated prices, just for the Intel gang.
Did you really just ruin one of my last hopes in 2018? You've clout though, so you're more than likely right. Which leaves Nvidia out of control (unless we stop buying their trinkets), including the 2060 price here.
I'd only buy into 1st gen RTX as a last resort; you know, if my GPU broke and there was nothing else to replace it with. I am seriously noticing the lack of performance in modern games with my trusty card though; in Odyssey, for example, I have to turn down most settings quite a bit, and it'll only get worse. Still, whereas normally that'd be an incentive for me to buy into the new, today it's not whatsoever, and likely we'll see better refreshes on 7nm in 2019 anyway. Zero lifespan for RTX 2000.