RTX 2060 Pricing and Performance Leaked

Intel doesn't want to, or have much reason to, compete with Nvidia; it's AMD who's stealing their high-margin market share with server racks full of Vegas and Epycs. This is a GPU that will use exactly the same principles, and compete in the same sphere, as Fury and Vega (the latter being massively successful in its target arena and posing a continued threat to Intel heading into 2019 as 7nm Vega looks to further revolutionise AMD's Epyc server rack options, but a "flop" to gamers, which is why I expect gamers to perceive Intel's cards the same way regardless of how well they compete in their primary markets).

It will undeniably be an expensive card given Intel's targets and projected technology. HBM is absolutely critical to their goals, so it might get excellent perf/watt when gaming, but expect absolute performance in line with other single-slot 75W cards of the era while carrying a price tag closer to a 300W beast's.

Since this is primarily targeting use in server racks holding tens of them, efficiency will be key, which means a fairly modest clock speed/performance target for an individual card, with the benefits of the architecture only truly coming into play when they're stacked together in large numbers, much like Vega (a good chunk of Vega's die is committed to managing large asymmetric memory hierarchies for exactly these cases), while gaming famously doesn't lend itself well to that level of parallelisation with current software approaches.

Raja Koduri's legacy at AMD was not relatively run-of-the-mill consumer-focused cards like Polaris using mature technologies; it was the use of progressive and expensive technologies ahead of their large-scale debut, like multi-die interposers, to create high-bandwidth compute monsters for what were niche markets for GPUs a few years ago but have now become the primary markets in terms of revenue. Polaris kept Radeon alive in the short term, but Fury and Vega sowed the seeds necessary for its long-term survival, and showed true skill at predicting, anticipating, and preparing for future trends even in the face of consumer adversity (not that consumers ever really know what's good for a company that primarily makes products for markets a million miles from the average consumer's use case).
 
There is a long-running feud between Intel and Nvidia. A pretty nasty one, too.

It all goes back to when Intel refused to license Nvidia to make any more chipsets for its motherboards. Something about them being arrogant, IDK.

I'd imagine it's Nvidia they're targeting. Honestly, why would you target AMD? Their graphics division is only being kept alive by Ryzen.

Look up the sales figures for the best series they ever released (the ATI Radeon HD 5000 series). They absolutely smacked Nvidia sideways, yet the biggest-selling card was still the GTX 260 by a country mile.

Have a dig. Honestly, you would not want to own AMD's GPU department. It's pretty awful.
 
Maybe so, but Intel isn't in a position to challenge Nvidia's primary markets right off the bat, and they already compete in many of AMD's. Intel says the Xe architecture will replace Gen11 across all their products, from CPUs' iGPUs up to their discrete card(s), similar to GCN with Vega and Navi. If their top-end Gen11 part has 64 EUs for about 1 TFLOP of FP32 performance (and that'll likely be a fair chunk of the CPU die, given how large 24 Gen10 EUs are on current models), then I think it'd be reasonable to expect an initial dedicated Xe card to stay within about five times that amount, 5-6 TFLOPs FP32 tops (and probably not used amazingly well for gaming, at least not with initial drivers), though if the architecture scales well once yields improve, there could be room to double that further on and keep increasing.
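For a rough sanity check on those numbers, here's the usual peak-FLOPS back-of-envelope maths (EUs x FLOPs per EU per clock x clock speed). The 16 FLOPs/EU/clock figure (two SIMD4 FPUs doing FMA) and the clock speeds are my own assumptions for illustration, not anything Intel has confirmed:

```python
# Back-of-envelope peak FP32 estimate for an Intel EU-based GPU.
# Assumption: 16 FP32 FLOPs per EU per clock (2 FPUs x SIMD4 x 2 ops for FMA).

def peak_fp32_tflops(eu_count: int, clock_ghz: float,
                     flops_per_eu_per_clock: int = 16) -> float:
    """Theoretical peak FP32 throughput in TFLOPS."""
    return eu_count * flops_per_eu_per_clock * clock_ghz / 1000.0

# Gen11 iGPU: 64 EUs at roughly 1 GHz -> ~1 TFLOP, as above.
print(peak_fp32_tflops(64, 1.0))   # ~1.02 TFLOPS

# A hypothetical 5x-scaled first Xe card at slightly higher clocks:
print(peak_fp32_tflops(320, 1.1))  # ~5.6 TFLOPS, i.e. the 5-6 TFLOP range
```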
 