AMD Has Ray Tracing GPUs in Development

No doubt this will end up in consoles too, which may be why it isn't ready yet.

More likely they're just behind the curve because they don't have the money to keep up with Nvidia. The fact that Nvidia launched first, however, means the software will be more mature once AMD's hardware arrives, so the transition will be smoother for them than it was for Nvidia.
 
In other words: "We'll let Nvidia do all the early heavy lifting, paying developers the big bucks to incorporate ray tracing, and if it becomes a success we'll jump in with our own hardware at a lower cost." Thanks Nvidia! :D
 
Well, according to Nvidia, RTX started development around the same time the next console SoC designs were being planned out, so I think it's pretty obvious that the push for RTRT came from Microsoft and/or Sony and the hardware manufacturers reacted. Nvidia has more R&D money and resources, so they got something working first, but from recent comments both have been working on RTRT hardware and software for as long as each other, and Microsoft seemingly just as long. There's no way a single hardware manufacturer could develop the hardware, the software, and the paradigms and standards that define them all by itself; this was definitely a cross-industry collaboration in more aspects than one. I expect only the hardware implementations and basic software drivers to be roughly unique to each company.

I still hold that Nvidia made a sacrifice for both of them by moving first and getting RTRT hardware onto PCs. And if Navi is coming this year, its design must have been finalised by the time Turing launched, which more or less confirms they weren't too out of step under the hood; one of them just has some other cards to play and other markets to focus on first.
 

As I thought too. It's unlikely that Microsoft wouldn't have been working closely with AMD on this unless they had planned to move away from AMD's architecture for their consoles.

On the Nvidia side, it was probably a combination of bragging rights and, at the time, great finances that pushed these out as early as possible. Now, though, is a different story. They have lost the Tesla contract, their share price is a shadow of its peak last year, and they are looking at several class-action lawsuits from investment banks and big investors. They no longer have cash to wave around, and I think this will be reflected this year, depending on where they make the cuts (and they will have to; you can't lose that much money and not make cuts, the accountants will be screaming for it).
 
Yep, and given it was confirmed during the stream that Microsoft and AMD will have some new console collaborations to announce soon, that point is pretty much wrapped up. Some people forget that it takes 4-6 years to get an architecture from the initial planning stages to consumers (and about 18 months to get a fully completed design to consumers, depending on how much stock build-up is needed). Tech companies never really react to each other's technology releases directly nowadays; they react to what they predicted those releases would be half a decade ago. But ecosystems are so complex now that none of them would be doing anything without heavy collaboration with each other.
 

The interesting thing about Turing is that the architecture is a lot more "AMD-like" than previous designs. It's not a coincidence that Turing's new Concurrent FP and INT Execution paths benefit the most in games where AMD would have previously held an advantage with GCN.

When the PS4 and Xbox One were both confirmed to use AMD hardware, Nvidia needed to work on getting their GPUs to have the same advantages as GCN. Turing is the result of that push, discounting the RT cores and Tensor cores, obviously.
 

Yeah, Nvidia are back to building tanks. After doing an AMD and spending loads on Fermi, they are now back to the big stuff. AMD should have taken a hiatus like Nvidia did and concentrated on what was important now, not five years from now.
 

Focusing on the future can often be more valuable than focusing on the now. Just look at Zen.

AMD cancelled all of the desktop Bulldozer variants after Piledriver and focused on Zen, and in retrospect that was a great move. If AMD had more money they could do more, but we must remember how recently AMD got back into profit.

In the GPU market, Nvidia's Maxwell can be seen as their version of Intel's Sandy Bridge CPUs. Maxwell made some big changes and upended the GPU market; before it, AMD and Nvidia offered much more similar performance-per-watt numbers.

My hope is that Navi is akin to Zen for the GPU market: a comeback design that brings AMD back in line with Nvidia. If the next-gen consoles also use a similar design, that will be great news for AMD, especially if they integrate any form of AI or ray tracing into the mix (who knows what's planned at this point).
 

When the things you have designed and planned for the future actually do something in the present, then yes, it's a fantastic idea. However, pumping loads and loads of money into a GPU that isn't being properly utilised or coded for is how you end up with the 7970/290X/Fury X/Vega: all of them very expensive to make, very hot, and power-hungry, while not doing all of the things AMD planned.

At the same time, Nvidia cut their die sizes in half or less, clocked the merry balls off them (Kepler onwards) and then sold them for far more money than they should have. And why? Because AMD were busy making their own Fermi. What's hilarious is that before AMD did all of that, they made the 5870, which was a smash hit.

If you take a Piledriver CPU *now*, right this minute, and overclock it and bench it, you basically have i7 920 IPC over 8 cores. The problem is that even after all these years, hardly anything in gaming makes use of them; maybe a small handful of titles and certain levels in Crysis 3.

And now? It is hopelessly outdated.

Zen is absolutely nothing like that at all, because the bottom line is it performs well now, when you actually want it to. And that is why it is so damn good.

Vega is not good. You'll remember me saying (before that 7 logo was even shown) that if AMD knew what was good for them they would just give up and move on. But nope, looks like they are in it for the long run. Try to remember, this is RTG: the same RTG who played cloak and dagger with Vega and basically talked out of their a**es to sell it. "It looks great!", "It's similar to a 1080 Ti when you have a 60 Hz screen!" and so on. All of the crap that Lisa Su and the CPU department have not done.

Like I said before, the RTG group needs a *huge* shake-up: slaps, people getting fired, etc. I wouldn't even mind if Vega offered anything truly good, but it just doesn't. It isn't particularly good at anything, not for the power it consumes or how much waste heat it dumps.
 
Lisa Su still leads the whole of AMD and took direct control over RTG after Koduri left.
GCN has been very successful in a lot of different markets for over half a decade now (almost certainly the most used GPU architecture for gaming in history, thanks to its versatility); it just hasn't always been a top-end contender. While the HD 7970 wasn't fully utilised in software and drivers at launch, that improved a lot during its time on sale, giving it a ridiculously long shelf life. The HD 7970 competed quite well against the GTX 680 about six months into their lives and often firmly beat it in price/performance, and the same goes for the 290X vs the 780 Ti. It's really only the HBM GPUs that haven't been well positioned for gaming, but they were well positioned for a lot of other things, and they laid down AMD's knowledge and research on interposers and stacking that allowed the creation of Threadripper, Epyc and Ryzen 3000 (and may eventually help them again in similar ways with GPUs once the technology matures further). Both Vega 1 and presumably Vega 2 come in at only about 50 W more than the 2080, with even Vega 1 comparing well in many compute applications. You could argue GCN has been a jack-of-all-trades, master-of-none architecture, but given the skeleton budget the AMD graphics group had until Ryzen, that's not too surprising.

FWIW, you can still play most modern games on an FX-8320 or above without it majorly bottlenecking any GPU you'd sensibly pair with it; it holds up as strongly against the 3570 as it did a year or two after launch, maybe a bit better in DX12/Vulkan titles. The Tahiti cards hold similar positions: an HD 7970 still beats an RX 560/GTX 1050 Ti quite often and supports modern game APIs properly, which you can't say for the original Titan or the 600/700 series cards (which also stopped receiving driver updates many years ago).
 
Oh yeah, you can still game on Piledriver. In fact, it still meets the requirements for many games that Ivy Bridge doesn't. It just got hammered at the time by high-IPC, high-clocked Sandy Bridge with 4 cores.

Because Intel were selling what we wanted, when we actually wanted it.

AMD need to stop trying to be clairvoyant and get with reality, with things like Polaris. They don't need to be world-beating, just cheap, cheerful and obtainable.

People get confused a lot about exactly which cards fit into which resolution bracket. Polaris was more than good enough for anyone on 1080p or even 1440p.

Nvidia GTX 1080 performance from a sub-£250 GPU would be amazing. And they can do it, once they give up on their kitchen-sink approach.
 