Nvidia CEO Calls Radeon VII "Underwhelming" and Says "The Performance Is Lousy"

With AMD's "next gen" arch and Intel coming onto the scene next year, he is going to be facing tough times ahead.


Yep, and considering this is basically just a Vega refresh, I'd say up to 45% faster performance is damn good, all things considered.
 
RTX is a hardware implementation of an open industry-standard API (DXR), so games that use it shouldn't need much more vendor-specific support beyond the unique codepaths DX12 already requires to support other hardware implementations of that standard. DXR will probably end up being a reasonable thing for Nvidia to have bet on; it's got a lot of support from engine and game developers. The problem for Nvidia is that none of them seem to be rushing to get the software out any time soon (likely because they want to launch with cross-vendor implementations; AMD's stated they're already deep into DXR software implementation work with devs). So it's now technically been out for several months and nothing really supports it, and by the time supporting games reach double digits it might not be an Nvidia-exclusive feature for much longer, meaning Nvidia could miss their chance to cash in on the "exclusivity". Pressuring buyers to get in early is therefore almost a necessity for Turing's success if they think Navi will bring DXR support, because with both a node disadvantage and no major exclusive technology, the RTX cards wouldn't have much to stand on for much longer.
 
I'll take some credit for posting this in quick news :p

I mean, I believe he did it to poke fun at his niece Lisa, but the way he did it made him sound like an a-hole. A little too far. I don't care personally, but it doesn't seem professional.
He could have said, "I believe their 7nm Vega is still not as good as our 2080, and I am confident we deliver a better product."
 
Yeah, I don't like it either. We always made fun of AMD for doing stupid stuff; we should do the same for Nvidia.


AMD used to do it for fun and no one really took it seriously, as it was funny, but Jensen seems to be doing it out of spite. He doesn't like competition.
 
One thing I will say is that his mouth is writing cheques he can't cash. What I mean is that he is basically at the mercy of game devs, and they are all coding for... AMD-based consoles.

So he can say WTF he likes but if he doesn't get support he will wind up with egg on his face.

And let's face it, most of his company's previous efforts have fallen flat: SLI, 3D Vision, etc. All lacked support. So if there are hardly any RTX games, then his cards will be an epic waste of money.

I agree. The console market is the focal point for the majority of game developers, so game support for ray tracing is going to be very limited unless/until the ability to use it comes to consoles, the next gen of which will be AMD-based anyway with the PS5 and Xbox *insert random name because their nomenclature makes no sense*.

I have no doubt that AMD will join the RT party soon enough just for the marketing buzzwords, but I still personally think it's going to be a niche "feature" for a while.
 
It still needs to be done, dude. I.e., a developer needs to spend the time it takes to make their game work with RT. If most of your audience (two major consoles and regular PC users) don't use it, would you put the funding into it? Because I wouldn't, and going from the past 20 years it's quite clear most devs wouldn't either.

Like, some games just did not work in SLI or Crossfire (mostly because of no AFR support at all) and there was little to nothing you could do about it. Since DX12 dropped, we've seen pretty much no games support either technology at all. Maybe because, like you said the other day, people are now obsessed with frame times etc., or because game devs simply could not be bothered.

I mean, you would really have to care about Nvidia-using PC gamers to spend that sort of budget on coding it in, wouldn't you? And then you appeal to a tiny segment of the gaming community, because the rest simply can't use it. It's the same for DLSS. That is not only Nvidia specific, but Nvidia 20 series specific. So once again, it needs support.

At least AMD have both consoles under their belt (a much bigger prize than a single range of GPUs on the PC), and if they use their brains and put their own version of RT into consoles, making it zero extra work for it to work on their GPUs, then they will definitely be in front when it comes to support.

Nvidia either have to woo game devs into doing it, or have to make it financially worthwhile. I mean, it's obvious they have the cash to woo with, but will they? It's not looking so promising right now.

We'll see I guess. Impossible to know what either company will do over the next few years.
 
This !

Devs aren't going to spend oodles of money to implement something in-game that maybe 0.01% of their user base can make use of. It makes no sense from a business perspective.

When the hardware becomes cheaper to manufacture and it can be adopted by the masses, then yeah, sure, but as of right now RT is even more niche than PhysX in the Batman games.
 
It seems that in many cases Nvidia at least partially foots the bill for the extra development cost to make the RTX range more attractive, just like with GameWorks and PhysX.
 
Do you know what made it all sink in and hammered it home?

The last two PC games I have installed and run have started up and immediately gone into the game at 720p. I have to sit and watch the first intro or cutscene and actually get the game going before I can even change the resolution or any of the graphics settings. That is how little they care about the PC gamer having the optimum experience. They can't even be bothered to code in a screen before the game begins so you can change the settings.

I have no faith in "fad" techs that are not universal across the board. They never, ever survive or make any impact. Look at G-Sync. Nvidia are now supporting it on any monitor. That's because they failed to corner the market with it (quelle surprise!).

They cannot win the RT battle if that is what Sony and Microsoft intend to do with future consoles. They will simply be crushed. As powerful as they may think they are, when it comes to Microsoft and Sony they are small fry.
 
Well said !

You only need to look at PhysX. While yes, it is very pretty watching heavy smoke dripping out of a gas pipe, it was mainly implemented in the Arkham games, and you needed the top Nvidia card to properly make use of its performance-destroying beauty.
 
PhysX peaked quite quickly, tbh. For me it was amazing, and still is amazing, in Mirror's Edge. It really did add to the game. Mafia II was also very good with it.

Then it died. I mean, yeah, it's now used quite a lot, but never to the degree it was back when people actually sat down and coded with it in mind. See the pattern? IMO the PhysX PPUs could have really changed the game, but they were expensive and needed exclusive coding that was pretty much useless for anything else.

Then we also have to face facts: PC gaming is dying down again. Well, gaming in general is going through quite a large slump. Companies are going under, and those still around are just taking the pee. Like I said, £50 for a game I have to have spoiled because I can't enjoy the opening screens? Do me a favour. I don't want to have to wait for the game to bloody start and then change the settings by pausing the action. That's really, spectacularly lazy.
 
Not to mention that, aside from games like RDR2, God of War and Spider-Man, we are now paying more for less.

Put any decent single-player game from a few years ago, Dragon Age: Origins for example, up against modern-day games like Battlefront 2, or even Dragon Age: Inquisition. More emphasis is put on the graphics than on the gameplay, and you can see it is having a detrimental effect as people are getting burned out.
 
What we are getting, basically, is console games on our PCs. It took me buying and using an Xbox One X to realise that. They are doing practically nothing apart from running the console versions at lower settings. That's it. They make the game, tweak the settings to hit the FPS they need, and then release both without spending ages making sure the PC version actually works. If it doesn't? They fix it afterwards.

At least with RDR2 we know that they will not only try hard to make it better on PC, but that it will actually work. Rockstar have a good history of taking their time and getting it right. Max Payne 3 was superb on PC, as is GTA V (not my sort of game, but I appreciate what they did, tbh).
 
Yup, that's what I was thinking. The only part about the Radeon VII I was kind of disappointed in was the price. Seems like a good card though.
 
Which is supported by how many games?


Nvidia have been dealing in gimmicks for well over a decade now. DLSS is no different; just a fancy upscaling technique. And Nvidia like to commit the occasional fraud every once in a while, let's not forget about that. That's why I really can't stand them as a company, and why I'm constantly rooting for AMD to come up with something at least decent enough that I don't have to give my money to Nvidia.

I find DLSS underwhelming haha
 
Look at the list of developers already committed to DXR support and you'll see that it already is being done across the board, just not in any hurry. More importantly, in today's world many games use engines like Unreal or Unity, and some larger studios like EA have cross-genre in-house engines like Frostbite; these are all getting DXR support too, which will have a knock-on effect. While it'll still take work from game devs to effectively utilise it within their games, most of the work is done by the engine devs, and once DXR is more mature that will become a much better understood and simpler process. And while it might only be one GPU vendor now, it doesn't seem like it'll stay that way past the end of the year from what we've been hearing, and I think there's a good chance any 2020-or-onwards consoles will have it as a side effect of that.

We've seen a similar situation with DX12: it's only once engine developers added support for it that it really started to become commonplace, and you could say similar for DX11. This time, hardware vendors have as much reason to push for adoption as Microsoft do (and are pushing; Microsoft's primary product is and always has been APIs), because of the hardware cost in terms of die resources.
 