AMD's RDNA 2 Silicon, CU counts and clock speeds reportedly confirmed

But DLSS only works with ray tracing, right? Which brings us back to the "has to be coded for" problem, i.e. the game has to support RT for DLSS to work.
 


Nope, DLSS is its own thing. It doesn't need RT to be present, but it's often included in RT games to lighten the load.
 
Yeah, DLSS is its own thing, but it still requires explicit game/developer support, just not for the neural network side of things now.
 
Oh cool. Well, hopefully AMD can sort something out. I would imagine they are anyway, for the consoles to use.
 

Yep, already under way and looking very impressive. AMD worked with Microsoft to develop the DirectML API, i.e. Direct Machine Learning, which is already compatible with RDNA 1 (5700 XT etc.); MS just need to do more fine-tuning.

If you open up GPU-Z you can see AMD's cards are already fully compatible with DirectML:


[Image: GPU-Z screenshot showing DirectML listed as supported on an AMD card]



And here's a before-and-after picture of DirectML in action. It's basically DLSS but built into the OS, with no proprietary hardware required; it's just up to devs to utilize it going forward once MS finalise it:

[Image: DirectML super-resolution before/after comparison]
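
For anyone curious, here's roughly what that "DirectML compatible" tick boils down to in code. This is just a minimal sketch of my own, not anything official from MS or AMD: it assumes you have the DirectML header from the Windows SDK and link against d3d12.lib and directml.lib, and all it does is try to create a DirectML device on the default adapter, which is essentially the capability GPU-Z is reporting.

// Minimal DirectML availability check (sketch; assumes Windows SDK with DirectML.h,
// link with d3d12.lib and directml.lib).
#include <d3d12.h>
#include <DirectML.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a D3D12 device on the default adapter; feature level 11.0 is
    // enough for DirectML.
    ComPtr<ID3D12Device> d3dDevice;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&d3dDevice))))
    {
        std::printf("No D3D12-capable GPU found.\n");
        return 1;
    }

    // If this succeeds, the GPU/driver combination supports DirectML,
    // so DirectML-based upscaling could run on it once games make use of it.
    ComPtr<IDMLDevice> dmlDevice;
    HRESULT hr = DMLCreateDevice(d3dDevice.Get(), DML_CREATE_DEVICE_FLAG_NONE,
                                 IID_PPV_ARGS(&dmlDevice));

    std::printf("DirectML supported: %s\n", SUCCEEDED(hr) ? "yes" : "no");
    return 0;
}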
 
Well if the consoles use it I would expect it to be a given.


Yep, there are a few snippets of info floating around about the PS5 and XBSX using their own version of DLSS going forward, i.e. DirectML, which is baked into the OS, so potentially better performance.
 
^ Watch that channel's leaks about Ampere to learn all about how trustworthy he is. Over half of it was complete nonsense.
 
They didn't make up stuff or lie, but in classic corporate fashion only demonstrated best case scenarios.
 
I can somewhat agree, but we were told the 3080 was twice the performance of a 2080, and best case or not, the cases that contradict that far outnumber the ones that support it, so it simply isn't twice the performance at all. Also, the 3090 isn't an 8K gaming card unless you turn some settings down, and in some games turn some of the RTX effects off completely while using DLSS. Add to that you'd be expecting the 3070 to match a 2080 Ti only in a best-case scenario; so does that mean in CS:GO? What best case are you supposed to expect?

Last time I looked, 100mph didn't count as a best-case scenario when it was only achieved going downhill and was actually 70mph on a flat road. All those shaders: ever wonder why they make so little difference? It's because when running games most of them are not even being used; it's situational.
 
That's how corporate BS works. It's always cherry-picked data and pretty visualisations which are technically correct with a dozen asterisks. And Nvidia isn't unique in this regard; it's literally all press events, at least the ones targeted at consumers. It is the marketing team's job to make the product appear as appealing as possible without breaking advertising laws or damaging the company's reputation with something egregious.


That is still very different from presenting rumours and guesses as facts and straight up lying. I don't see how this is a complicated concept.
 
It's not corporate BS. Anyone will do it to sell their product.

If you are trying to sell something you created to the masses, of course you will pull out the positives and impressive data. Are you going to say "hey, here are the facts about our product, best on the market, but I'll be honest, after 2 years it will break with 100% certainty"?

The benefit corporations have is their ability to get a team to bend the truth, so they do announce problems, but manage to play them down enough that we think it's acceptable.
 
No, but what you described is corporate BS - sad and difficult tones are ironed out until there's only happiness and superlatives left. But it still stands up to scrutiny; you won't find lies. If difficult questions with potentially damaging answers are asked, you'll get something along the lines of "We are not aware of this, but we'll investigate! Thank you for your interest in Our Product(TM)!"


But there are good reasons for that, and it's just how the world goes around. Eventually you learn to read between the lines.
 
They didn't make up stuff or lie, but in classic corporate fashion only demonstrated best case scenarios.

Eh?

Of course they lied and made stuff up. The day before launch we got a load of bogus 7nm specs with CUDA counts directly from Nvidia that were total rubbish.

They even supplied them directly to a tech journo friend of mine who was convinced they were real. Why? Because they came from Nvidia marketing.

I actually think they thought they could phish the RDNA 2 specs out of AMD before Jen made the announcement. Of course, AMD just held firm and released nothing.
 
Could you link that statement about 7nm? Closest I can find is an analyst from SIG. https://www.barrons.com/articles/nvidia-earnings-preview-nintendo-chips-buy-rating-51573590023


Or do you mean their A100 data center GPU? https://youtu.be/onbnb_D1wC8?t=182

The spec was released the day before the announcement. In fact, some of the board partners were so convinced they put the spec on a flyer. I am certain Mark posted here with them in.

Took me a while but here. These were released by Nvidia shortly before the real launch.

https://videocardz.com/newz/nvidia-geforce-rtx-3090-and-geforce-rtx-3080-specifications-leaked

Someone made a joke about it, and then a guy I know (a tech journo) posted this in reply:

[Image: the tech journo's reply confirming the same spec]


As he was fed the same info too.

Like I said, I reckon it was a bait and switch to get AMD to post what they had. That way Jen could adjust prices or whatnot and beat them. But AMD remained utterly silent. Which is telling. Because the last time Nvidia went first (Volta), AMD ripped the pee out of them with their "Poor Volta" campaign. And as we know, Vega was crap.

This time however? no bait. No real specs. No BS, no ridiculous claims. That is very telling. It means they are not worried. At all, it would seem.

Anyway, whatever it was (trolling, a bait and switch, trying to get info out of AMD), it was still false information. TBH the people who could have solidly disproved it were AMD, LOL, as they are in cahoots with TSMC, so they would have immediately known the specs were false. I am sure if Nvidia had gone to TSMC, AMD would have known all about it. Also, that same guy who posted the "confirmation" of the spec also said TSMC are the biggest leakers on the planet.

As soon as a manufacturer places an order for a couple of prototype PCBs, it usually leaks all over the internet.
 
So you're linking a known rumour site which even itself states:


"The data that we saw clearly mention the 7nm fabrication node. At this time we are unable to confirm if this is indeed true."
 

It was "leaked" to every one. Several people made videos about it. The emails came from Nvidia. Marketing, natch.

[Image: Gainward flyer listing the leaked 7nm spec]


See? Even Gainward were convinced.

It had been said for ages and ages that Ampere would be 8nm Samsung. If you followed the rumour mill, one guy in particular called those 7nm specs BS and said "Nope, it's definitely 8nm Samsung". However, the leak was enough to convince every tech journo that it was indeed 7nm and that Nvidia must have ditched Samsung and gone to TSMC.

As I said before, phishing. Or just playing games? IDK.

BTW, if you looked into the real reason why Ampere was crashing, you would find that tech influencers like Jay and others had newer drivers than the companies making the effing cards. Why? Because THAT is how paranoid Nvidia are about leaks of real info. The cards being sent out to OEMs and shops were tested on older drivers that didn't boost as high under specific workloads, while the drivers sent out to reviewers were the ones people got on launch day, which WERE boosting higher; so the cards that were not capable of those boost clocks were crashing.

Look dude, I want to bury the hatchet here. Don't think I hate Nvidia for absolutely no reason at all and I am just an asshole about them for no reason. It has nothing to do with that AT ALL; it's about how they act as a company. I've said before that even when their products are absolutely brilliant, they mar and ruin them by talking crap.

Had they avoided all of that crap? Then I would have no reason to get p1553d off. None whatsoever, and I probably wouldn't dislike them AT ALL. But they keep doing it at every launch because they seem to think that their BS marketing and lying is what sells their GPUs. It absolutely isn't. In fact, when you are passionate about all of this, it's just effing annoying to continually see them doing it. I don't want to hear waffle; actions (or rather products) speak louder than words.

If they had waited for stock, fixed the fact that many cards were boosting into instability and so on? The only argument I would have had would have been the archaic step back in power consumption. That would have literally been it.

And you know what? I would probably have one in my rig right now had they released it properly.
 