AMD plans to launch Zen 3 and RDNA 2 this October at separate events

"Ashes of the Singularity

Similarly RoGame unearthed RTX 3080 AoTS scores in the game's online benchmark result database. AoTS is a DirectX 12 benchmark that has fallen out of favour with users and reviewers but RoGame has gathered some comparisons from colleagues.

In the 'Crazy' 4K quality preset the RTX 3080 managed 88.3fps in AoTS. This compares against an overclocked MSI Gaming X Trio RTX 2080 Ti at 80.29fps, an Asus Dual 2080 Ti with water cooler at 81.55fps, and a 'stock' RTX 2080 Ti which achieved 69.99fps. Meanwhile the AMD Radeon RX 5700 XT scored 45.5fps in the same 'Crazy' 4K tests."

Source: https://m.hexus.net/tech/news/graphics/145318-nvidia-geforce-rtx-3080-geekbench-aots-scores-leak/
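For a rough sense of the gaps, here's a quick Python sketch that turns those AoTS results into percentage differences. It uses only the numbers quoted above; nothing else is assumed:

# Relative performance from the quoted AoTS 'Crazy' 4K results.
# Figures are taken straight from the Hexus quote above.
scores = {
    "RTX 3080": 88.3,
    "MSI Gaming X Trio RTX 2080 Ti (OC)": 80.29,
    "Asus Dual RTX 2080 Ti (water)": 81.55,
    "RTX 2080 Ti (stock)": 69.99,
    "Radeon RX 5700 XT": 45.5,
}
baseline = scores["RTX 3080"]
for card, fps in scores.items():
    if card == "RTX 3080":
        continue
    lead = (baseline / fps - 1) * 100  # how far ahead the 3080 is of this card
    print(f"RTX 3080 vs {card}: {fps:.2f} fps, 3080 ahead by {lead:.1f}%")

That works out to roughly 26% over the stock 2080 Ti, around 8-10% over the overclocked and water-cooled cards, and about 94% over the 5700 XT, in this one benchmark at least.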

Personally I'm looking forward to seeing what AMD bring. If what we already know of 80 CUs is right, plus arch improvements on top of RDNA 1... we should be in for a good fight.
 
RDNA2 event on October 28th, really? After the RTX 3070 releases, just to benefit the console launch on November 10th... I don't know if this nasty move will work.
 
Either AMD are extremely confident in RDNA2, hence launching it over a month after Nvidia launches Ampere, or it's going to be yet another disappointment. I'll wait and see what they have, but I'm not holding my breath.
 
AMD will have no real choice on the RDNA2 reveal date because they can't really do it without spilling the beans/breaking NDA on console performance.
 
Either AMD are extremely confident in RDNA2, hence launching it over a month after Nvidia launches Ampere, or it's going to be yet another disappointment. I'll wait and see what they have, but I'm not holding my breath.

So it's going to be good or bad. LOL cover all the bases there dude.
 
AMD will have no real choice on the RDNA2 reveal date because they can't really do it without spilling the beans/breaking NDA on console performance.

True, just a shame it's so late.

So it's going to be good or bad. LOL cover all the bases there dude.

I try ^_^

I want RDNA2 to be amazing, especially amazing value for money, but I'm betting on 3070 performance. If it is then great, but it has to be cheaper than a 3070, as AMD don't have DLSS and the vast amount of software integration that Nvidia has.
 
It doesn't need to be amazing. It just needs to be cheap.

You've not seen a single bench for the 3070 either, so I would be very careful feeding on NV's BS.

I can categorically tell you right now it won't be as fast as the 2080Ti. It's VRAM and bandwidth starved for a start.

There's also a reason why the 3080 has been pitted against the 2080 and not the Ti. Why would you not do that, you know? Smash the Ti with the 3080 and make it look even more impressive.

Until DLSS goes into every game? I wouldn't be concerned about that. Remember one thing: Navi is in both of the consoles. So whatever trickery they use is a given for Navi and big Navi.

So the odd one out will still be Nvidia and their tech.

We know how great things can be when Nvidia court devs. We've seen it in the past with SLi and so on. However, we also know what happens when they can no longer be arsed.

But yes, at 4k the 3070 will be a farce.

BTW, if you don't believe me... I read an article yesterday about this new "doubling" of the CUDA cores. There's one issue with it. It needs a heavy workload to shine (like Turing) and it needs memory bandwidth. Otherwise, loads of those SMs sit doing nothing.

That doesn't bode well for the 3070 really, especially given that its purpose is 4K. OK, OK, I hear you: "No it's not", etc. Well, take a read of what happens to those SMs at lower resolutions (i.e. why the XP, 2080 Ti and so on were disappointing at lower resolutions). It's because those SMs need a workload, and there is no better way to provide that than 4K. However, 8GB is not enough and neither is its bandwidth.
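The "doubling" point is easier to see with a toy calculation. To be clear, this is my own rough model based on the articles describing the Ampere SM (one FP32-only pipe plus one pipe shared between FP32 and INT32), not anything official, and the instruction-mix figures are purely illustrative:

# Toy model (NOT an official description): each Ampere SM has two 64-wide pipes,
# one FP32-only and one that runs either FP32 or INT32 on a given cycle.  The
# marketed "doubled CUDA cores" only count fully when the shader mix is nearly
# pure FP32 and there is enough work (and bandwidth) to keep both pipes fed.

def effective_fp32_lanes(marketed_lanes, int_fraction):
    """Rough effective FP32 lane count for a given INT32 instruction share.
    Valid for int_fraction <= 0.5; beyond that the shared pipe is INT-bound."""
    fixed_pipe = marketed_lanes / 2    # FP32-only half
    shared_pipe = marketed_lanes / 2   # FP32-or-INT32 half
    return fixed_pipe + shared_pipe * max(0.0, 1.0 - 2 * int_fraction)

marketed = 5888  # FP32 lane count Nvidia quotes for the 3070
for int_share in (0.0, 0.25, 0.5):
    print(f"INT32 share {int_share:.0%}: ~{effective_fp32_lanes(marketed, int_share):.0f} usable FP32 lanes")

So with a typical amount of integer work in the shader stream, a fair chunk of the headline figure is already spoken for before you even get to the memory side.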

So there is your catch. We've also seen absolutely nothing on the 3090 yet either. Why? Why not show it pulverising everything else into mush?

I would seriously wait. Wait until you have the real facts. That said, unless you are mad enough to pre-order one based on what Nvidia have fed you, you won't be getting one before RDNA2 comes out anyway.

One more thing BTW.

Everyone has said RDNA 2 will be "OK" for ages now. The only "leak" we had was that it was as good as a 2080 Ti, right? There is no other info anywhere.

Just bear one thing in mind. Big Navi is TSMC, who make *much* better dies than Samsung. I also find it veeeeeery strange that AMD have remained absolutely effing silent throughout this, whereas when Vega was coming they began a campaign of BS to stop it being a total disaster.

Look, if the wind has blown the other way, you can expect big things out of big Navi. Ampere is miles and miles from being what it could have been on TSMC.
 
You really don't like Nvidia, do you? :P

I think AMD have shot themselves in the foot in a way, but I suppose in a good way. They cannot risk anything leaking about their console chips.

It's in both the PS and the XB. Imagine a leak comes out, and then we find that something has been tweaked or tailored to favour the PS5. MS would have a field day about it. Until the consoles are almost on our doorstep they have to keep quiet.

I will say I'm impressed that they have managed the silence, given all the tech sites paying handsomely for snippets of info that they leak to the world.
 
No dude I don't like Nvidia. I've never hidden that fact, nor pretended otherwise.

I like RTG about as much, tbh. The only good thing they have made in years was Polaris, and that was when the knobs behind the desks STFU and Raja did it properly.

And yeah, I highly suspect the delay is because of the consoles, COVID and gawd knows what else.

Had Nvidia not talked so much BS, then fair enough. I just absolutely abhor BS.

They paid Eurogamer for that "preview".

BTW I never let a company pull the wool over my eyes. I'm usually always consistent. However what Nvidia are playing at is just wrong. It's amazing how many people they have brainwashed so far.

They've had a bad day at the office with Ampere. They know it, we know it. Samsung 8nm sucks just as much now as it always did. If AMD have had a good day at the office on TSMC? then I would be careful about condemning big Navi just yet.

Remember that is all it took with Ryzen. A new design, a crap load of luck and a genius. However, that wasn't enough alone. It took tons of bad luck from Intel too.

I've made it very clear now (unless someone didn't notice) that I categorically DO NOT like RT. Not how it is currently done. I asked a friend who is very knowledgeable and he explained it like this:

The grain is due to the 'sampling' resolution it has to do for raytracing, as more samples are possible (at reasonable framerates) it should lessen or disappear entirely. Realtime raytracing is still very early doors but it isn't about to go away. Another few more card generations and I predict it becomes the norm just because it makes building games easier than rasterizing (and devs are lazy people)
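To put his sampling point in rough numbers, here's a toy sketch (made-up values, nothing to do with any real renderer) showing why few samples per pixel means visible grain:

import random

# Estimate one pixel's brightness by averaging random light samples.  With few
# samples per pixel the estimate jumps around from pixel to pixel (grain); with
# many samples it settles towards the true value.
random.seed(0)

def sample_light():
    # Stand-in for tracing one random ray: half the rays hit the light (1.0),
    # half miss (0.0), so the true pixel value is 0.5.
    return 1.0 if random.random() < 0.5 else 0.0

def pixel_estimate(samples_per_pixel):
    return sum(sample_light() for _ in range(samples_per_pixel)) / samples_per_pixel

for spp in (1, 4, 16, 64, 256):
    estimates = [pixel_estimate(spp) for _ in range(1000)]
    mean = sum(estimates) / len(estimates)
    spread = (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5
    print(f"{spp:4d} samples/pixel -> pixel-to-pixel spread ~ {spread:.3f}")

The spread only falls off with the square root of the sample count, which is why the grain fades slowly and needs cards that can throw a lot more rays per pixel.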

See? So basically what he is saying is that it is supposed to look like that (unacceptable to me, far too off-putting) and it will be a few generations until it goes away. TBH it looks like crap, so the sooner it goes away the better.

DLSS? I doff my cap. I think it's amazing, though it too had a really bad start (it looked terrible).

The info we have about big Navi is incredibly stale now. It's over a year since we heard it would be as good as a 2080 Ti. Then all of a sudden they release this console spec with an APU in it that, at least on the surface, appears to be as good as a 2080S.

Yeah. Like I said, don't write off big Navi yet.
 
AMD will have no real choice on the RDNA2 reveal date because they can't really do it without spilling the beans/breaking NDA on console performance.

Not really related. They can do it whenever they want. Console specs are already released. Whether AMD releases it today or one day before Consoles launch has 0 bearing on console agreements. Performance won't even be the same. It's semi-custom silicone in a market where software is completely different than a standard windows and hardware environment. If it was the exact same, it would be a prebuilt PC.

If it would breach their contract to spill anything before they launched they would launch after Consoles.
 
It doesn't need to be amazing. It just needs to be cheap.


I'd rather go with AMD, but their top-end RDNA2 card needs either 3070 or 3080 performance with a damn good price tag. As I've said previously, Nvidia has a metric butt-ton of software integrations, an immense software suite, game features, DLSS 2 which is pure voodoo, and DLSS 3 coming out which is voodoo on magical steroids. RT I really don't care about, just give me rasterisation horsepower that makes the 5700 XT look truly last-gen in every aspect.
 
I'd rather go with AMD, but their top-end RDNA2 card needs either 3070 or 3080 performance with a damn good price tag. As I've said previously, Nvidia has a metric butt-ton of software integrations, an immense software suite, game features, DLSS 2 which is pure voodoo, and DLSS 3 coming out which is voodoo on magical steroids. RT I really don't care about, just give me rasterisation horsepower that makes the 5700 XT look truly last-gen in every aspect.

First off, I will repeat: THERE ARE NO performance metrics for the 3070 AT ALL right now. Just some BS chart.

Look dude, IDK how much you have studied GPUs, but I can tell you that 4K is *all about* memory bandwidth. Like, nearly everything relies on throwing textures around, and on/offloading textures. The faster the RAM, the more the GPU core gets to work with.

With the 3070 not only being hobbled to 8GB (I can use 9.5GB at 1440p in COD, FFS), the bandwidth is also savagely cut.
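For what it's worth, the paper numbers back that up. Bandwidth is just bus width times effective data rate, and these are the publicly listed memory specs at the time of the Ampere announcement (worth double-checking, but they're what's been announced):

# Spec-sheet memory bandwidth: bus width (bits) / 8 x effective data rate (Gbps).
cards = {
    "RTX 2080":    (256, 14),   # bus bits, Gbps
    "RTX 2080 Ti": (352, 14),
    "RTX 3070":    (256, 14),
    "RTX 3080":    (320, 19),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name:12s} {bus_bits / 8 * gbps:6.0f} GB/s")

On paper the 3070 has the same 448GB/s as the plain 2080, roughly 27% less than the 2080 Ti's 616GB/s, while the 3080's 760GB/s is about 23% more than the Ti. Exactly the pattern you'd expect to matter most at 4K.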

So that means it will be a 1440p card. However, then you run into the issue that without 4K barraging the doubled CUDA SMs, they find themselves waiting on the memory.

I can categorically state it will NEVER be as fast in raw performance as the 2080 Ti. It only needs to do one thing better at one resolution and they can spout their BS.

See BL3, and why it's performing so much better on the 3080 than the 2080 at 4K? MEMORY BANDWIDTH.

It's what Nvidia have used to cheat the numbers and make the 3080 look so good. Those tests on Eurogamer were cherry picked and loaded, and Eurogamer were paid to do that.

It's what you call buying benchmarks.

I reckon overall maybe the 3080 will be the predicted 30% or so faster that we thought. But due to Jen's brainwashing BS we have no way of knowing.

I know people have been taking the pee out of me for banging on about memory and bandwidth, but trust me, at 4K it is absolutely everything. And all of these cards are supposedly 4K cards, because the new consoles are. Whether you even need it or not.

The only one thing I find nearly as annoying as Nvidia's BS is people who buy the wrong GPU for the wrong application. It really, really chuffs me off. You are just wasting your money.

The 5700 and XT were good cards. Good enough to make Jen release a new range with 30% lower prices. And they are old, dude (the 5700 and XT). That was over a year ago now.

Think about it like this. In a year they could have improved the performance by 10% on "IPC", right? Then double the size. Add in more VRAM than Nvidia (higher bandwidth = faster) and at 4K they could well be a lot better than anyone is anticipating. Because again, they are going to be 4K cards.
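Purely back-of-the-envelope, and every figure below is an assumption based on the rumoured 80 CU config rather than any leak, that line of thinking works out roughly like this:

# Back-of-the-envelope big Navi guess.  EVERY number here is an assumption
# pulled from the rumours discussed above, not a measurement.
rx_5700_xt_perf = 1.00     # baseline: RX 5700 XT = 1.0x
ipc_uplift      = 1.10     # assumed ~10% per-CU ("IPC") improvement
cu_ratio        = 80 / 40  # rumoured 80 CUs vs the 5700 XT's 40
cu_scaling      = 0.80     # assumed efficiency loss from doubling the CU count

big_navi_guess = rx_5700_xt_perf * ipc_uplift * cu_ratio * cu_scaling
print(f"Guessed big Navi vs 5700 XT: ~{big_navi_guess:.2f}x")   # ~1.76x

Under those made-up scaling assumptions you land well clear of a 5700 XT and roughly in the territory the old "2080 Ti class" leak pointed at, which is the whole reason I wouldn't write it off.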
 
Not really related. They can do it whenever they want. Console specs are already released. Whether AMD releases it today or one day before Consoles launch has 0 bearing on console agreements. Performance won't even be the same. It's semi-custom silicone in a market where software is completely different than a standard windows and hardware environment. If it was the exact same, it would be a prebuilt PC.

If it would breach their contract to spill anything before they launched they would launch after Consoles.
Nah, AMD are literally working with each of their customers' competitors; either company would've been insane not to put the performance of their consoles under a wide-ranging NDA, and these NDAs are always written to be loose enough to rule out any chance of "hinting". It doesn't matter if the figures aren't directly comparable in a strictly technical sense; revealing the ballpark performance of RDNA2 CUs would definitely be a huge hint at the end performance numbers of the consoles now we know the configs, enough of a hint to make the press go mad anyway.

It also doesn't matter when the consoles launch. NDAs in this scenario are there to ensure the companies have control over when they release the info publicly, not to keep it an eternal secret. I think we can assume AMD have selected a date after the juicy details of both consoles are all in the open; guess we can wait and see.

Silicone is the rubbery material they make fake breasts out of btw.
 
I'm waiting for all sides and all official benchmarks by TTL and others on all the cards.

But there are some benchmarks getting leaked for the 3070, and it looks very much on par with a 2080 Ti, though you're right Alien, the 3080 is around 30-35% faster than a 2080 Ti.

I think it comes down to expectations; whether you're for or against a viewpoint, you don't know any better than anyone else until we see the factual benchmarks and KNOW.

I do expect Navi to be good, but how good I don't know. I am hopeful, and if that gets crushed then so be it, but if Navi in the consoles is meant to do 4K at 60/30fps, then why wouldn't a 3070?

Something I've always felt, in many things, is that you shouldn't judge the future by the past. Times change, and the past can tell a story and teach a lesson, but if learnt right, the future is different to what you'd expect.

So until we know, all we can do on all sides is speculate and chin-wag about it.

My expectation for Navi is that it will be mostly similar in performance to the 3000 series, at least on the cards that matter, but have more RAM and come at a cheaper price with pretty solid DXR / DX12 Ultimate support. What I feel is really going to set them apart is the software stack, because the streaming app for the 3000 series is pretty damn nice in not needing a green screen, and there's also DLSS. So AMD need to have made efforts on the software side to compete with Nvidia, and I think the reason AMD have been so quiet is that they don't want to repeat the mistakes of the past by over-promising.

I think the software side is the biggest issue they have, but if they can get the rest right then it's a viable option. It's at least the most interesting situation in GPUs for a fair few years, and it's not just a node shrink, it's a generational shift. But they are coming late to the party, so IMHO they should have moved the announcements forward to at least let people know all the info. It'll be lost sales without doubt, and Nvidia will sell out very fast.
 
TBH, IMHO this is a stupid move from AMD, and as far as I'm concerned it means there's no way in hell they can compete and they know it, so they're delaying it. Everyone's just going to go Nvidia, as it's just too long a wait for the instant gratification nuts.
We know the CPUs will be good, which is why they're doing those first, and the GPUs will be yet another overhyped NOTHING!
Really wanted it to be different, but it won't be till maybe next year, and as usual Nvidia will be X times better as well, making it way harder for AMD to catch up... Disappointed :(
 
Doesn't sound like AMD have anything to rush for if the Ampere stock rumours are true, tbf. Sounds like we've got another Turing-esque paper launch coming, without proper stock till Christmas.
 