AMD plans to launch Zen 3 and RDNA 2 this October at separate events

Doesn't sound like AMD has anything to rush about if the Ampere stock rumours are true, tbf. Sounds like we've got another Turing-esque paper launch coming, without proper stock till Christmas.

Nope, it means they can't compete. No business in their right mind would give their competition a one-and-a-half-month head start before announcing anything, regardless of stock levels. They've seen what Nvidia have, realised they can't compete, and so aren't bothering.
 
I can categorically tell you right now it won't be as fast as the 2080Ti. It's VRAM and bandwidth starved for a start.

Some benchmarks have leaked for the 3080. So 3070 will be interesting to see.

3DMark Results:

3DMark Fire Strike Performance: 31919 (+25% vs 2080 Ti, +43% vs 2080 Super)
3DMark Fire Strike Extreme: 20101 (+24% vs 2080 Ti, +45% vs 2080 Super)
3DMark Fire Strike Ultra: 11049 (+36% vs 2080 Ti, +64% vs 2080 Super)
3DMark Time Spy: 17428 (+28% vs 2080 Ti, +49% vs 2080 Super)
3DMark Time Spy Extreme: 8548 (+38% vs 2080 Ti, +59% vs 2080 Super)
3DMark Port Royal: 11455 (+45% vs 2080 Ti, +64% vs 2080 Super)
4K in-game benchmarks. The card appears to be 48 to 62% faster than RTX 2080 SUPER:

Far Cry 5 +62%
Borderlands 3 +56%
AC Odyssey +48%
Forza Horizon 4 +48%
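
If anyone wants to sanity-check those deltas, here's a quick back-of-the-envelope sketch. The 3080 scores and uplift percentages are just the leaked figures above, nothing confirmed:

```python
# Back out the implied 2080 Ti / 2080 Super scores from the leaked 3080 numbers
# and the quoted uplift percentages. Inputs are the leaked figures above, not
# confirmed results.

leaked_3080 = {
    # test: (3080 score, uplift vs 2080 Ti, uplift vs 2080 Super)
    "Fire Strike":         (31919, 0.25, 0.43),
    "Fire Strike Extreme": (20101, 0.24, 0.45),
    "Fire Strike Ultra":   (11049, 0.36, 0.64),
    "Time Spy":            (17428, 0.28, 0.49),
    "Time Spy Extreme":    (8548,  0.38, 0.59),
    "Port Royal":          (11455, 0.45, 0.64),
}

for test, (score, vs_ti, vs_super) in leaked_3080.items():
    implied_ti = score / (1 + vs_ti)     # score a 2080 Ti would need for that gap
    implied_s = score / (1 + vs_super)   # same for a 2080 Super
    print(f"{test:20s} implied 2080 Ti ~ {implied_ti:6.0f}, implied 2080 Super ~ {implied_s:6.0f}")
```

For instance, +25% in Fire Strike implies a 2080 Ti score of roughly 25,500, so it's easy to compare against your own runs.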
 
Nope, it means they can't compete. No business in their right mind would give their competition a one-and-a-half-month head start before announcing anything, regardless of stock levels. They've seen what Nvidia have, realised they can't compete, and so aren't bothering.
Surely AMD must be pretty confident they can price the cards right if they're not in any rush. Possibly both companies thinking they'll sell all their current stock by Christmas regardless?
 
Surely AMD must be pretty confident they can price the cards right if they're not in any rush. Possibly both companies thinking they'll sell all their current stock by Christmas regardless of launch date?

Or pride has taken hold of AMD and they don't want to release too soon and appear desperate in any way.
 
Some benchmarks have leaked for the 3080. So 3070 will be interesting to see.

3DMark Results:

3DMark Fire Strike Performance: 31919 (+25% vs 2080 Ti, +43% vs 2080 Super)

I really hope that's not the graphics score as I got 36,578 with an overclocked 2080 Ti in that same test.
 
Some benchmarks have leaked for the 3080. So 3070 will be interesting to see.

3DMark Results:

3DMark Fire Strike Performance: 31919 (+25% vs 2080 Ti, +43% vs 2080 Super)
3DMark Fire Strike Extreme: 20101 (+24% vs 2080 Ti, +45% vs 2080 Super)
3DMark Fire Strike Ultra: 11049 (+36% vs 2080 Ti, +64% vs 2080 Super)
3DMark Time Spy: 17428 (+28% vs 2080 Ti, +49% vs 2080 Super)
3DMark Time Spy Extreme: 8548 (+38% vs 2080 Ti, +59% vs 2080 Super)
3DMark Port Royal: 11455 (+45% vs 2080 Ti, +64% vs 2080 Super)
4K in-game benchmarks. The card appears to be 48 to 62% faster than RTX 2080 SUPER:

Far Cry 5 +62%
Borderlands 3 +56%
AC Odyssey +48%
Forza Horizon 4 +48%

And once again that lines up perfectly with the predicted 30%. Ignoring the RT ones, of course.

Edit. Remember, gaming performance will jump from the GDDR6X, too.

[3DMark result screenshot]


That was my card at stock, which was around 1800MHz IIRC.

I really hope that's not the graphics score as I got 36,578 with an overclocked 2080 Ti in that same test.

I remember when Maxwell released. The 980 was "twice as fast" as the 780Ti. It turned out that it absolutely wasn't, at all. Maybe 5% at best. However, it was twice as fast at one thing that had sod all to do with gaming, but everyone ate it up as usual.

Maybe now you can see why I am so jaded. Not angry, as some would assume. Just jaded after many, many years of smelling the BS.

Remember, if the 3080 really was twice as fast as a 2080 he would want paying for it. Sure, there's a slight discount in there, but your true successor to the 2080Ti costs... wait for it... the same!

Now sure, you will get a performance bump going from a 2080Ti to a 3080. There's no doubt. However, that performance bump may only be worth it at 4k, in certain scenarios. None of which I particularly care about.
 
Someone just stated the obvious.

The 3080 is 30% faster and eats, you guessed it, 30% more power.

250W for the Ti, 320W for the 3080.
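
Putting rough numbers on that, using the TDPs above and the ~30% figure from the leaks (so a sketch, not a measurement):

```python
# Rough perf-per-watt check using the figures in this thread: ~30% faster than
# a 2080 Ti (leaked, unconfirmed) at 320 W vs 250 W board power.

tdp_2080ti = 250.0   # W
tdp_3080 = 320.0     # W
perf_uplift = 1.30   # assumed ~30% faster, per the leaks above

power_ratio = tdp_3080 / tdp_2080ti        # 1.28 -> 28% more power
perf_per_watt = perf_uplift / power_ratio  # ~1.016 -> barely moves

print(f"Power increase: {power_ratio - 1:.0%}")     # 28%
print(f"Perf/W change:  {perf_per_watt - 1:+.1%}")  # +1.6%
```

320W over 250W is actually a 28% increase, so on those numbers perf/W barely moves at all.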
 
But yes, at 4k the 3070 will be a farce.

BTW, if you don't believe me... I read an article yesterday about this new "doubling" of the CUDA cores. There's one issue with it: it needs a heavy workload to shine (like Turing) and it needs memory bandwidth. Otherwise, loads of those SMs sit doing nothing.

That doesn't bode well for the 3070 really, especially given that its purpose is 4K. OK, OK, I hear you: "No it's not", etc. Well, take a read of what happens to those SMs at lower resolutions (i.e. why the XP, 2080Ti and so on were disappointing at lower resolutions). It's because those SMs need a workload, and there is no better way to give them one than 4K. However, 8GB is not enough and neither is its bandwidth.
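
To illustrate the point, here's a toy roofline-style model: usable throughput is capped by whichever runs out first, shader throughput or memory bandwidth. The numbers are made-up placeholders, not real Ampere or Turing figures.

```python
# Toy roofline-style model of the "doubled CUDA cores" point: attainable throughput
# is limited by either the compute roof or memory bandwidth. Figures below are
# illustrative placeholders, not measured GPU numbers.

def attainable_tflops(peak_tflops, bandwidth_gbs, flops_per_byte):
    """Attainable FP32 TFLOPS = min(compute roof, bandwidth * arithmetic intensity)."""
    bandwidth_limited = bandwidth_gbs * flops_per_byte / 1000.0  # GB/s * FLOP/byte -> TFLOP/s
    return min(peak_tflops, bandwidth_limited)

peak = 20.0        # TFLOPS on paper (hypothetical card with "doubled" shaders)
bandwidth = 448.0  # GB/s (e.g. a 256-bit GDDR6 bus)

for intensity in (10, 20, 40, 80):  # FLOPs done per byte fetched; rises with heavier workloads
    usable = attainable_tflops(peak, bandwidth, intensity)
    print(f"intensity {intensity:3d} FLOP/byte -> {usable:5.1f} TFLOPS usable")
```

Push the arithmetic intensity up (heavier workloads, higher resolutions) and the compute roof finally gets used; keep it low and a chunk of the paper TFLOPS sits idle.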

...

I would seriously wait. Wait until you have the real facts.


Maybe you should apply your own logic. ;) :D


I'm waiting for all sides and all official benchmarks by TTL and others on all the cards.

But there are some benchmarks getting leaked for the 3070 and it looks very much on par with a 2080Ti, though you're right, Alien, the 3080 is around 30-35% faster than a 2080Ti.

I think it comes down to expectations. Whether you're for or against a viewpoint, you don't know any better than anyone else until we see the factual benchmarks and KNOW.

I do expect Navi to be good, but how good I don't know. I am hopeful, and if that hope gets crushed then so be it. But if Navi in the consoles is meant to do 4K at 30/60fps, then why wouldn't a 3070?

Something I've always felt about many things is that you shouldn't judge the future by the past. Times change; the past can tell a story and teach a lesson, but if it's learnt right the future turns out different to what you'd expect.

So until we know, all we can do on all sides is speculate and chin-wag about it.

My expectation for Navi is that it will be mostly similar in performance to the 3000 series, at least on the cards that matter, but with more RAM, a cheaper price, and pretty solid DXR/DX12 Ultimate support. What I feel is really going to set them apart is the software stack, because the streaming app for the 3000 series looks pretty damn nice in not needing a green screen, and then there's DLSS. So AMD need to have made efforts on the software side to bolster themselves against Nvidia, and I think the reason AMD have been so quiet is that they don't want to repeat the mistakes of the past by over-promising.

I think the software side is the biggest issue they have, but if they can get the rest right then it's a viable option. It's at least the most interesting situation in GPUs for a fair few years, and it's not just a node shrink, it's a generational shift. But they are coming late to the party, so IMHO they should have moved the announcements forward to at least let people know all the info; it'll mean lost sales without doubt, and Nvidia will sell out very fast.

Good comment.
 
I already have. 8GB is not enough VRAM for 4K. Game over.

In how many games? Of the hundreds of popular titles being played right now, how many categorically need more than 8GB of VRAM at 4k? I can count how many on one hand I believe. But I can't count how many games are being played right now on one hand. And how many people are actually playing at 4k? And how many people playing at 4k are cranking all the settings that high? The number dwindles and dwindles.

You're taking a handful of games and extrapolating that to mean everything. I don't think 8GB is an ideal amount, and that's possibly RDNA2's shining dagger against the 3070 and 3080. But I don't see what you see, and I don't see nearly enough evidence to persuade me otherwise. I explained to you how consoles aren't always the dictator that you presume, that the PS4 and Xbox One had 8GB of total RAM available and yet 4GB cards are only JUST becoming problematic now six years after the consoles were released. I could be wrong, but that's evidence counter to your idea.

I'm not saying that 8GB is absolutely enough. I don't think it's ideal at all. I wouldn't buy an 8GB card right now with performance levels at a 2080Ti. But I'm applying your logic and waiting for facts. What you're telling me is not a fact. It's your extrapolation, just like mine. If I see regular titles tanking the 3070's VRAM capacity within the next year, your extrapolations were correct. If I don't, your extrapolations were incorrect. No big deal. We're all just throwing ideas around for the fun of it after all.
 
In how many games? Of the hundreds of popular titles being played right now, how many categorically need more than 8GB of VRAM at 4k? I can count how many on one hand I believe. But I can't count how many games are being played right now on one hand. And how many people are actually playing at 4k? And how many people playing at 4k are cranking all the settings that high? The number dwindles and dwindles.

You're taking a handful of games and extrapolating that to mean everything. I don't think 8GB is an ideal amount, and that's possibly RDNA2's shining dagger against the 3070 and 3080. But I don't see what you see, and I don't see nearly enough evidence to persuade me otherwise. I explained to you how consoles aren't always the dictator that you presume, that the PS4 and Xbox One had 8GB of total RAM available and yet 4GB cards are only JUST becoming problematic now six years after the consoles were released. I could be wrong, but that's evidence counter to your idea.

I'm not saying that 8GB is absolutely enough. I don't think it's ideal at all. I wouldn't buy an 8GB card right now with performance levels at a 2080Ti. But I'm applying your logic and waiting for facts. What you're telling me is not a fact. It's your extrapolation, just like mine. If I see regular titles tanking the 3070's VRAM capacity within the next year, your extrapolations were correct. If I don't, your extrapolations were incorrect. No big deal. We're all just throwing ideas around for the fun of it after all.

I get his point but it's overly exaggerated. 8GB of VRAM has been tested at ultra settings at 4K in many games on a 2080 and runs fine. At the lenient end of the scale, Far Cry 5 hardly needs 8GB at 4K. However, you can cherry-pick some games where his point is valid.

Resident Evil 2 at 4K will slam towards 8GB without hesitation, and the stuttering and freezing are plain to see. Same goes for Rise of the Tomb Raider (might be the second one), which also showed horrid stuttering and constant VRAM usage at 8GB. Oh, and FF15 will easily consume 10GB if you let it. So yes, there are examples of a 2080 being inadequate. It does show Nvidia's colours though, because at the cost of the 2080, having even a small selection of top titles unable to cope is unacceptable.

It's so dependent on the game though. Some allocate 8GB instantly but don't actually use it; it's just there if needed. You also have some games that offer 4K texture pack downloads, which can easily hammer a 2080.

8GB is not as future-proof as it used to be, but it's got some years yet before it's "not enough" for anything but low-end GPU specs.
 
In how many games? Of the hundreds of popular titles being played right now, how many categorically need more than 8GB of VRAM at 4k? I can count how many on one hand I believe. But I can't count how many games are being played right now on one hand. And how many people are actually playing at 4k? And how many people playing at 4k are cranking all the settings that high? The number dwindles and dwindles.

You're taking a handful of games and extrapolating that to mean everything. I don't think 8GB is an ideal amount, and that's possibly RDNA2's shining dagger against the 3070 and 3080. But I don't see what you see, and I don't see nearly enough evidence to persuade me otherwise. I explained to you how consoles aren't always the dictator that you presume, that the PS4 and Xbox One had 8GB of total RAM available and yet 4GB cards are only JUST becoming problematic now six years after the consoles were released. I could be wrong, but that's evidence counter to your idea.

I'm not saying that 8GB is absolutely enough. I don't think it's ideal at all. I wouldn't buy an 8GB card right now with performance levels at a 2080Ti. But I'm applying your logic and waiting for facts. What you're telling me is not a fact. It's your extrapolation, just like mine. If I see regular titles tanking the 3070's VRAM capacity within the next year, your extrapolations were correct. If I don't, your extrapolations were incorrect. No big deal. We're all just throwing ideas around for the fun of it after all.

There are loads of games that go over the limit. SOTTR is another; heck, even Shadow of Mordor uses 10GB.

It's not just one game, fella. But even if it were, you cannot declare a product a 4K card. Do you see what I mean? If it can't do 4K in everything, it's not 4K.

If you are considering 4K now or in the future, do not buy a 3070. That's like saying "oh well, the 2070 could do 4K, and the 1070 before it". No, no they couldn't.

It's a 70-class card for a reason. Try to keep grounded and remember: these prices are not great, they are just back to 10-series prices, and the top-end card will still run you £1,300.

Nvidia said it's as fast as a 2080Ti, yet on every single thing they've said so far they have been caught out and it's been debunked.

I know these times can be very exciting (I am, believe it or not) but at the same time I am a realist and I know what Nvidia have been through with Ampere. The first rumour debunked was "Twice the performance per watt" which is total bull. Ampere is *terrible* per watt.

"Twice the performance per watt, on average (a word that never exists with an absolute) in some games we cherry picked knowing one card would run out of VRAM".

I don't buy it. They are just playing good cop, bad cop. Turing prices were the bad cop; then all of a sudden they release their cheaper cards (because they are cheaper, fact), drop their prices back to Pascal levels, tell people they are X fast, and everyone loses their freakin' minds.

You know the saddest part? The marketing will go into the reviews and the reviewers will make these products look better than they really are.

https://www.tweaktown.com/image.php...idia-titan-rtx-vs-amd-radeon-vii-showdown.png
 
It's not just one game, fella. But even if it were, you cannot declare a product a 4K card. Do you see what I mean? If it can't do 4K in everything, it's not 4K.

It does show Nvidia's colours though, because at the cost of the 2080, having even a small selection of top titles unable to cope is unacceptable.

I don't agree with these arguments.

When GTA V came out, the main big boy of the time was the 980Ti. It was marketed and used very happily as a 1440p capable GPU. 90+% of games ran very well at 1440p with a 980Ti, including GTA V. But in the settings menu of GTA V there was a sub-menu, and if you cranked those settings and drove to a grassy section in the map, and there were many of them, performance tanked to 30 FPS or lower. Does that mean the 980Ti was incapable of running 1440p gaming? Same applies to AC Unity, Arkham Knight, and many more games.

That's one example of a card being perfectly capable of running 1440p admirably, yet struggling in certain extreme, and very rare, circumstances. We've seen this for years and years and yet no one claimed the 980Ti was "unacceptable" at 1440p, because within reason it was absolutely fine. That was 'the' rig to have at the time. It was the ASUS 1440p/144Hz monitor with an aftermarket 980Ti. Groundbreaking gaming for many people. People had an absolute blast.

The 3070 is potentially a 4k capable card at £400+ for 90+% of games. That's groundbreaking gaming for many people. People could have a blast.

Do I think everyone should buy a 3070 for 4k gaming? No.


I know these times can be very exciting (I am, believe it or not) but at the same time I am a realist and I know what Nvidia have been through with Ampere. The first rumour debunked was "Twice the performance per watt" which is total bull. Ampere is *terrible* per watt.

"Twice the performance per watt, on average (a word that never exists with an absolute) in some games we cherry picked knowing one card would run out of VRAM".

I don't buy it. They are just playing good cop, bad cop. Turing prices were the bad cop; then all of a sudden they release their cheaper cards (because they are cheaper, fact), drop their prices back to Pascal levels, tell people they are X fast, and everyone loses their freakin' minds.

You know the saddest part? The marketing will go into the reviews and the reviewers will make these products look better than they really are.

https://www.tweaktown.com/image.php...idia-titan-rtx-vs-amd-radeon-vii-showdown.png

Where or when have I given you the impression that I have an unrealistically enlarged view of Ampere? The prices surprised me in a good way. That's about it. I do see many people being blinded by the prices, but I also see a lot of scepticism and negativity. Benchmarks are leaking out showing the 3080 around 50% faster than a 2060 Super. The GTX 1080 at launch was 52% faster than a GTX 970, Maxwell's equivalent to Turing.

I don't believe the 3070 to be an ideal 4K card for everyone out there. I believe it's 4K-capable for a certain demographic. That's exciting to me, because I'm not a catastrophiser and I don't want to exaggerate. Neither do I want to dictate that everyone should have the absolute best or go home crying into their GTX 760 pillows. I want gamers with £450 to play certain games at 4K, and it looks like both Nvidia and AMD are going to be offering that. I'd recommend AMD over Nvidia, but I'm not blind to a very large reality: the 3070 might be a 4K-capable card for reasonable money... with a caveat. Nvidia might try to pull the wool over people's eyes, but we are each responsible for ourselves. All it takes is five minutes of Googling to see that a handful of games (big games) draw more than 8GB of VRAM. If you play those games or want to keep the 3070 for five years, don't buy it, it's not for you. But if you're in a different category, the 3070 might be perfectly fine.
 
Just to put my 2 cents in.

The 8GB of memory is fine. As of now and for the next year or two it'll be fine. Once developers start drifting away from multi-gen releases and focus on the next gen (which by then will be the current gen), or if that's confusing, PS5, Series S/X and PC, is when we may see 8GB of VRAM start to be a hindering factor at 1440p/4K, depending on the asset resolutions used and game sizes.

For the vast majority of people it'll be fine for a long time. 1080p gaming isn't going to vanish anytime soon, especially as people aim for 360Hz-and-above frame rates.
 
https://www.youtube.com/watch?v=RYV4muLkbss&feature=emb_logo

60% improved perf/W, with most of that from architecture, not process

Not a 512 bit bus

128MB of 'infinity cache' to make up for lack of bandwidth

No HBM on consumer cards

Clock speeds up to 'console level', i.e. implying around 2.3GHz

80CUs for top SKU

6700 - 3070 level

6800 - around 3080

6900 - faster than 3080 and may 'nip at the heels' of the 3090

AMD considered undercutting NV but may price the same

AMD are 'very competitive, this is not Vega 64'
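
For what it's worth, a back-of-the-envelope FP32 figure for that rumoured top SKU, assuming RDNA 2 keeps RDNA 1's 64 stream processors per CU (none of this is confirmed):

```python
# Back-of-the-envelope FP32 throughput for the rumoured top RDNA 2 SKU.
# Assumes 64 stream processors per CU (as in RDNA 1) and 2 FLOPs per SP per clock;
# the CU count and clock come from the rumours above, nothing is confirmed.

cus = 80
sp_per_cu = 64
clock_ghz = 2.3
fp32_tflops = cus * sp_per_cu * 2 * clock_ghz / 1000

print(f"{cus} CUs @ {clock_ghz} GHz ~ {fp32_tflops:.1f} FP32 TFLOPS")  # ~23.6 TFLOPS
```

Around 23-24 TFLOPS on paper if those rumours hold, though as discussed above, paper TFLOPS and game performance are not the same thing.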
 
I don't agree with these arguments.

When GTA V came out, the main big boy of the time was the 980Ti. It was marketed and used very happily as a 1440p capable GPU. 90+% of games ran very well at 1440p with a 980Ti, including GTA V. But in the settings menu of GTA V there was a sub-menu, and if you cranked those settings and drove to a grassy section in the map, and there were many of them, performance tanked to 30 FPS or lower. Does that mean the 980Ti was incapable of running 1440p gaming? Same applies to AC Unity, Arkham Knight, and many more games.

That's one example of a card being perfectly capable of running 1440p admirably, yet struggling in certain extreme, and very rare, circumstances. We've seen this for years and years and yet no one claimed the 980Ti was "unacceptable" at 1440p, because within reason it was absolutely fine. That was 'the' rig to have at the time. It was the ASUS 1440p/144Hz monitor with an aftermarket 980Ti. Groundbreaking gaming for many people. People had an absolute blast.

At the price of the Ti and other Tis I expect to be able to crank all of the settings to the max, yes. I don't think that is unrealistic. I had a Titan XM and everything I played could be cranked to 1440p max settings and it handled it fine. Then again, you know me, and I am quite happy with a 30 FPS minimum so long as I am using adaptive sync (so Freesync or G-Sync), i.e. without the choppiness.

4K, to me at least, did not become a reality until the Titan XP and 1080Ti came out, but once again it was a fast-food snack: within six months new games came along and there went your cranked 4K settings.

Like I say, on a £1300 GPU I don't see the point.



The 3070 is potentially a 4k capable card at £400+ for 90+% of games. That's groundbreaking gaming for many people. People could have a blast.

Do I think everyone should buy a 3070 for 4k gaming? No.

Where or when have I given you the impression that I have an unrealistically enlarged view of Ampere? The prices surprised me in a good way. That's about it. I do see many people being blinded by the prices, but I also see a lot of scepticism and negativity. Benchmarks are leaking out showing the 3080 around 50% faster than a 2060 Super. The GTX 1080 at launch was 52% faster than a GTX 970, Maxwell's equivalent to Turing.

I don't believe the 3070 to be an ideal 4K card for everyone out there. I believe it's 4K-capable for a certain demographic. That's exciting to me, because I'm not a catastrophiser and I don't want to exaggerate. Neither do I want to dictate that everyone should have the absolute best or go home crying into their GTX 760 pillows. I want gamers with £450 to play certain games at 4K, and it looks like both Nvidia and AMD are going to be offering that. I'd recommend AMD over Nvidia, but I'm not blind to a very large reality: the 3070 might be a 4K-capable card for reasonable money... with a caveat. Nvidia might try to pull the wool over people's eyes, but we are each responsible for ourselves. All it takes is five minutes of Googling to see that a handful of games (big games) draw more than 8GB of VRAM. If you play those games or want to keep the 3070 for five years, don't buy it, it's not for you. But if you're in a different category, the 3070 might be perfectly fine.

Right I will try to cover as much of that as possible.

Firstly, with regards to the 3070: we are now entering a "4K or GTFO" generation, IMO. If the consoles can do it, then you should expect a GPU, *any* GPU, that costs more than said console to do the same at the same settings. Only we know that with the advent of the XB1X that got flipped on its head; all of a sudden a PC costing twice as much was doing the same thing. THAT is why I was so very impressed with the XB1X, and why I bought two. Because I had been chasing that dragon on PC for years and spunked many thousands of pounds down the bog.

However, as I said, gamers are going to expect it now. And if a card does not fully meet the requirements and costs more than the console? Like I said, bad card. I don't care what people do with their money; that's their lookout, fella. I just don't like BS, and this whole 30-series release has been the back end of a male cow. As more days pass and more comes to light, the more we find out how "not anywhere near as good as it sounds" Ampere is. It's basically the hog it was predicted to be.

I didn't assume that you have an enlarged view of Ampere. After the announcement and the horror show since (sub-£500 2080Tis, anyone?), I just wanted to remind people that Jen has form for talking crap. That's all. He uses the typical "up to" BS.

As for me being a "catastrophiser"? No. I am a realist.

Not only am I a realist, but I have an awful, awful lot of experience of running 4K. That, given my total disbelief about how the 3070 is going to perform (on the same VRAM, and 3GB less of it, than the 2080Ti!), is why I am telling you to be careful.

Today it has come to light that the 2080Ti is 24% slower in real-world situations than the 3080. That means the 3080 is 32%-odd faster while consuming nearly 30% more power.
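
Those two figures are consistent, by the way; "24% slower" and "32% faster" are the same gap viewed from opposite ends:

```python
# "24% slower" and "32% faster" describe the same gap from opposite directions:
# if B is p slower than A (as a fraction), then A is p / (1 - p) faster than B.

p_slower = 0.24                       # 2080 Ti is 24% slower than the 3080 (per the leak)
p_faster = p_slower / (1 - p_slower)  # 0.24 / 0.76 ~ 0.316

print(f"3080 faster than 2080 Ti by ~{p_faster:.1%}")  # ~31.6%, i.e. the '32%-odd' above
```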

If you really thought the 3070 would match that? IDK. I know you are not that silly. However, even if it does match it, it doesn't have enough VRAM.

The "best" RT game so far apparently is Control. Imagine buying one, wanting to try out this supposed best RT game and then finding yourself running out of VRAM?

And the price sounds too good to be true because it almost certainly is. So it's "cheaper" than Turing, right? Yeah, it is. However, that doesn't make it cheap! It still costs more than the XBSeX, which apparently will run every single game coded for it at 4K.

And then of course, AMD have not had their go yet, whilst everyone has written them off. Like I said to Dice yesterday, if they have a blinder and Nvidia, as we know, have had a crap day at the office (320W TDP GPUs, ORLY?), then that could change fast.
 
I personally couldn't care less about 4K ATM, but I don't agree that the new consoles will be able to outperform PCs. They have less RAM (and it's shared), slower CPUs with fewer cores, and if you compare almost any console game that comes to PC, it sure looks a lot better on PC.

So 4K on console is not 4K on PC; they are not the same beast at all.

I guess you can take into account the optimisations that consoles benefit from, but they still are not the same, other than that a bad console port is a bad port.

Sure, the gap has gotten closer this gen, but it will widen over the coming years.

I feel AMD could really come out with something very competitive. The only issue most of us have is that we've waited years and still have no really solid info on it. Leaks are all over the place, be they from Nvidia or AMD; it's been an utter mess, just like 2020 in general.

I think there is something that suits everyone's needs. As for people that get overly hyped on anything, you can't help them; they won't wait, they'll lap up the marketing and get whatever their mind tells them they want.

I'd say I've looked forward to the new cards and kept up with the different leaks and drizzles of info, waiting and pretty excited. But it's more that I want to play some of the coming games, and games I haven't started on, with something decent rather than my 970s. Sure, they do OK, but my main game, Black Desert Online, is amazing to look at for an MMO, and to keep my gameplay smooth I just put it all on low; 45fps is not enough when I'm playing it fully at max settings.

I really want AMD to pull it back. I'm less fussed about some things, but they need to have done better than just performance; the rest needs to be right as well, otherwise I will go for a 3070/80. But I'm waiting in hope.

It dawned on me how much negativity AMD got on their post about dates. There is a large part of the gaming base against them, assuming the worst; I've not seen it this harsh before. But it made me realise people can get things very wrong at times, and maybe this is one of those times when people will get it dead wrong.

So I wait in hope and am happy to chat. Everyone's viewpoint is valid, but generally things can come across as overblown and over-hyped, and get taken the wrong way, so I might not agree fully with all points but I won't bash anyone for having their own mindset.

The buffer cache that RDNA 2 might have could be a rather nice thing, but the question I have is: is it there to solve a problem, or is it really something that improves things as a whole?

I want a lot of info from all sides, and when the reviews start happening I'll be very busy watching many of them. I guess it's lucky I've been broke until now, so that I'm able to save properly, or I could have just gone nuts and jumped the gun on a 3080. But I'm actually glad I can't have one just yet :D



I sure hope it's a new model, but man, the marketing back then was so much more interesting. The adverts we used to get were like the Sega ads, the Cyber Razor Cut style thing, and breast-laden pixels :P
 