  #31  
Old 29-10-20, 01:44 PM
Kleptobot
Advanced Member
 
Join Date: Dec 2012
Location: Melbourne Australia
Posts: 357
These GPUs are being produced on a very mature node, with a well-established relationship between the two companies. The bus width is only 256-bit. I'm not sure how much complexity Infinity Cache adds, but I'd wager the yields will be pretty good for these dies. Using GDDR6, which is cheaper, and considering the design familiarity, I'd say AMD have got power delivery pretty sorted too, and sorted power delivery means no complex, expensive PCB designs.
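For a rough sense of scale on that bus (and assuming 16Gbps GDDR6 modules, which is a guess on the speed rather than a confirmed spec): 256 bits ÷ 8 × 16 Gbps = 512 GB/s of raw bandwidth, which is presumably what the Infinity Cache is there to top up.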

I think AMD stand to make good profits from this line-up if the performance is there. They have priced it competitively but probably have room to drop prices if Nvidia try something.

Sorry if the post is a bit incoherent, it was a bit of a stream of consciousness.

__________________
delidded 3570k (watercooled)
hd7970 (watercooled)
8GB Gskill XM
3*140 SR1 Rad
all stuffed in an FD arc midi
  #32  
Old 29-10-20, 01:57 PM
AlienALX
OC3D Elite
 
Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by AngryGoldfish
Remember that Steve also showed a benchmark at 4k with the 3070 in Doom Eternal using Ultra texture settings rather than Ultra Nightmare, and the performance issues (which were still only minor really) disappeared. I recognise you have a particular view on this, but going by how many people cannot tell the difference visually between the highest texture setting and one or two notches below in Doom Eternal, I think the 3070 is perfectly fine for 4k if you're willing to make one or two insignificant sacrifices. And given that the 3070 is not actually capable of running all games at max settings at 60 FPS, I think it's unreasonable to say the 3070 is not a 4k card because of VRAM and would be if it had 2GB more, while ignoring the raw compute power and its inherent limitations. We've established you're not willing to make any sacrifices though when it comes to VRAM, and that's fine. I would be, and a lot of others would be too. So I'd say the 3070 is a 4k-capable card for those who are OK to make a few small compromises. If you're not, it's not the card for you.

Then again, I think you're right about the 6800 being better at 4k and therefore better value. It performs better, has more VRAM, and more bandwidth. If I gamed at 4k or was going to and didn't have a huge budget, I'd definitely favour the 6800 over the 3070. I'd still prefer the 6800XT, but if the 6800 was already on the edge of my max budget, I'd still squeeze for the 6800 over the 3070, for all the reasons you gave.

The 3070 is probably the card I'd get for 1440p gaming though, because the 6800 doesn't offer enough features or value over it to make it worth it. Or the 6700XT or 6700 might be even better suited for 1440p, because the 3070 is arguably OP at that resolution for a guy like me who doesn't care about cranking every setting to the max in every single game (even though I don't play every single game).
The 6800 by all accounts seems to be quite a bit faster than the 2080Ti, in the benchmarks shown at least. The 3070? It's level pegging at 1440p and falls off at 4k. The 6800 won't do that though.

Honestly, to make it shorter? 10GB. 10GB would have been the sweet spot for the 3070 and I would have nothing at all to complain about.

It's not just Doom btw.

Even Wolfenstein 2 uses more than 8GB at 4k. So does Shadow of Mordor, and that is years old now.
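If anyone wants to check this on their own machine, a rough way to watch VRAM while a game runs (assuming an Nvidia card and the pynvml Python bindings, purely as an illustration, and bearing in mind that "allocated" isn't always the same as "needed") is something like:

Code:
# Rough sketch: poll how much VRAM is allocated once a second while a game runs.
# Assumes an Nvidia card and the pynvml bindings (pip install nvidia-ml-py).
# Note: this reports memory *allocated*, which isn't always what a game strictly needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {info.used / 1024**3:.2f} GiB / {info.total / 1024**3:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

(Running nvidia-smi --query-gpu=memory.used --format=csv does much the same from the command line.)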

It won't get better, mate. Game devs now have double the VRAM for console games, so what do you think that will do for PC gamers? Most probably it will double the price of entry (figuratively speaking, of course).
__________________


If you don't like what I post don't read it.
  #33  
Old 29-10-20, 04:35 PM
AngryGoldfish
Old N Gold
 
Join Date: Jan 2015
Location: Ireland
Posts: 3,024
Quote:
Originally Posted by AlienALX
The 6800 by all accounts seems to be quite a bit faster than the 2080Ti, in the benchmarks shown at least. The 3070? It's level pegging at 1440p and falls off at 4k. The 6800 won't do that though.

Honestly, to make it shorter? 10GB. 10GB would have been the sweet spot for the 3070 and I would have nothing at all to complain about.

It's not just Doom btw.

Even Wolfenstein 2 uses more than 8GB at 4k. So does Shadow of Mordor, and that is years old now.

It won't get better, mate. Game devs now have double the VRAM for console games, so what do you think that will do for PC gamers? Most probably it will double the price of entry (figuratively speaking, of course).
You're just repeating the same arguments at this point.

Saying it's "not just Doom" doesn't mean it's the majority or even minority. Are you really basing your purchases on 0.1% of the gaming catalogue? If you are, that's cool. But why presume everyone else will? It ignores the fact that consumers can choose to turn one or two superfluous settings down to be within their VRAM limit, as you may have to do in some games at 4k anyway. You act as if people are buying SLI GTX 770's for modern games. You're taking a small handful of adjustable instances and blowing them out of proportion.

The console argument is just your guess. I explained in another post a few weeks ago what happened with the previous generation. When the PS4 and Xbox One came out with their 8GB of collective VRAM, what happened? Did all games suddenly jump to 5-6GB of VRAM utilisation? No, not even at 4k. For years, literally years, games still only needed 3-4GB optimally, even at 4k. Benchmarks across the board PROVE this. It wasn't until very recently that games actually started to draw upwards of 8GB of VRAM. The consoles had more VRAM than most PC gamers, yet games still ran perfectly well on 4GB cards. There were a small handful of outliers, and in those cases you could just drop the textures slightly, see no visible difference, and gain an extra 5-10% performance.

And that's a whole other point: the performance you lose by not being in the sweet spot of VRAM is not huge. In some cases, such as at 4k in one or two titles, the limited VRAM of cards like the Fury X and GTX 980 caused an unpleasant experience. But again, outliers. I don't base my decisions on the 0.1%, especially when I can reduce that percentage even further by reducing one or two superfluous settings. And not everyone even plays Wolfenstein or Doom Eternal! You buy the card that suits your needs, not what someone on the Internet tells you to buy for reasons that impact them and them alone.

I guess I'm repeating myself as well at this stage.

That said, I do think VRAM usage will increase over the next two years, more than the increase from the previous generation of consoles. I just don't agree with your gloomy outlook. I've seen very little evidence to suggest a 3070 will become obsolete within a year. Seeing as it's better suited to 1440p already, if there are one or two titles with higher VRAM demands, just do what PC gamers have been doing for 30 years and play with the game settings until you find your sweet spot. That's what they're there for.
__________________
ASUS X370 Crosshair VI Hero ⁞⁞ Ryzen 1600X 4Ghz ⁞⁞ Thermalright Le Grand Macho RT ⁞⁞ Aorus GTX 1080 11Gbps ⁞⁞ G.Skill TridentZ 3200Mhz
Jonsbo W2 ⁞⁞ Corsair AX760 ⁞⁞ Pexon PC ⁞⁞ Samsung 960 EVO 250GB & 850 EVO 500GB
⁞⁞ Western Digital 1TB Blue & 3TB Green
BenQ XL2730Z ⁞⁞ Mixonix Naos 7000 ⁞⁞ Corsair K70 Cherry MX Brown ⁞⁞ Audio-GD NFB-15 ⁞⁞ EVE SC205 ⁞⁞ AKG K7XX
  #34  
Old 29-10-20, 04:47 PM
AlienALX
OC3D Elite
 
Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by AngryGoldfish
You're just repeating the same arguments at this point.

Saying it's "not just Doom" doesn't mean it's the majority, or even a significant minority. Are you really basing your purchases on 0.1% of the gaming catalogue? If you are, that's cool. But why presume everyone else will? It ignores the fact that consumers can choose to turn one or two superfluous settings down to stay within their VRAM limit, as you may have to do in some games at 4k anyway. You act as if people are buying SLI GTX 770s for modern games. You're taking a small handful of adjustable instances and blowing them out of proportion.
Then my viewpoint is: buy a console.

As for whether I would be bothered about a game I absolutely do play all of the time? Yes, yes I would be bothered about it. Especially given the 3070 costs £550 for a pre-order at the time of typing this, and the rest of the kit costs about a grand (including chair, monitor etc), so yeah, I would not want to compromise. That is why I game on a PC, so I don't have to compromise. When I do? I sit on the sofa and play Xbox.

As for blowing it out of proportion? Oh man. If you had been burned like I have on cards that were supposedly absolutely fine and had more than enough VRAM, you would get it. That is what I am trying to stop other people from going through.

I could sit here and go on about it all day. 8GB? The 1070 had 8GB years ago. Why have they seen fit not to upgrade that? Etc etc, yada yada.

It's one of those things I guess you need to experience, and be burned by, before it p*sses you off enough to consider it an important factor.

But at £100 more than the cost of an entire Xbox Series X, I would not want to compromise, thank you.
__________________


If you don't like what I post don't read it.
  #35  
Old 29-10-20, 05:27 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 2,147
If you can get away with using a console then great, Alien, go and do it. But telling other people to buy a console when they need a GPU is like telling someone who needs a new truck to go and buy a Vauxhall Corsa because it does everything you personally want anyway, costs less and is easier to maintain.

If you get wee'ed off at turning down settings and that makes you feel burnt then fine, base your decisions on that and buy devices that have all the settings already turned down for you. But it means your advice on this topic is irrelevant to the 99% of PC gamers who do this day in, day out; most PC gamers don't spend over £200 on their GPU and still enjoy their experience.
  #36  
Old 29-10-20, 06:00 PM
AlienALX
OC3D Elite
 
Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by tgrech
If you can get away with using a console then great, Alien, go and do it. But telling other people to buy a console when they need a GPU is like telling someone who needs a new truck to go and buy a Vauxhall Corsa because it does everything you personally want anyway, costs less and is easier to maintain.

If you get wee'ed off at turning down settings and that makes you feel burnt then fine, base your decisions on that and buy devices that have all the settings already turned down for you. But it means your advice on this topic is irrelevant to the 99% of PC gamers who do this day in, day out; most PC gamers don't spend over £200 on their GPU and still enjoy their experience.
Firstly, people don't "need" a GPU. If they do, they have issues I can't solve.

The new consoles are set to be 4k, meaning they will offer a very similar experience to a GPU costing more, with reduced settings of course. This isn't about cars.

As for turning down settings? I tried that on my Fury X. It didn't work. The textures at 4k were simply too large to fit in the VRAM buffer, so the card would black screen. I could have dropped the res of course. However, after spending £1200 on two, I think that would have been a bit crap, given I had a 4k monitor at the time.

I don't think my advice is irrelevant, but thanks anyway. People are free to do exactly what they wish.
__________________


If you don't like what I post don't read it.
  #37  
Old 29-10-20, 06:23 PM
tgrech
OC3D Elite
 
Join Date: Jun 2013
Location: UK
Posts: 2,147
Quote:
Originally Posted by AlienALX
Firstly, people don't "need" a GPU. If they do, they have issues I can't solve.
Come on man, seriously? In this age of WFH? My point is not that your advice won't apply to anyone; it's that everyone's situation is different, and that you can give guidance without telling people they should take a wildly different route to the point of likely irrelevance to them.

Telling other people they don't need a GPU is like telling other people they don't need a certain type of screwdriver; how can you know what tools they need to make their bread? There are so many fields that benefit from GPGPU now. How do you even know that upgrading the home PC they already own and game on isn't a cheaper route than buying into a new ecosystem? For sure, give guidance and advice, but saying "just buy a console" genuinely doesn't help anyone, in the same way that telling someone who wants a gaming laptop to buy a desktop PC instead is useless advice, and repeating it over and over again in several threads doesn't help any more.

To be honest, at this point it doesn't really look like you're giving advice at all, especially given where you're saying it; it just seems like a load of rants driven by choice-supportive bias.
  #38  
Old 29-10-20, 07:48 PM
AngryGoldfish
Old N Gold
 
Join Date: Jan 2015
Location: Ireland
Posts: 3,024
Quote:
Originally Posted by AlienALX
As for turning down settings? I tried that on my Fury X. It didn't work. The textures at 4k were simply too large to fit in the VRAM buffer, so the card would black screen. I could have dropped the res of course. However, after spending £1200 on two, I think that would have been a bit crap, given I had a 4k monitor at the time.
Maybe this is where all this comes from. The Fury X really was VRAM limited, and those who were convinced by AMD's legitimate-sounding spiel must have been peeved off. I don't wanna be a peehead and rant and rave about the same thing. I'll be honest, it drives me bonkers seeing the same arguments repeated over and over again when they don't make sense. But maybe this isn't even about that; maybe it's more about how AMD messed up big time and put a great many consumers in a difficult position, all because of VRAM.
__________________
ASUS X370 Crosshair VI Hero ⁞⁞ Ryzen 1600X 4Ghz ⁞⁞ Thermalright Le Grand Macho RT ⁞⁞ Aorus GTX 1080 11Gbps ⁞⁞ G.Skill TridentZ 3200Mhz
Jonsbo W2 ⁞⁞ Corsair AX760 ⁞⁞ Pexon PC ⁞⁞ Samsung 960 EVO 250GB & 850 EVO 500GB
⁞⁞ Western Digital 1TB Blue & 3TB Green
BenQ XL2730Z ⁞⁞ Mixonix Naos 7000 ⁞⁞ Corsair K70 Cherry MX Brown ⁞⁞ Audio-GD NFB-15 ⁞⁞ EVE SC205 ⁞⁞ AKG K7XX
  #39  
Old 29-10-20, 08:40 PM
AlienALX
OC3D Elite
 
Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by AngryGoldfish
Maybe this is where all this comes from. The Fury X really was VRAM limited, and those who were convinced by AMD's legitimate-sounding spiel must have been peeved off. I don't wanna be a peehead and rant and rave about the same thing. I'll be honest, it drives me bonkers seeing the same arguments repeated over and over again when they don't make sense. But maybe this isn't even about that; maybe it's more about how AMD messed up big time and put a great many consumers in a difficult position, all because of VRAM.
Actually, that is not where it all comes from. It happened to me with the GTX 470 also, a card I retired for no other reason than it not having enough VRAM.

I'm not going to keep doing this to death. I am allowed, nay, entitled to my opinion and that's it. I didn't say anywhere that anyone should listen to me, nor do I care if they do or don't. It seems to be an opinion shared by many reviewers, but it was mine before they even said anything. We are about to take a big leap in gaming for many reasons, and I just don't think 8GB will be enough for 4k, which is what this card should be aimed at based on its tier of performance. I don't find that unreasonable.

I also feel, as I type this (again, with no crystal ball), that it's overkill for 1440p. That's all. I am allowed that opinion too.

I have often said I would rather have seen this card with more VRAM at a higher price, and I retain that opinion too. Probably because I don't often upgrade my GPUs, so even as a paid-for safety net I would have preferred it to be there. Again, I am entitled to feel that way.

As for AMD? TBH, it wasn't AMD who made those claims. It was everyone else with their opinions, and their opinions turned out to be wrong. No, the memory itself being faster does not mean it can break the physics of how games load into and out of VRAM and make magic things happen. That was the crap being spread around at the time by reviewers: "It's more than enough because it's so fast!" Well, guess what? That opinion was wrong.

TBH, if people didn't continue arguing with me over it then I would stop posting about it. It just seems that everyone (not including you in this) wants to be right when, honestly, they have no idea whatsoever what the future holds. And it is because of this that I find the stingy amount of VRAM this card possesses to be just that: stingy. Especially when, going back a few generations, the cards had the same.
__________________


If you don't like what I post don't read it.
  #40  
Old 29-10-20, 09:47 PM
looz
OC3D Elite
 
Join Date: Feb 2013
Location: Finland
Posts: 2,012
Quote:
Originally Posted by AlienALX
I'm not going to spend hours and hours going through it all again.
__________________
i7 8700k - 16GB - 2060 FE - 660p 1TB + MX500 2TB - HE-4XX w/ Topping D30+A30