OC3D Forums > [OC3D] General Forums > OC3D News
  #11
28-10-20, 06:32 PM
looz
OC3D Elite

Join Date: Feb 2013
Location: Finland
Posts: 2,012
I think AMD has plenty of wiggle room to counter possible Super series with a price drop, since for example they're using cheaper GDDR6. Though there's a possibility that 128MB L3 is fairly expensive to etch, etc.

What will also be interesting, and we certainly won't hear about it from AMD, is performance through a PCI-E 3.0 link. We know that the Navi cards with only 8 PCI-E lanes suffered greatly on PCI-E 3.0, and it's not unimaginable that these much more powerful cards would face similar issues even at x16.
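For a rough sense of what that link actually carries, theoretical PCI-E bandwidth per direction is just transfer rate x lanes x encoding efficiency. A minimal sketch using the published per-lane transfer rates and the 128b/130b line code (the function name is just for illustration):

```python
# Theoretical PCI-E bandwidth per direction: transfer rate x lanes x encoding.
GT_PER_LANE = {"3.0": 8.0, "4.0": 16.0}  # gigatransfers/s per lane
ENCODING = 128 / 130                     # 128b/130b line code (PCIe 3.0+)

def bandwidth_gbps(gen, lanes):
    """GB/s per direction for a given PCIe generation and lane count."""
    return GT_PER_LANE[gen] * lanes * ENCODING / 8  # /8 converts bits to bytes

print(f"3.0 x8:  {bandwidth_gbps('3.0', 8):.2f} GB/s")   # ~7.88
print(f"3.0 x16: {bandwidth_gbps('3.0', 16):.2f} GB/s")  # ~15.75
print(f"4.0 x16: {bandwidth_gbps('4.0', 16):.2f} GB/s")  # ~31.51
```

Which shows why an x8 card on a 3.0 link is working with roughly a quarter of the bandwidth a 4.0 x16 card gets.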

I hope their driver stack is rock solid now, though, and since the same architecture is used in the consoles, I have higher hopes of them achieving that.

Currently it does seem that 6800 XT is great for higher refresh rate 1440p, where DLSS is less relevant, and 3080 maybe has the edge overall at 4K.

@MacLeod Your better experience with G-Sync is explained by the stricter requirements for the badge. There are plenty of great FreeSync monitors, but garbage hardware can also wear the badge without issue as long as it sticks to the minimum specs. So studying reviews is required (though IMO anything expensive should be thoroughly researched before buying).

__________________
i7 8700k - 16GB - 2060 FE - 660p 1TB + MX500 2TB - HE-4XX w/ Topping D30+A30
  #12
28-10-20, 06:36 PM
AlienALX
OC3D Elite

Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by AngryGoldfish View Post
The price of the 6800 is too damn high!
I don't think so. I gave the short answer before but the longer answer is more in depth.

Firstly, it obviously has double the amount of VRAM (of the same GDDR6 type), and that costs money. However, as noted yesterday by Hardware Unboxed, the 3070 performs up to 38% slower than the 3080 at 4k. Those 4k results are hampered by VRAM limitations, though: either the VRAM isn't enough (see Doom Eternal) or it doesn't have enough bandwidth for 4k. When not hampered by those issues the 3070 is only 27% slower than the 3080.
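A side note on the arithmetic: "X% slower" and "Y% faster" describe the same gap against different baselines, so they are not interchangeable numbers. A quick sketch, using the 38% figure from above with hypothetical frame rates made up purely for illustration:

```python
# "X% slower" divides the gap by the FASTER card's result;
# "Y% faster" divides the same gap by the SLOWER card's result.
def pct_slower(slow_fps, fast_fps):
    return (fast_fps - slow_fps) / fast_fps * 100

def pct_faster(fast_fps, slow_fps):
    return (fast_fps - slow_fps) / slow_fps * 100

# Hypothetical numbers chosen so the slower card is exactly 38% behind:
fps_3080, fps_3070 = 100.0, 62.0
print(pct_slower(fps_3070, fps_3080))  # 38.0
print(pct_faster(fps_3080, fps_3070))  # ~61.3 - the same gap, other framing
```

So "38% slower" and "about 61% faster" describe the identical benchmark result, which is worth keeping in mind when comparing figures between outlets.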

Which basically means that the 3070 is not a 4k card. It costs £529 (I just checked the only source I can find). So if Nvidia were meeting their MSRP, I would likely agree with you that the 6800 costs too much. However, I find £529 for a 1440p card (it is not a true 4k card, as can be seen by Nvidia showcasing it at 1440p in their reviewer's guide) too much. Sure, it is faster than the 2080 Super and costs a bit less, and sure, it performs as well as a 2080 Ti at 1440p (but not 4k). However, there were already two fantastic 1440p cards (the 2070 Super and 2080 Super). If you desperately need the extra performance at 1440p right now, it may be worth buying. Buying it as a 4k card, though, would be very risky.

So that brings us back to the 6800. This will not suffer those issues: it has better memory bandwidth and twice as much VRAM. Thus, it *is* a bona fide 4k card, which the 3070 could easily have been had Nvidia not derped the VRAM. I guess they did not want it hurting their 3080 sales, but we all know how much those cost.

So for me? I would rather have seen the 3070 ship with at least 10GB of VRAM to make it a proper all-out 4k card, even if it then cost what the 6800 does. Or in other words? I would rather buy the 6800.

Apparently GN say that the reason it's that expensive is that AMD decided to put SAM on it. To me, though? Like I say, I just believe it's a proper 4k card where the 3070 is not.

I just wish Nvidia had released it with more VRAM as a proper 4k card. 4k gaming was supposed to be getting much more affordable, and now they have gone and done this. 3080s cost about £800 at the moment, which is no cheaper than many past 4k cards; in the case of the 1080 Ti it's considerably more.

I also have a feeling Nvidia will ditch Samsung and go back to TSMC as quickly as they can, which concerns me if the drivers need to be different from the Samsung Ampere ones. Either way, they cannot just continue on with Samsung, as if AMD drop prices after launch they will be absolutely screwed.
__________________


If you don't like what I post don't read it.
  #13
28-10-20, 06:47 PM
looz
OC3D Elite

Join Date: Feb 2013
Location: Finland
Posts: 2,012
Pretty hot take to call the 3070, which can maintain 4k60 at high settings in most modern games, "not a 4K card".
  #14
28-10-20, 06:49 PM
AlienALX
OC3D Elite

Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by looz View Post
Pretty hot take to call the 3070, which can maintain 4k60 at high settings in most modern games, "not a 4K card".
I'm not going to spend hours and hours going through it all again.

8GB is not even enough now, and you can't even get one yet. What do you think will happen when devs get access to double the amount of VRAM they have now?

But the simplest way of putting it? Why was it not marketed as a 4k card? The reviewer's guide is for 1440p, which shows it in the best light. There's a reason for that.
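For anyone who wants to see how close their own card runs to that 8GB ceiling during a session, here is a small sketch. It assumes the real `nvidia-smi` CLI is on the PATH; the `parse_vram` helper and the sample line are illustrative, not part of any official tooling:

```python
import subprocess

def parse_vram(csv_line):
    """Parse one 'used, total' line (MiB) from nvidia-smi CSV output."""
    used, total = (int(x) for x in csv_line.split(","))
    return used, total, used / total

def query_vram():
    """Query the first GPU's memory via nvidia-smi (needs an NVIDIA driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"], text=True)
    return parse_vram(out.splitlines()[0])

# Offline example with a sample line (a card using ~7.4 GiB of 8 GiB):
used, total, frac = parse_vram("7623, 8192")
print(f"{used}/{total} MiB ({frac:.0%})")  # 7623/8192 MiB (93%)
```

One caveat worth remembering when reading such numbers: games often *allocate* more VRAM than they strictly need, so a near-full readout is a hint, not proof, of a real bottleneck.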
  #15
28-10-20, 06:51 PM
looz
OC3D Elite

Join Date: Feb 2013
Location: Finland
Posts: 2,012
Quote:
Originally Posted by AlienALX View Post
I'm not going to spend hours and hours going through it all again.
I have a feeling you will.
Quote:
Originally Posted by AlienALX View Post
8GB is not even enough now, and you can't even get one yet. What do you think will happen when devs get access to double the amount of VRAM they have now?

But the simplest way of putting it? Why was it not marketed as a 4k card? The reviewer's guide is for 1440p, which shows it in the best light. There's a reason for that.
It is enough, but they'd rather have you pay the premium for the 3080.
  #16
28-10-20, 06:52 PM
tgrech
OC3D Elite

Join Date: Jun 2013
Location: UK
Posts: 2,147
4K done via DLSS (i.e. rendered at 1440p) is no more fake (and arguably less fake) than the tricks used to produce 4K output in many great-looking console games, and it seems it could be more or less equivalent in principle to what the Xbox Series will use. So I wouldn't say 8GB is totally inadequate for a good 4K experience, just maybe not enough for a perfect native 4K experience over an extended period.
  #17
28-10-20, 06:59 PM
AlienALX
OC3D Elite

Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
It's not enough. End of.

Quote:
Originally Posted by tommybhoy

The VRAM allocation makes most sense for 1440p gaming. 8GB of GDDR6 memory is fine for 1440p ultra settings right now, though at 4K we are starting to see one or two titles eat more than that.

https://www.kitguru.net/components/g...ion-review/31/

Now we can argue about the nature and choice of an 8GB GDDR6 framebuffer, but in most use cases, it's enough, and we understand the choice made here; NVIDIA needs to keep that bill of materials healthy, as otherwise, this 499 USD card would have easily been 650 USD based on that 16GB decision. With future games in mind, this will turn out to become a WQHD (2560x1440) resolution card


https://www.guru3d.com/articles_page...review,32.html

At the end of the day, there is only one topic to be viewed critically with the GeForce RTX 3070: the memory configuration. At 8GB, it is on par with the GeForce GTX 1070, which was released four years ago in June 2016. As of today, 8GB is usually still enough for WQHD, the primary resolution of the GeForce RTX 3070; only Ghost Recon Breakpoint wants more at this resolution in our test course. But after four years at this level, and with next-gen consoles on the doorstep, the likelihood that more memory will be useful in the future for high and maximum details at WQHD has increased significantly at the end of 2020. This is even more true for UHD.
Apart from the memory, the GeForce RTX 3070 does everything right.



https://www.computerbase.de/2020-10/...bschnitt_fazit

should you ever feel VRAM is running out, just sell the RTX 3070 and buy whatever card is right at that time

https://www.techpowerup.com/review/n...dition/41.html

The only potential gotcha is the card’s 8GB of memory. For the vast majority of games available today, 8GB should be adequate with maximum image quality, even at high resolutions, but moving forward that 8GB of memory may require some image quality concessions to maintain smooth framerates.

https://hothardware.com/reviews/nvid...-review?page=4

you only have 8GB on the RTX 3070. But this is enough for the current and even next-gen games coming if you're only playing at 1080p, 1440p, and 3440 x 1440

https://www.tweaktown.com/reviews/96...-Hot-Whats-Not

Some might have wanted to see more than 8GB of GDDR6 memory on the GeForce RTX 3070, but that shouldn’t be an issue on current game titles for 1440P gaming. If a game comes out in the future that needs more than 8GB for ultra image quality settings then the solution would be to just change the game settings

https://www.legitreviews.com/nvidia-...view_222986/15

the amount of memory of the RTX 3070 is also a point of discussion, because in 2020 the release of a video card costing more than 500 euros with eight gigabytes (the same amount of video memory as its two predecessors) makes us frown somewhat

https://tweakers.net/reviews/8274/nv...or-minder.html

Nvidia never tires of emphasizing that the memory capacities of the RTX 30 graphics cards were chosen deliberately, as they are sufficient. Our observations and measurements say otherwise; we don't just look at the average, after all. In fact, the GeForce RTX 3070 8GB shows symptoms of memory deficiency a little more often in the WQHD resolution intended for it than the GeForce RTX 3080 10GB does in Ultra HD. Both models are helped by gracious streaming systems, which fill the graphics memory up to a percentage upper limit and otherwise simply omit texture details.

We are really not telling loyal PCGH readers anything new here, so let me tell everyone else: what is still enough today may be too little tomorrow. We are about to see a new generation of consoles, which will bring new multiplatform games with new graphics and new standards. Although new technologies such as Variable Rate Shading and DirectStorage are in the starting blocks to increase efficiency, experience shows that it takes a long time for these ideas to penetrate the market. We are talking about years, not months.

On the other hand, most developers do not have the time to optimize their work for months. There will always be games that, frankly, run like a sack of nuts. PC gamers like to deal with this problem with strong hardware, but the GeForce RTX 3070 lacks this buffer. Memory hogs like Horizon: Zero Dawn, Ghost Recon Breakpoint, Wolfenstein Youngblood and a few others are already showing where the journey is headed. Ray tracing makes things worse due to its increased memory requirement.

https://www.pcgameshardware.de/Gefor...eis-1359987/4/




Best one for me was tpu, just punt it when it runs oot!
Quoted from another forum. Nearly every tech press rep disagrees with you, as does HWUB.






At 14:30 in the video:

"Increasing the resolution to 4k removes any and all CPU bottlenecks. And here we can see the 3070 is between 19 and 37% slower than the 3080. Though when not limited by VRAM capacity it's only up to 28% slower"


So there you have it, 24 hours of info in one post.

If you are still a "non believer" then it's like arguing over whether Covid is real and 5g sets fire to your testicles. IE - whatever, mate.
  #18
28-10-20, 07:01 PM
tgrech
OC3D Elite

Join Date: Jun 2013
Location: UK
Posts: 2,147
But who in the real world actually forces themselves to play all games at max settings?

Surely "not a 4K card" means "you can't have a good experience at 4K output resolution with this card and contemporary games", rather than just "it takes more of a performance hit at 4K than some over-provisioned cards do".
  #19
28-10-20, 07:05 PM
AlienALX
OC3D Elite

Join Date: Mar 2015
Location: West Sussex
Posts: 15,170
Quote:
Originally Posted by tgrech View Post
But who in the real world actually forces themselves to play all games at max settings?
Right. Then we get into the "Drop the settings!" argument.

However, if you do that? You may as well buy an XBSX, as it will be doing the exact same thing for 4k. I.e., a £529 card with £500 more gear added to it to make it run, then £x for a monitor, keyboard, mouse, desk, chair, mouse mat etc., etc.?

Dropping settings is compromising. If you are gonna do that? Get a bloody console.
  #20
28-10-20, 07:07 PM
tgrech
OC3D Elite

Join Date: Jun 2013
Location: UK
Posts: 2,147
Slightly turning down some badly optimised settings whose visual effect you can't notice anyway (let's be honest, every game has them, usually a few) means you may as well buy a completely different class of device? One that could very well not meet the requirements you have of a PC in game support, setup, or other uses, could cost you more than a GPU upgrade for your existing system, and would lose you your entire library?