WYP
Looking like the 6800 XT for me, unless the price conversion sucks. The prices were a little off from my expectations, but not massively, and the 6800 XT seems to be the best pick of the three; it was at least spot on with what I expected.
The price of the 6800 is too damn high!
Pretty hot take to call the 3070, which can maintain 4k60 at high settings in most modern games, "not a 4K card".
I have a feeling you will. I'm not going to spend hours and hours going through it all again.
It is enough, but they'd rather have you pay the premium for the 3080. 8GB is not even enough now, and you can't even get one yet. What do you think will happen when devs get access to double the amount of VRAM they have now?
But the simplest way of putting it? Why was it not marketed as a 4K card? The reviewer's guide is for 1440p, which shows it in its best light. There's a reason for that.
The VRAM allocation makes most sense for 1440p gaming. 8GB of GDDR6 memory is fine for 1440p ultra settings right now, though at 4K we are starting to see one or two titles eat more than that.
https://www.kitguru.net/components/...s/nvidia-rtx-3070-founders-edition-review/31/
Now we can argue about the nature and choice of an 8GB GDDR6 framebuffer, but in most use cases, it's enough, and we understand the choice made here; NVIDIA needs to keep that bill of materials healthy, as otherwise, this 499 USD card would have easily been 650 USD based on that 16GB decision. With future games in mind, this will turn out to become a WQHD (2560x1440) resolution card.
https://www.guru3d.com/articles_pages/geforce_rtx_3070_founder_review,32.html
At the end of the day, there is only one aspect of the GeForce RTX 3070 to view critically: the memory capacity. At 8 GB, it is on par with the GeForce GTX 1070, which was released four years ago in June 2016. As of today, 8 GB is usually still enough for WQHD, the primary resolution of the GeForce RTX 3070; only Ghost Recon Breakpoint wants more at this resolution in our test course. But after four years at this level, and with next-gen consoles on the doorstep, the likelihood that more memory will be useful in the future for the highest detail settings in WQHD has increased significantly at the end of 2020. This is even more true for UHD.
Apart from the memory, the GeForce RTX 3070 does everything right
https://www.computerbase.de/2020-10/nvidia-geforce-rtx-3070-test/4/#abschnitt_fazit
Should you ever feel VRAM is running out, just sell the RTX 3070 and buy whatever card is right at that time.
https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/41.html
The only potential gotcha is the card’s 8GB of memory. For the vast majority of games available today, 8GB should be adequate with maximum image quality, even at high resolutions, but moving forward that 8GB of memory may require some image quality concessions to maintain smooth framerates.
https://hothardware.com/reviews/nvidia-geforce-gtx-3070-ampere-gpu-review?page=4
You only have 8GB on the RTX 3070, but this is enough for current and even upcoming next-gen games if you're only playing at 1080p, 1440p, and 3440 x 1440.
https://www.tweaktown.com/reviews/9...unders-edition/index.html#Whats-Hot-Whats-Not
Some might have wanted to see more than 8GB of GDDR6 memory on the GeForce RTX 3070, but that shouldn't be an issue in current game titles for 1440p gaming. If a game comes out in the future that needs more than 8GB for ultra image quality settings, then the solution would be to just change the game settings.
https://www.legitreviews.com/nvidia-geforce-rtx-3070-founders-edition-review_222986/15
The amount of memory on the RTX 3070 is also a point of discussion: in 2020, the release of a video card costing more than 500 euros with the same eight gigabytes of video memory as its two predecessors makes us frown somewhat.
https://tweakers.net/reviews/8274/n...-edition-high-end-prestaties-voor-minder.html
Nvidia never tires of emphasizing that the memory capacities of the RTX 30 graphics cards were chosen deliberately, as they are sufficient. Our observations and measurements say otherwise, and we are not just looking at edge cases. In fact, the Geforce RTX 3070 8GB shows symptoms of memory deficiency somewhat more often at its intended WQHD resolution than the Geforce RTX 3080 10GB does at Ultra HD. Both models are propped up by forgiving streaming systems, which address the graphics memory only up to a percentage upper limit and simply omit texture details. We are not telling loyal PCGH readers anything new here. Let me tell everyone else: what is still enough today may be too little tomorrow.

We are about to see a new generation of consoles launch, which will bring new multiplatform games with new graphics and new standards. Although new technologies such as Variable Rate Shading and DirectStorage are in the starting blocks to increase efficiency, experience shows that it takes a long time for such ideas to penetrate the market. We are talking about years, not months. On the other hand, most developers do not have the time to spend months optimizing their work, so there will always be games that simply run poorly. PC gamers like to tackle this problem with strong hardware, but the Geforce RTX 3070 lacks this buffer. Memory hogs like Horizon: Zero Dawn, Ghost Recon Breakpoint, Wolfenstein Youngblood and a few others are already showing where the journey is headed. Ray tracing makes things worse due to its increased memory requirement.
https://www.pcgameshardware.de/Gefo...-Ti-Release-Benchmark-Review-Preis-1359987/4/
Best one for me was TPU's: just punt it when it runs oot!
But who in the real world actually forces themselves to play all games at max settings?