#1 | 04-09-20, 12:04 PM
AlienALX | OC3D Elite | Join Date: Mar 2015 | Location: West Sussex | Posts: 14,767
Nvidia are liars.

I've noticed the first odd thing. I was watching Hardware Unboxed (I think that's the channel) and I reckon I've just caught Nvidia out.



OK so why am I saying that?

Well, firstly, we all know that the 1080 Ti was hardly any slower than the 2080 in pretty much everything. However, look how much better the 2080 was in Doom Eternal at 1440p. It was an exception to the rule.

And then, and here is the massive hole in Nvidia's BS benchmarks, look at 4K.

Why is the 2080 losing so badly to the 1080 Ti at 4K? That can't be right, surely?

Oh, it is. It totally is. Want to know why? As per the video I just watched: "The 2080 here is running out of VRAM. The game is asking for 9.5GB".

So, there goes your "The 3080 is twice as fast as the 2080 in Doom Eternal at 4K!!!1111oneone". It's a loaded test. Nvidia KNEW the 2080 would run out of VRAM and perform terribly.

__________________



If you don't like what I post don't read it.
#2 | 04-09-20, 12:15 PM
WYP | News Guru | Join Date: Dec 2010 | Location: Northern Ireland | Posts: 18,548
TBH, Nvidia are maxing out a game and running it. If their new hardware runs better, that is a legitimate test.

Are Nvidia highlighting good examples for their graphics cards? Yes! Are they lying? No!

You can call the benchmarks BS if you like, but they showed like for like gameplay and didn't cheat.

Another thing to note is that VRAM allocation is not the same as VRAM usage. All modern Call of Duty games are great examples of this. The games cache everything they can and fill even the largest of VRAM buffers, just because they can.
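If anyone wants to see the allocation side of this for themselves, here is a rough sketch that reads NVIDIA's NVML counters via the pynvml Python bindings (my choice of tool, not the only way; any NVML-based monitor shows the same figures). Keep in mind that what it reports is memory that has been allocated on the card, not what a game strictly needs:

Code:
# Rough sketch: read how much VRAM is currently allocated on GPU 0.
# Assumes the pynvml package (Python bindings for NVIDIA's NVML) is installed.
# NVML reports allocated memory, not what a game actually needs, which is
# exactly the allocation-vs-usage distinction above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)

print(f"Total VRAM:     {mem.total / 1024**3:.1f} GB")
print(f"Allocated/used: {mem.used / 1024**3:.1f} GB")
print(f"Free:           {mem.free / 1024**3:.1f} GB")

pynvml.nvmlShutdown()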
__________________
_______________________________
Twitter - @WYP_PC
#3 | 04-09-20, 12:21 PM
trawetSluaP | OC3D Crew | Join Date: Apr 2014 | Posts: 822
Plus there are Digital Foundry's tests, which showed approximately an 80% improvement across multiple titles.
__________________
Enthoo Elite | i9-7900X | Rampage VI Extreme | Dominator Platinum 32GB | RTX 2080 FE | 960 Pro 512GB, 840 Evo 250GB x 2 | EVGA SuperNOVA 1200 P2 | HP Omen 27i, MX27AQ | Custom Watercooling Loop + Aquaero 6 XT

#4 | 04-09-20, 12:23 PM
AlienALX | OC3D Elite | Join Date: Mar 2015 | Location: West Sussex | Posts: 14,767
Quote:
Originally Posted by WYP
TBH, Nvidia are maxing out a game and running it. If their new hardware runs better, that is a legitimate test.

Are Nvidia highlighting good examples for their graphics cards? Yes! Are they lying? No!

You can call the benchmarks BS if you like, but they showed like for like gameplay and didn't cheat.

Another thing to note is that VRAM allocation is not the same as VRAM usage. All modern Call of Duty games are great examples of this. The games cache everything they can and fill even the largest of VRAM buffers, just because they can.
It's a clear example of how the 2080 tanks when it runs out of VRAM. If it tanks then IMO it is not a fair test.

That would be why they used the 2080 and not the 2080Ti in those tests.

Call it what you like, it's total BS.

Quote:
Originally Posted by trawetSluaP
Plus there are Digital Foundry's tests, which showed approximately an 80% improvement across multiple titles.
I'm way ahead of you.

Two of those titles use RT and DLSS, and we already know how much better Ampere is at those two. Also, unless a game uses them, it's a worthless test. Other than five or six games, nothing in your game collection uses them.

As for BL3? At 4K it barely scraped 30 FPS on a 2080 Ti while using only 6.5GB of VRAM. It was a pig of a game then, and it's a pig of a game now. That is why I bought it for my Xbox.
__________________



If you don't like what I post don't read it.
#5 | 04-09-20, 12:31 PM
WYP | News Guru | Join Date: Dec 2010 | Location: Northern Ireland | Posts: 18,548
Quote:
Originally Posted by AlienALX
It's a clear example of how the 2080 tanks when it runs out of VRAM. If it tanks then IMO it is not a fair test.

That would be why they used the 2080 and not the 2080Ti in those tests.

Call it what you like, it's total BS.
They used an RTX 2080 because it launched at the same price. It's not a big conspiracy; they want to get across the message that the new card delivers a lot more value for money.

They also released a video which pitted the RTX 2080 Ti against the RTX 3080.



As always, wait for reviews. What's the point in this conspiracy theory nonsense until you have real data to prove your point?
__________________
_______________________________
Twitter - @WYP_PC
#6 | 04-09-20, 12:32 PM
Warchild | OC3D Elite | Join Date: Feb 2013 | Location: Norway, Oslo | Posts: 6,929
Quote:
Originally Posted by AlienALX
I've noticed the first odd thing. I was watching Hardware Unboxed (I think that's the channel) and I reckon I've just caught Nvidia out.

OK so why am I saying that?

Well, firstly, we all know that the 1080 Ti was hardly any slower than the 2080 in pretty much everything. However, look how much better the 2080 was in Doom Eternal at 1440p. It was an exception to the rule.

And then, and here is the massive hole in Nvidia's BS benchmarks, look at 4K.

Why is the 2080 losing so badly to the 1080 Ti at 4K? That can't be right, surely?

Oh, it is. It totally is. Want to know why? As per the video I just watched: "The 2080 here is running out of VRAM. The game is asking for 9.5GB".

So, there goes your "The 3080 is twice as fast as the 2080 in Doom Eternal at 4K!!!1111oneone". It's a loaded test. Nvidia KNEW the 2080 would run out of VRAM and perform terribly.
Kind of feels like you are reaching a little. Their statement still stands regardless of the criteria.

For a consumer using a 2080 and trying to play at 4K, they are going to see this and appreciate the difference. The VRAM doesn't matter in the slightest; the end result is what matters to most people. And the fact is, the 3080's performance is what it is.
#7 | 04-09-20, 12:35 PM
AlienALX | OC3D Elite | Join Date: Mar 2015 | Location: West Sussex | Posts: 14,767
That is real data. It was tested yesterday and uploaded today.

It's also a clear warning that the 3070 will not be a 4K card. It can't be; it doesn't have enough VRAM.

I don't agree with the price/model stacking, Mark. None of us did! The 2080 was very expensive. The 3080, right now, is the replacement for the 2080 Ti (given the 1080 Ti was £699 or so), and thus it is now the Ti. Forget the 20-series stacking and prices. The 3090 is the new Titan, and it is the same price.

The very fact they are comparing the 2080 to the 3080 is wrong anyway. It should be compared against their last Ti-series card, which cost twice as much as it should have.

I just don't buy the marketing. When the facts come out, fair enough. But they always pull this BS to make things look much better than they really are.
__________________



If you don't like what I post don't read it.
#8 | 04-09-20, 12:37 PM
AlienALX | OC3D Elite | Join Date: Mar 2015 | Location: West Sussex | Posts: 14,767
Quote:
Originally Posted by Warchild
Kind of feels like you are reaching a little. Their statement still stands regardless of the criteria.

For a consumer using a 2080 and trying to play at 4K, they are going to see this and appreciate the difference. The VRAM doesn't matter in the slightest; the end result is what matters to most people. And the fact is, the 3080's performance is what it is.
Then they are foolish.

VRAM is a huge part of gaming, mostly because, well, look what happens when you run out. Nvidia are totally pimping 4K gaming now, right? Well, I can categorically tell you that 10GB won't remain enough for long.

If you want to continue the argument, talk to Kaap. He is the expert on this.
__________________



If you don't like what I post don't read it.
#9 | 04-09-20, 12:45 PM
Warchild | OC3D Elite | Join Date: Feb 2013 | Location: Norway, Oslo | Posts: 6,929
Quote:
Originally Posted by AlienALX
Then they are foolish.

VRAM is a huge part of gaming, mostly because, well, look what happens when you run out. Nvidia are totally pimping 4K gaming now, right? Well, I can categorically tell you that 10GB won't remain enough for long.

If you want to continue the argument, talk to Kaap. He is the expert on this.
I'm not disagreeing with you. We, that is, those of us with even a fraction of knowledge about GPUs, know the importance of VRAM.

But others just want something that gives them the outcome they want, and Nvidia have marketed this cleverly to present a worthwhile upgrade at the same price point. It might be shady (which IMO it isn't), but it certainly doesn't make them liars.

The whole 3.5GB fiasco was most certainly unacceptable. But in this scenario, they are giving their viewers an appealing upgrade path.