Oxide Developer says "NVIDIA was Putting Pressure on them" to change their DX12 Bench

More nVidia slander? (not directed at you WYP). Then again, seems very believable that nVidia would be doing this.
 
Logan over at TS will be happy. He, like many others, has claimed all along that AMD would get an advantage with DX12.

Just like when the Crysis benchmark came out with some Nvidia shenanigans, and Crysis 2 had the over-tessellated field to cripple AMD.

I think a few driver revisions might fix some of the disadvantage. Just as AMD adjusts tessellation at the driver level, I think Nvidia can overcome some of this.
 
I guess it's time to shove GameWorks more aggressively in more titles in order to make Nvidia seem like a better choice. Because god forbid that Nvidia does something that benefits the consumers.
 
I guess it's time to shove GameWorks more aggressively in more titles in order to make Nvidia seem like a better choice. Because god forbid that Nvidia does something that benefits the consumers.

Of course, that is their absolute first priority (sarcasm BTW)
 
Doesn't surprise me really. They've been slightly lowering IQ through their drivers to get an FPS advantage for quite some time, so games look ever so slightly washed out and lack a few details here and there when playing on my mate's TX rig. Example below -

3VTrDqG.jpg
 
Doesn't surprise me really. They've been slightly lowering IQ through their drivers to get an FPS advantage for quite some time, so games look ever so slightly washed out and lack a few details here and there when playing on my mate's TX rig. Example below -

3VTrDqG.jpg

Funny how there wasn't a big uproar about it though. At least it can be reversed if you want the IQ.

Anyone still remember the time nVidia attacked AMD, saying that AMD was lowering image quality? The irony.
 
If AMD started compressing data the way Nvidia does, AMD would be equally fast... Now that DX12 has something NVIDIA didn't care to build into their architecture but AMD did... that's a bad decision for them.
 
As much as I would like to agree with you, it is not quite true. There are many variables that determine its 'speed', per se.
 
Lately, Nvidia has become more open about cheating their competition. It does us PC gamers no justice to skew numbers to make their product look better than it actually is, or to cripple certain features so that the competition performs poorly. We rely heavily on these benchmarks to decide which product is the best to get in our budget. I'm so tired of one game working better on one card than another when the other card is more than capable of running the game just as well.
 
Doesn't surprise me really. They've been slightly lowering IQ through their drivers to get an FPS advantage for quite some time, so games look ever so slightly washed out and lack a few details here and there when playing on my mate's TX rig. Example below -

3VTrDqG.jpg

Strange how capturing a still of a video in motion can appear blurry if you do it at the wrong time!
 
Doesn't surprise me really. They've been slightly lowering IQ through their drivers to get an FPS advantage for quite some time, so games look ever so slightly washed out and lack a few details here and there when playing on my mate's TX rig. Example below -

3VTrDqG.jpg

As a user of both, I can say there is no big difference in picture quality like the image above is trying to imply.

One thing people need to be careful of is that they run their monitor at its native resolution. My 2160p monitor looks like total rubbish if I run it at 1080p, and it really does look bad with NVidia cards.
 
That screengrab has motion blur written all over it. If you can provide a video with this turned off, then we'll talk :)
 