AMD obliterates Nvidia in early Battlefield 5 benchmarks

Moving into DirectX 12, the performance gap widened, with the average and minimum framerates of Nvidia's GTX 1060 dropping to 41 and 29 FPS respectively, while AMD's RX 580 maintained the same average framerate and achieved a higher minimum framerate of 59 FPS.
I only understood that part by looking up the actual review :D In DX12 at 1080p, the 580 achieves the same 68 FPS that it achieves in DX11.
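
For anyone unsure how those figures are produced: benchmark tools record per-frame times and then derive the average and minimum FPS from them. A minimal sketch with made-up numbers (none of these values come from the review):

```cpp
// Sketch: deriving average and minimum FPS from a capture of per-frame times.
// The frame times below are invented purely for illustration.
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>

int main() {
    // Hypothetical frame times in milliseconds from one benchmark run.
    std::vector<double> frame_ms = {14.7, 14.9, 15.1, 16.0, 14.5, 24.0, 15.2, 14.8};

    // Average FPS = total frames / total time in seconds.
    double total_ms = std::accumulate(frame_ms.begin(), frame_ms.end(), 0.0);
    double avg_fps  = frame_ms.size() * 1000.0 / total_ms;

    // "Minimum" FPS is taken from the slowest frame (reviews often use a 1% low instead).
    double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());
    double min_fps  = 1000.0 / worst_ms;

    std::cout << "avg: " << avg_fps << " FPS, min: " << min_fps << " FPS\n";
}
```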

Deffo a result that many people didn't expect at all.
 
Better minimum in DX12 though.

Let's see if Nvidia optimise Pascal, or just ditch it and move on to their new tech.
 
It's like saying:

"Mercedes was way better in practice than Ferrari... though Ferrari had only three tires attached."

It's acknowledged in the article, but the article basically doesn't say much of value.


While AMD's performance here is impressive, it must be remembered that Battlefield V was in beta at the time, lacking optimised drivers and the same levels of polish that the final game will offer. DICE has plenty of time to improve the game's performance on both AMD and Nvidia hardware, so these results are not necessarily indicative of the game's release day performance.
 
Having minimums that are better than the other camp's averages is a huge gap, in both DX11 and DX12.
 
Nvidia solution :"No problem,just cut those unseen objects and textures in distance and we are good to go."
 
Nvidia solution :"No problem,just cut those unseen objects and textures in distance and we are good to go."

There was actually a big debacle about this on the OCUK forums when the Fury X and 980 Ti were going head to head. Side-by-side screenshots in BF4 showed the 980 Ti had less distant detail and slightly blurrier textures than the Fury X, giving the 980 Ti a performance advantage.
 
DICE was always AMD-partnered before this... it's an Nvidia title now though, right? So I doubt this says much about the release state of the game... unless DICE double-crossed them LOL

Or maybe it's just normal GameWorks...
 
Nvidia solution :"No problem,just cut those unseen objects and textures in distance and we are good to go."

That's called culling. Every modern GPU architecture from the past handful of years does this; Nvidia is just the most aggressive with it. Culling isn't a bad thing, and painting it as something negative the way you are doesn't make sense.
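
For context, here's a minimal sketch of what distance-based culling means in principle: far-away objects are skipped before draw calls are ever issued. This is a generic illustration, not how Nvidia's driver or Frostbite actually implements it; the types and cutoff values are made up.

```cpp
// Generic distance-culling sketch: keep only objects close enough to be worth drawing.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct Object {
    Vec3  position;
    float draw_distance;  // hypothetical per-object cutoff
};

static float distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Return only the objects within their draw distance of the camera.
std::vector<Object> cull_by_distance(const std::vector<Object>& scene, const Vec3& camera) {
    std::vector<Object> visible;
    for (const Object& obj : scene) {
        if (distance(obj.position, camera) <= obj.draw_distance) {
            visible.push_back(obj);
        }
    }
    return visible;
}

int main() {
    std::vector<Object> scene = {
        {{0.0f, 0.0f, 10.0f}, 100.0f},   // nearby, kept
        {{0.0f, 0.0f, 500.0f}, 100.0f},  // far away, culled
    };
    Vec3 camera{0.0f, 0.0f, 0.0f};
    std::printf("visible objects: %zu\n", cull_by_distance(scene, camera).size());
}
```

The debate in the thread isn't about culling itself but about how aggressively a vendor trims distant detail, which is a tuning decision rather than a flaw in the technique.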
 