Battlefield V has been updated with support for DXR Ray Traced Reflections


Watched some BATTLEFIELD V gameplay today.

8700K + 2080 Ti, 1080p gameplay getting around 100fps with ray tracing turned on. With a system of this spec I would rather be playing at 4K or 1440p with ray tracing turned off.
££££££££ to play at 1080p ... lol
 

Considering that four months ago even running it in real time wouldn't have been possible, it's quite a feat to even hit 100fps.

So the money is justified in that sense. However, outside of that, from a consumer perspective it is not worth the money, which I agree with you on.
 

Hold on there a minute. It's still not possible to run the whole game with ray tracing. The game still uses rasterization for everything other than water and reflections.

And it still has a major impact on the framerate. We'll only be seeing hybrid ray tracing for some time.

"Considering 4 months ago even running it wouldn't be possible in real time" is what Nvidia wants you to say. BFV still isn't 100% ray traced, and a 1080 Ti would have the exact same performance. DLSS is where we might see some nice performance gains with continued driver support, and that will also take time.
 
You can have your opinion. As someone who has talked to developers, I can say they all agree it's damn impressive. I'm not at the point in my education where I can fully appreciate ray tracing on a technical level, but it's still impressive.
 
A 1080 Ti doesn't have the hardware for RTX.

Sure, it's a performance-hampering gimmick for now, but it's not behind a mere software lock.
 

Even what you posted here is far higher than what was tested at Guru3D, unless that video used low settings? At 1080p ultra with DXR on it tanks down to around 40-50fps.

It's a massive hit on performance. I wonder if the day-one patch has a few issues that remain.
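For perspective, the drop is clearer in frame time than in fps, since fps differences understate the cost at high frame rates. A quick sketch using the figures quoted above (illustrative arithmetic only):

```python
# Frame-time view of the DXR hit: 100fps vs the ~40-50fps reported at ultra.
def frame_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

no_dxr, with_dxr = frame_ms(100), frame_ms(45)  # figures quoted above
print(f"{no_dxr:.1f}ms -> {with_dxr:.1f}ms, +{with_dxr - no_dxr:.1f}ms per frame")
# prints "10.0ms -> 22.2ms, +12.2ms per frame"
```

So at these numbers the reflections pass is costing more frame time than the entire rest of the renderer.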
 

Ray tracing is computation; any card could compute it, but to what extent is the question. Ray tracing is a rendering technique; you can ask anything to compute it. Is your hardware good enough to do it in real time, though?

So yes, it's hard-coded as a software lock, because there's no way older cards could get the same results and they want to sell new tech, but I would still expect a 1080 Ti, if the drivers and software permitted it, to be able to produce ray traces in real time.

It would suck, and suck even more than on the RTX 2070, but ray tracing as a rendering technique never required "special RT cores". Nvidia's version of RTX "does".

But tell me: since Nvidia added hardware to get real-time RTX, why is it affecting FPS so much, even in environments where there's no reflection? If it's ONLY the extra hardware handling RTX, overall FPS (even in scenes with no reflections) should not be impacted as much.

DLSS might be game-changing in the near future, but selling $1200 GPUs so you can get under 50fps in a game where only reflections are ray traced? RTX isn't ready yet; we need cards about twice as powerful as the 2080 Ti, and that's for REFLECTIONS only.
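On the point that ray tracing is "just computation": at its core it is solving intersection equations, which any processor can evaluate; the hardware question is purely throughput. A toy ray-sphere intersection in Python (illustrative only, nothing like a production RT core):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest ray-sphere intersection, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t -- plain arithmetic any processor can run.
    """
    # Vector from sphere center to ray origin
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the origin count

# A ray pointing down -z toward a unit sphere 5 units away
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # → 4.0
```

A GPU's "RT cores" accelerate exactly this kind of intersection test (plus the acceleration-structure traversal around it); a card without them can still run the same maths on its shader units, just far slower.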
 
Sorry JeffDee, but that's absolute fairy talk. There are real-time ray tracing demos you can run on a 1080 Ti, yes, but it takes a very liberal interpretation of the word "real time". Here's the famous Star Wars demo on the 1080, 1080 Ti and RTX cards:
https://vimeo.com/290465222

Dedicated ray tracing hardware and cores are a concept that's been around for decades. Yes, almost any type of processing core can be used for almost any type of calculation, but some are much better at certain types than others (that's a massive oversimplification, but it's why all modern CPUs and GPUs have at least tens of distinct types of logic/processing unit within each "core"). Even back when ray tracing was purely the realm of 3D model/scene artists rendering still images, there were companies releasing hardware optimised for the heavier portions of the ray tracing pipeline, to make sure those still renderings weren't hogging your workstation's CPU for days.

Just because the RT cores can't do the RT as fast as the mature shader and rasterisation units handle the rasterisation side of things (which has been optimised from a hardware and software perspective for decades) doesn't mean it's not using dedicated logic blocks to achieve that performance; it just means the computation those blocks have to do is still far more intensive and unoptimised than rasterisation (which is wholly expected, given that not many years ago, when this calculation was often done on hundreds of rack-mounted CPUs, we'd usually talk about rendering ray-traced frames on a timescale closer to frames per day). There's a reason a Turing die has over 1.5x the number of transistors but only a fractional bump in traditional shader performance.
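To get a rough sense of why the workload is so heavy, here is some back-of-envelope arithmetic (all numbers are illustrative assumptions, not measured figures from BFV or any vendor):

```python
# Back-of-envelope ray budget for real-time hybrid ray tracing.
width, height, fps = 1920, 1080, 60  # assumed 1080p/60 target
rays_per_pixel = 2                   # assume 1 primary + 1 reflection ray
bounces = 1                          # hybrid renderers keep bounce counts low

primary = width * height * fps       # primary rays per second
total = primary * rays_per_pixel * (bounces + 1)

print(f"{primary/1e6:.0f}M primary rays/s, {total/1e9:.2f}G rays/s total")
# prints "124M primary rays/s, 0.50G rays/s total"
```

Each of those rays involves traversing an acceleration structure and running intersection tests, which is why even modest per-pixel ray counts demand dedicated hardware to stay inside a 16ms frame budget.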

I don't think anyone has any expectation of rendering full scenes in real time for gaming within the next half decade at least, regardless of hardware optimisations. That would be an incredible feat, but also a semi-pointless one, given that many aspects of a scene's rendering wouldn't benefit visually in a meaningful enough way to justify using a pipeline that is inherently, essentially mathematically, impossible to make as efficient as rasterisation.

Next-generation RTRT hardware will definitely offer a noticeable performance bump as kinks are ironed out, but it won't be the bump we got from no RT hardware to having it. From there on, I suspect a lot of RT performance gains will come from shifting the allocation of dedicated cores towards a higher percentage of RT hardware against traditional shader hardware as more games start to use it; then we can have a small cluster of highly efficient traditional shaders doing the bread and butter, with a large cluster of RTRT shaders to "spice up" the scene while still keeping up with the traditional portion. Finding that balance will also be a matter of optimising software and finding new shortcuts and the like, too, of course.

Using ray tracing for anything other than reflections, refractions, shadows, etc. is mostly pointless with current, and likely future, technology for up to a decade ahead.
 

I think the most damning, poignant thing he says there is "You would be hard pushed to notice it".

Yesterday I went looking for RT footage on YT. I finally managed to find a scene from MP where the guy was running around and around a frozen piece of shiny ice, and finally, after about a minute, I was able to make out what it was doing. At first I wondered why he kept running in a circle; then it dawned on me he must have been mouthing "Look here! Look here!".

The problem is that unless you stand perfectly still and then spend ages looking for it, you just can't see it. This is something I mentioned a while ago: fast-paced action scenes are about moving and staying 'yo ass alive, not standing there looking for effects. The RT we have right now would be ideal for a slow-paced adventure game, where you can just stand about looking at the world around you. Something like Ethan Carter.

What's even more damning for this gen of RTX cards, though, is the 2070. Loads of people bought those. Thing is? They're barely capable of running this game on low. How are they going to fare when Metro turns up?

This is out of order from Nvidia, IMO. That scene they showed in BFV, in the street with the *enormous* puddles and RTX, was obviously staged, and they gave no mention whatsoever of the actual hardware running it (probably four Titan Vs).

I mean, that scene in the demo? Yeah, RTX did look impressive. So colour me gobsmacked when I found out you have to run around a puddle for five minutes to actually see it.

Even the effect on the gun in that guy's video up there is pathetic. That scene would look better at 4k with RTX off.
 
Nvidia do say the top three settings for RT in BFV are currently bugged, and this is clearly a fairly rushed and buggy implementation as it stands. It's inevitable that the first time any technology is implemented, it will probably be in a way that's far from maximising its on-screen potential. I don't think it's really sensible to pass judgement on even current RT hardware's capabilities until there's been a long enough period of user feedback for implementations to mature. Have you ever compared games early in a console's lifecycle to ones towards its end? Same hardware, but a world of difference in techniques and utilisation.

Optimising graphics hardware is not just about the efficiency of the hardware at doing a computation; it also relies heavily on psychology and visual tricks, and always has done. Getting the most out of graphics hardware is more about working out where you can cut corners without it being too visually noticeable than about how much you can speed up operations in the post-Dennard-scaling world.

This is why we will likely never see ubiquitous "true 4K" or full-scene ray tracing on consoles: you can get 95% of the result for a fraction of the computational power by using the right techniques. Even more so once real-time AI-based image manipulation becomes standard in the post-processing pipeline. Future graphics pipelines will be mostly low-precision calculations with lots of fast-and-dirty algorithms for "fixing" the image afterwards, but as long as you don't notice, that basically just means faster rendering.
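A crude sketch of the "render cheap, fix the image afterwards" idea: render at low resolution, then reconstruct a full-resolution frame. Real pipelines use far smarter reconstruction (DLSS uses a learned model); this nearest-neighbour toy only illustrates the resolution/cost trade-off:

```python
def upscale_nearest(img, factor):
    """Nearest-neighbour upscale of a 2D list of pixel values.

    A stand-in for 'render at lower resolution, reconstruct afterwards'.
    Rendering at half resolution means roughly a quarter of the shading work.
    """
    # Repeat each row `factor` times; within each row, repeat each pixel.
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

# Render a 2x2 frame "cheaply", then blow it up to 4x4
small = [[1, 2],
         [3, 4]]
big = upscale_nearest(small, 2)
# big == [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The whole bet behind DLSS-style reconstruction is that a learned filter can make the upscaled frame perceptually close to a native render while paying only the low-resolution shading cost.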
 
Someone over at the OcUK forum just said it was like early tessellation. I agree, tbh. It was barely noticeable in Dirt 2, and you literally had to pull up your car, take screenshots and compare the two.

Nvidia have totally oversold it, tbh. Even the guys with a 2080 Ti are saying "Well, I will play without RTX for now, as the performance hit isn't worth it".

Give it four years and it'll be a game changer. For now? Yeah, neeeeext.
 