Gears 5 Dev reveals "dedicated ray tracing cores" on Project Scarlett

Once consoles get this Nvidia are screwed.

I've mentioned over the past few months just how impressive I've found consoles to be, and it made a lot of sense as to why PC gaming felt so stale and held back. It was because of the huge progress made after the Xbox 360. Progress I had not witnessed.

I had a toss-up over which format to buy BL 3 for, and I ended up going with the Xb1x. Why? 4K 60, apparently. This is the first time in my life that I've ever deliberately chosen to buy a game on console rather than PC. It was the same price, and I can sit on my sofa using my 65" TV. Why would I want it on PC?

I'm not joining the Epic hate campaign, but if I can't have guaranteed cloud saves, and am thus limited to one PC, then I'm just going to buy it on console.

Anyway, I digress. Sorry.

As soon as RT comes along on consoles, the PC is going to be in big trouble, IMO. Nobody, not even those who bought them, likes the 20-series prices, and this will just hammer that home.

Also, whilst I'm here ranting and moaning: I heard Epic Games were charging devs far less than Steam. I thought this would be good for gaming, but prices have increased hugely.

Good PC games are now £50-£60, yet there's no license fee. It was actually cheaper for me to buy for the 1x, tbh.

This, IMO, was the main reason why PC gaming made a resurgence, but that will soon die out if the prices become a joke.
 
Does this guy have half a brain? He literally says he won't talk about new hardware, then continues on to talk about the new ray tracing hardware. Great job keeping it a secret, I guess?

Either way, RT was coming, and I assume most people figured it would. I, however, don't think it'll be a great RT experience, and people will not care about it. I just hope console players from now on have a choice of native 4K/graphics vs a 4K/FPS option, or even a native 1440p/120FPS option.
 
Once consoles get this Nvidia are screwed.
Nope. They'll just move on to the next gimmick. First they tried to appropriate physics to run on their GPU only, instead of the CPU as God intended. Then they tried to appropriate adaptive sync and market it as something unique to their hardware, and now they're trying the same with ray tracing.

They know that it won't last. But gamers are dumb creatures who keep falling for Nvidia's scams and will continue to do so in the future. Nvidia even commits fraud every couple of months and no one seems to care. It is remarkable what they've been able to get away with.
 
Nope. They'll just move on to the next gimmick. First they tried to appropriate physics to run on their GPU only, instead of the CPU as God intended. Then they tried to appropriate adaptive sync and market it as something unique to their hardware, and now they're trying the same with ray tracing.

They know that it won't last. But gamers are dumb creatures who keep falling for Nvidia's scams and will continue to do so in the future. Nvidia even commits fraud every couple of months and no one seems to care. It is remarkable what they've been able to get away with.

Only their gimmick doesn't work when it costs twice as much as a full gaming console.

Where's 3D Vision? And PhysX? And so on? They're gone because they didn't catch on as money-spinners (and you can add G-Sync to that also).

As for dumb? I'd say I'm anything but. That's why I'm back on consoles. Because that's where the real progress has been happening. Progress I find far more exciting than gimmicks.
 
Once consoles get this Nvidia are screwed.

I've mentioned over the past few months just how impressive I've found consoles to be, and it made a lot of sense as to why PC gaming felt so stale and held back. It was because of the huge progress made after the Xbox 360. Progress I had not witnessed.

I had a toss-up over which format to buy BL 3 for, and I ended up going with the Xb1x. Why? 4K 60, apparently. This is the first time in my life that I've ever deliberately chosen to buy a game on console rather than PC. It was the same price, and I can sit on my sofa using my 65" TV. Why would I want it on PC?

I'm not joining the Epic hate campaign, but if I can't have guaranteed cloud saves, and am thus limited to one PC, then I'm just going to buy it on console.

Anyway, I digress. Sorry.

As soon as RT comes along on consoles, the PC is going to be in big trouble, IMO. Nobody, not even those who bought them, likes the 20-series prices, and this will just hammer that home.

Also, whilst I'm here ranting and moaning: I heard Epic Games were charging devs far less than Steam. I thought this would be good for gaming, but prices have increased hugely.

Good PC games are now £50-£60, yet there's no license fee. It was actually cheaper for me to buy for the 1x, tbh.

This, IMO, was the main reason why PC gaming made a resurgence, but that will soon die out if the prices become a joke.

You're fooling yourself, man! Consoles don't do 4K 60fps; they render at something like 1080p that can dynamically scale up to around 1600p when you're looking at the sky, and then that image is upscaled to 4K. You'd notice the drop in image quality if you weren't sitting far away from the screen.

You can achieve the same thing on PC using AMD's or Nvidia's filter library: enable the sharpen filter, then go into your game's settings and set the rendering resolution to around 75%. Even up close to a monitor you'll have a hard time noticing the difference, and you can go lower if you output to a TV and sit far away from it; you won't notice the difference, the same way you don't when playing on a console. That's upscaling.

I personally don't usually like consoles because, to hold framerates, they also tend to give up on anti-aliasing, and I hate aliased edges. But yeah, you can get the same or even better performance than consoles with a low-end PC by using the same tricks consoles do. The output is 4K in resolution, but it was rendered at less than 4K, and that's why it performs well.
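The render-at-a-fraction-then-sharpen trick described above can be sketched in a few lines. This is a toy 1D sketch in Python, not how any real driver filter (AMD's or Nvidia's) is implemented: those operate on 2D images on the GPU, but the idea of rendering fewer samples, upscaling, and applying an unsharp mask is the same.

```python
# Toy 1D sketch of the upscaling trick: render at a fraction of the
# output resolution, upscale by interpolation, then sharpen to
# recover apparent detail. All names here are illustrative.

def render(width):
    """Stand-in for the game renderer: a simple 1D 'scene'."""
    return [((x * 7) % 13) / 13.0 for x in range(width)]

def upscale_linear(src, out_width):
    """Linear interpolation from len(src) samples up to out_width."""
    out = []
    scale = (len(src) - 1) / (out_width - 1)
    for i in range(out_width):
        pos = i * scale
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        t = pos - lo
        out.append(src[lo] * (1 - t) + src[hi] * t)
    return out

def sharpen(img, amount=0.5):
    """Unsharp mask: push each sample away from its local average."""
    out = img[:]
    for i in range(1, len(img) - 1):
        blur = (img[i - 1] + img[i] + img[i + 1]) / 3
        out[i] = img[i] + amount * (img[i] - blur)
    return out

OUTPUT_WIDTH = 3840                          # "4K" output width
internal = render(int(OUTPUT_WIDTH * 0.75))  # 75% render resolution
frame = sharpen(upscale_linear(internal, OUTPUT_WIDTH))
assert len(frame) == OUTPUT_WIDTH
```

The renderer only pays for 75% of the samples, but the output buffer is full resolution, which is exactly why the performance is good while the image still reads as "4K" from a distance.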
 
The fact that PC doesn't have the same calibre of hardware upscaling as consoles is a massive hit against PC at the moment, IMO, but as you point out we do have band-aid solutions that are rapidly improving. I'd still say the checkerboard rendering the PS4 Pro's hardware enabled is a big step above the techniques you can currently use on PC. I've spent a lot of time sitting less than a metre away from my friend's 42" 4K screen with an Xbox One, and it without a doubt looked a world better than 1080p, mostly because 4K on consoles takes aliasing out of the question entirely: even if it's not true 4K, you get a crystal-clear image with no jaggies.

But yeah, fundamentally PC and the fixed home consoles use almost exactly the same hardware architecture/tech now, just with a different hierarchy. That hierarchy does mean consoles will always be able to deliver better value for money with exactly the same tech, albeit at the cost of modularity/upgradability: it comes from the fact that consoles can use large APUs with GDDR memory thanks to their soldered connections. Sockets ruin this, but most would agree they're fundamental to enthusiast PCs, although discrete hardware will never be able to compete on value.
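The checkerboard rendering mentioned above can be sketched as follows. This is a toy Python sketch of the core idea only, under the assumption of a static scene: each frame shades only half the pixels in an alternating checkerboard pattern, and the other half is carried over from the previous frame. A real implementation (e.g. on PS4 Pro) also reprojects the carried-over pixels using motion vectors, which this omits.

```python
# Toy checkerboard rendering sketch: shade half the pixels per frame,
# fill the rest from the previous frame's result.

W, H = 8, 8  # tiny illustrative framebuffer

def shade(x, y, t):
    """Stand-in for the expensive per-pixel shading work."""
    return (x + y * W + t) % 256

def checkerboard_frame(prev, t):
    """Shade one checkerboard phase; keep the other half from prev."""
    phase = t % 2  # which half of the checkerboard to shade this frame
    frame = [row[:] for row in prev]
    shaded = 0
    for y in range(H):
        for x in range(W):
            if (x + y) % 2 == phase:
                frame[y][x] = shade(x, y, t)
                shaded += 1
            # else: reuse prev value (real impls reproject with motion vectors)
    return frame, shaded

# First frame: shade everything to seed the history buffer.
history = [[shade(x, y, 0) for x in range(W)] for y in range(H)]
for t in range(1, 4):
    history, shaded = checkerboard_frame(history, t)
    assert shaded == W * H // 2  # only half the shading cost per frame
```

The per-frame shading cost is halved, which is why the technique let the PS4 Pro target 4K-class output without 4K-class shading power.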
 
But yeah, fundamentally PC and the fixed home consoles use almost exactly the same hardware architecture/tech now, just with a different hierarchy. That hierarchy does mean consoles will always be able to deliver better value for money with exactly the same tech, albeit at the cost of modularity/upgradability: it comes from the fact that consoles can use large APUs with GDDR memory thanks to their soldered connections. Sockets ruin this, but most would agree they're fundamental to enthusiast PCs, although discrete hardware will never be able to compete on value.

Yeah, that's true when talking strictly about gaming: consoles will always be better price/performance. After all, devs can optimise everything for the consoles, as they have fixed hardware, and they usually do; PC hardware ends up falling short even though it's more powerful. I personally can't justify buying a console at the moment, as I can play any game I want on my PC (except exclusives) and get great visuals, and I can even use some of the console tricks to get better performance, as I do in FFXV. But mainly, I don't buy a console because I need a PC for work, and I end up using it for a lot more, like hosting servers, which I can't do with a console. For me it's better to spend the money upgrading my PC so I can get work done faster than to buy a console just for playing games. If I only used my PC to play, then I reckon I'd have ditched it for a console long ago. I still think most of the performance loss on PCs is due to bad ports, though; Tomb Raider 2013 is my evidence here.

That said, I do output my PC signal to a TV when I'm playing, and up until last week what I did was just output 1080p and let the TV's internal upscaler bring it to 4K. I couldn't spot the difference in visuals from the couch, but I had to deal with a little more input lag. Now, thanks to the latest sharpen filters, I can output 4K directly and have it upscaled by the GPU drivers, which works better. What I do disagree with in AlienALX's statement is this:

As for dumb? I'd say I'm anything but. That's why I'm back on consoles. Because that's where the real progress has been happening. Progress I find far more exciting than gimmicks.

I really can't see how progress has been happening on consoles. The only actual tech developed for consoles is upscaling and adaptive rendering techniques, and all of those were already available as software on PC; no dev ever bothered to implement them or make them better. I was learning 3D with adaptive rendering in architectural visualisation long before consoles got it. That said, everything else is developed PC-first, as PC has the powerful components for it; consoles get it years later, after the hardware has matured or devs find a new way to bleed more performance out of the console hardware. You can argue PhysX, but the only reason PhysX didn't catch on was that it was proprietary, so it was clearly better to develop your own physics engine even if it wasn't as good. Nvidia did release the PhysX libraries as open source, and a lot of game engines are taking advantage of that without us usually knowing; the same goes for G-Sync. As for 3D, it didn't work for anyone except theatres; no one could really do anything there.

Yet real-time ray tracing, adaptive soft shadows, realistic hair, good ambient occlusion: all of that was developed PC-first. No matter whether Nvidia did it or AMD did it, consoles are only getting those now, and in limited supply. Crytek recently made a great ray tracing demo with CryEngine, and that's PC-first also, although I don't expect to see it anywhere in practice. Consoles are under pressure because they need to keep up with PC graphics while working with really low-end hardware, and that is remarkable by itself, but no new leap in graphical fidelity happens console-first these days.

What's beautiful about consoles is that they tend to frantically search for every bit of advantage they can get and go for it, while PC tends to go brute force only. But that brute force is what allows devs to innovate and develop new rendering techniques that eventually find their way down to consoles.
 
You're fooling yourself, man! Consoles don't do 4K 60fps; they render at something like 1080p that can dynamically scale up to around 1600p when you're looking at the sky, and then that image is upscaled to 4K. You'd notice the drop in image quality if you weren't sitting far away from the screen.

You can achieve the same thing on PC using AMD's or Nvidia's filter library: enable the sharpen filter, then go into your game's settings and set the rendering resolution to around 75%. Even up close to a monitor you'll have a hard time noticing the difference, and you can go lower if you output to a TV and sit far away from it; you won't notice the difference, the same way you don't when playing on a console. That's upscaling.

I personally don't usually like consoles because, to hold framerates, they also tend to give up on anti-aliasing, and I hate aliased edges. But yeah, you can get the same or even better performance than consoles with a low-end PC by using the same tricks consoles do. The output is 4K in resolution, but it was rendered at less than 4K, and that's why it performs well.

I'm not fooling myself. I've been gaming for 40+ years, and whilst I do see the difference on my PC, it's usually barely noticeable. There is a reason for that. 4K close up at a desk looks great; 4K, however you spin it, on a TV from 10ft away looks every bit as good. Now, if I take a screenshot of that console running at 4K and compare it side by side to the PC? Sure, there's a difference there. However, from those viewing distances you cannot tell the difference. It's another case of "oh, have we all got superhuman powers again now?"

The biggest drawback on consoles for me, and why I got rid of my Xbox 360 after four days, was the lack of anti-aliasing. THAT *is* noticeable no matter how far back you sit. However, with 4K, or upscaling as high as the console can go, there is no shimmer or whatnot to distract you, and what you see is very smooth.

Like I said, I was not aware of just how close to the PC games on consoles had become. It's obvious why, too: they are basically being developed for the two consoles BEFORE the PC is even taken into consideration. So you just have to hope that at the end they'll do more work on the PC version to make it stand out, and hey, they don't bother.

In fact, many PC games these days start up at some crap resolution, and you cannot even get to the settings BEFORE the game has started. So you have to sit through a 30-minute intro cutscene just to be able to tweak the graphics. They couldn't even be assed to make a decent menu.

And that is what I meant when I said I couldn't believe how far consoles had come since I last had one. They really have. PC gaming has, as usual, obviously been an afterthought, but it does seem a very strange coincidence that stuff like CrossFire and SLI support seemed to fall off a cliff when the Xbone and PS4 came out. Why? Because they are x86, so devs had to do almost nothing to get games running on PC, compared to before, when they had to do the work.

And it shows. Like I said, 40+ years I have been gaming, and from 10ft a game on a console with no rough edges looks every bit as good as sitting at my PC. Only the consoles are FAR cheaper (less than a mid-range GPU alone) and about 100x easier to get running, as you have zero issues. Every PC game I have played recently has been a right sod to get working. Even the last DLC for BL2 continually crashed, saying it was out of VRAM. A funny sight when you're on a Titan XP running 1440p.

You can keep on going if you like. Nobody on this forum has been around as consistently as I have, and nobody here was a bigger PC gaming fan than I was. But I'm telling you, PC gaming is being severely held back because these games HAVE to run on consoles. The only game in recent history to push things is Control, so I would enjoy that whilst you can, because it's tremendously rare for them to nerf a console release to show off on the PC. Nvidia must have paid handsomely for that; it's the first real push on the PC since Crysis.

But then let's not forget they probably (well, they do, see the article) have their hands on the next-gen dev boxes now, with the new hardware in.
 
PT2.

This is exactly what AMD wanted. They now have their tech in two consoles, and that is more than one, so they will get the support. Devs now have no choice about who to support first.

Couple that with the problems of the Epic Games Store FOR ME (I will explain), and my last purchase was Borderlands 3 for the Xbox; I shall be running it at 4K 60 on my Xb1x. Why? Because Epic is still in beta, IMO. No cloud saves, which means I can only play on one of my PCs half of the time. The other is at my mother's house, where I spend Sat-Tues. So I may as well buy it on the Xb1x, as the price was £10 more for the PC code.

RTX is one tech available on a small number of cards supplied by one manufacturer. Once Intel enters the market, RT will become even more of a hassle to make games for. And as I said, the last PC exclusive (because Control isn't even an exclusive, even though it supports RT) was Crysis, and look how long ago that was.
 
I think a big part of the reason there's less of a noticeable difference in graphical fidelity now is that traditional raster graphics have kind of hit a wall and haven't really had much in the way of meaningful improvements for over half a decade. It's the same reason the fidelity jump between generations gets smaller and less noticeable each time, even with larger and larger absolute power increases. So many games nowadays look great at minimum settings and only slightly better at ultra because of this.

but it does seem a very strange coincidence that stuff like CrossFire and SLI support seemed to fall off a cliff when the Xbone and PS4 came out. Why? Because they are x86, so devs had to do almost nothing to get games running on PC, compared to before, when they had to do the work.
I think the drop in popularity of mGPU use that led to the drop in focus was a combination of things. Truly large-die GPUs had started to exist with Kepler (Titan) and GCN (Hawaii), so mGPU's "sensible price range" increased significantly beyond what most people spend on PCs (combined with the removal of SLI headers on Nvidia's mid-range-and-below cards). On top of that, all the research from around 2012 onwards into frame timings, and the frame consistency and latency issues inherent to AFR setups, effectively killed its popularity for mid/low-end cards, as that is where those issues were exacerbated. Both of these came before the current-gen console launch and were already seen back then as the nail in the coffin for a fading technology (though there was more optimism at the time about alternatives to AFR for mGPU use, these all seemed to fall short or require very specific games in practice).

RTX is one tech available on a small number of cards supplied by one manufacturer. Once Intel enters the market, RT will become even more of a hassle to make games for. And as I said, the last PC exclusive (because Control isn't even an exclusive, even though it supports RT) was Crysis, and look how long ago that was.
I have to disagree with this one. When Intel and AMD bring gaming-orientated ray tracing support to GPUs, it will be through the DXR API that they and Nvidia developed for exactly this purpose, and which all currently available ray tracing games use. Microsoft took extreme care with DXR to make it completely hardware-agnostic, to the degree that you can run the API on the CPU right now if you wanted, although you'd get a very slow slideshow in all but basic few-polygon demos. There will need to be some new code paths, without a doubt, but that would be a fraction of the work of a full implementation. The similarities between Turing and RDNA are now quite numerous, as is the overall approach to accelerating ray tracing, and given the engineers leading Intel's GPU development, it's unlikely their arch will be a world apart either, especially as diverging would hinder uptake so significantly with modern APIs.
 
Won't AMD use Vulkan for RT? Then again, that should run on Nvidia, right?

Sorry for short reply, now back on phone.
 
AMD were key to the development of DXR (an extension to DX12), just as Nvidia and Intel were; all GPU vendors fundamentally dictate the way any graphics API will go, as it's their hardware it has to work on. We know the Xbox Next will use DXR for its ray tracing, and so we know AMD GPUs will have DXR support (and the RVII's drivers have already had testing functionality in the past). There's nothing stopping Khronos from adding similar extensions to Vulkan (and they've indicated they will), or a developer from doing ray tracing via Vulkan with their own work, but that would be a lot of extra effort when the industry (hardware vendors, game engine developers, and game developers alike, as well as MS of course) has mostly already thrown its weight behind DXR.
 
Only their gimmick doesn't work when it costs twice as much as a full gaming console.

Where's 3D Vision? And PhysX? And so on? They're gone because they didn't catch on as money-spinners (and you can add G-Sync to that also).

As for dumb? I'd say I'm anything but. That's why I'm back on consoles. Because that's where the real progress has been happening. Progress I find far more exciting than gimmicks.
You missed my point entirely.

NVIDIA KNOWS.

They know that their gimmicks won't last. They just don't care. And they will keep doing the same thing, because it works every time. Even if it doesn't last, people still end up buying their overpriced gimmicks.
 
You're fooling yourself, man! Consoles don't do 4K 60fps; they render at something like 1080p that can dynamically scale up to around 1600p when you're looking at the sky, and then that image is upscaled to 4K.

While some games do this, not all do, and the Xbox One X can run 4K fine.

I personally don't usually like consoles because, to hold framerates, they also tend to give up on anti-aliasing, and I hate aliased edges.

This hasn't been true for some time, since the rise of TAA and how effective it is at not only edge AA but shading AA, hiding fireflies from PBR for such little cost.
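The core of TAA mentioned above can be sketched in a toy Python example, under some loud assumptions: a static 1D scene, no motion-vector reprojection, and no history clamping (real TAA needs all three). Each frame the sample position is jittered slightly and the result is blended into an accumulated history with an exponential moving average, which is what smooths edges and damps single-frame fireflies.

```python
# Toy 1D sketch of temporal anti-aliasing accumulation.
# A hard edge at x = 0.5 stands in for an aliasing-prone feature.
import random

def sample_scene(x):
    """Stand-in for shading: a hard black/white edge at x = 0.5."""
    return 1.0 if x >= 0.5 else 0.0

def taa_accumulate(pixel_center, frames=64, alpha=0.1, jitter=0.5):
    """Jitter the sample each frame and blend into a running history."""
    rng = random.Random(42)  # fixed seed so the sketch is deterministic
    history = sample_scene(pixel_center)  # first frame, no history yet
    for _ in range(frames):
        offset = rng.uniform(-jitter / 2, jitter / 2)
        current = sample_scene(pixel_center + offset)
        # Exponential moving average: cheap, and also damps fireflies,
        # since any one bright outlier frame only contributes alpha.
        history = (1 - alpha) * history + alpha * current
    return history

# A pixel straddling the edge converges towards a grey value instead
# of flickering between 0 and 1 from frame to frame.
edge_pixel = taa_accumulate(0.5)
assert 0.1 < edge_pixel < 0.9
```

Pixels away from the edge stay pure black or white, while the edge pixel resolves to an intermediate value, which is why TAA handles both geometric edges and shading noise for such little per-frame cost.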
 