Quake II RTX Revealed By Nvidia - Q2VKPT Just Got Better

Yes. And since almost everyone plays Quake for its multiplayer, I thought that Quake 3 would be an obvious choice for a graphical upgrade.

Actually, no. Multiplayer twitch shooters sacrifice ALL fidelity for the sake of higher FPS. It's also the reason DICE disabled SLI in Battlefield V: they realised users were turning the game into Minecraft-quality detail by setting everything as low as possible in order to get the jump on the opponent. It allowed you to spot players behind obstacles and take them out. A clear exploit that was found when utilising SLI and modifying a few SLI bits via Nvidia Inspector.

Disabling SLI prevented you from lowering the detail beyond the minimum.

Pro Quake teams still practise on CRT monitors in order to maximise FPS, I believe.

In online shooters, FPS is everything.
 
I think it would be more for latency. A CRT monitor should have zero latency on top of frame render time, while a digital display will never achieve this. With multisync versions you can also often push them to around 200Hz depending on the resolution you dial in; your GPU's RAMDAC and the phosphor type used on the screen are the only real limits. I still used one till I had to move a few years ago and realised bringing something I was physically unable to carry myself was a bad idea, besides the whole 200W power draw thing.
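As a rough back-of-envelope sketch of why that matters (the "one frame of processing" figure for a digital display below is an assumption for illustration, not a measured number for any particular panel):

```python
# Rough frame-time arithmetic: why high refresh matters for input-to-photon delay.
# Assumption: the digital display buffers roughly one extra frame for processing
# and scan-out, while the CRT draws the signal essentially as it arrives.

def frame_time_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 200):
    ft = frame_time_ms(hz)
    print(f"{hz:>3} Hz: frame time {ft:5.2f} ms, "
          f"CRT adds ~0 ms on top, "
          f"a display buffering one frame adds ~{ft:5.2f} ms")
```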

But I somehow sold it for £180 in 2015, after buying it for £20 a couple of years before. Rare beauties.

https://www.cnet.com/products/sony-gdm-fw900/
 

They actually fetch a lot of money these days. I sold mine for the same price I bought it for, so no profit, but nothing lost either. And that was a number of years ago.
 
Yeah, shame most modern GPUs have all-digital outputs, rendering them useless for anything above 1920x1200@60Hz (which is so far below the base refresh rate and resolution of that monitor that you just get a blurry, flickering image), since external analogue adapters top out at a 160MHz RAMDAC unless you drop another hundred or so on a better adapter (though it'd be cheaper to just buy a second old GPU for the VGA output). The RAMDACs on later models like my HD7870XT could easily hit 600MHz, allowing them to push 4K or ultra-high refresh rates on a multisync monitor like that one a good 5 years before digital connections could do it well.
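For anyone who wants to sanity-check the numbers, here's a rough sketch; the blanking-overhead factors and the 400MHz RAMDAC figure are assumptions for illustration, not specs for any particular card or adapter:

```python
# Back-of-envelope pixel clock: clock ~= width * height * refresh * blanking overhead.
# The overhead factors are assumptions: ~1.12 for reduced blanking (what cheap
# DisplayPort-to-VGA adapters typically drive) and ~1.3 for classic CRT-style timings.

ADAPTER_LIMIT_MHZ = 160   # typical ceiling of a cheap external DAC
GPU_RAMDAC_MHZ = 400      # assumed integrated RAMDAC spec; some cards claim more

def pixel_clock_mhz(width, height, refresh_hz, overhead):
    """Approximate pixel clock in MHz for a given video mode."""
    return width * height * refresh_hz * overhead / 1e6

modes = [
    ("1920x1200@60, reduced blanking", 1920, 1200, 60, 1.12),
    ("2304x1440@80, CRT timings",      2304, 1440, 80, 1.30),
    ("1920x1200@96, CRT timings",      1920, 1200, 96, 1.30),
]

for name, w, h, hz, ov in modes:
    clk = pixel_clock_mhz(w, h, hz, ov)
    print(f"{name}: ~{clk:.0f} MHz "
          f"(external adapter: {'ok' if clk <= ADAPTER_LIMIT_MHZ else 'too fast'}, "
          f"native RAMDAC: {'ok' if clk <= GPU_RAMDAC_MHZ else 'too fast'})")
```

Anything much past that 160MHz ceiling is where the cheap adapters fall over, which is why a monitor like that really wants a native analogue output.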
 

I really do miss my CRT... I always remember dropping to 800x600 for games, but I never had any FPS complaints. Of course, back then I was oblivious to most of the concerns we have today. That still didn't stop me enjoying the smoothness we no longer see today.
 
I know. But Quake 3 is a very old game too, and it's far more popular. It would have been better for a showcase IMHO.
 