ASUS Reveals their 32-inch ROG Strix XG32VQR 1440p 144Hz FreeSync 2 HDR Monitor

1440p 16:9 at 32" is like having 1080p at 27". It's not that nice; I know because I had a Samsung FreeSync 2 32" 1440p monitor a few months ago. Anything over 27" should either be 4K or 21:9 1440p.
 
1440p 16:9 at 32" is like having 1080p at 27". It's not that nice; I know because I had a Samsung FreeSync 2 32" 1440p monitor a few months ago. Anything over 27" should either be 4K or 21:9 1440p.

I can relate. I went from 27" 1080p to 1440p and it seemed like a big step up in fidelity, though the change from VA to TN was not, IMO.
 
To be fair, if you're sat more than a meter away from a 1440p 32" screen, a resolution bump isn't going to be noticeable to an average person (i.e. with 20/20 or corrected vision), aside from anti-aliasing side effects and the like. From my personal experience using 32" screens as monitors, I'm not sure you'd want to sit much closer than a meter away, but that's a preference thing I guess. But yeah, 1440p@32" is only about 10-15% higher PPI than 1080p@27" (or exactly the same as 1080p@24").
 
Depends on how dark your environment is. In a dark room you can get away with 300-400 nits; if it's daytime you'll probably need some blackout blinds, though.
 
Depends on how dark your environment is. In a dark room you can get away with 300-400 nits; if it's daytime you'll probably need some blackout blinds, though.

And that is why I don't think this product is good. I should not have to change my environment to make an already expensive product... work like it should.
 
What I mostly game on these days is a 4K 43-inch TV. I can see a huge difference between 1440p and 4K there.

I have used 27/28-inch 4K screens before and IMHO that screen size is too small for 4K to be worth it, at least for gaming. At 27 inches, 1440p is the sweet spot, at least for my use cases.
 
1440p 16:9 at 32" is like having 1080p at 27". It's not that nice; I know because I had a Samsung FreeSync 2 32" 1440p monitor a few months ago. Anything over 27" should either be 4K or 21:9 1440p.

This is not correct.
The above Asus 32" 1440p has ~92 pixels per inch (PPI), roughly the same density as a 24" FHD monitor.
This is much higher than a 27" 1080p monitor, which has only ~82 PPI.

Of course, the pixel density of the Asus 32" is lower than a 27" 1440p (109 PPI), but it's not as bad as a 27" FHD.
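
For anyone who wants to double-check those figures, here's a quick back-of-the-envelope Python sketch (the sizes and resolutions are just the ones being compared in this thread):

from math import hypot

def ppi(width_px, height_px, diagonal_in):
    # Pixel density: diagonal resolution divided by diagonal size.
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [
    ('24" 1080p', 1920, 1080, 24),
    ('27" 1080p', 1920, 1080, 27),
    ('27" 1440p', 2560, 1440, 27),
    ('32" 1440p', 2560, 1440, 32),
]:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# 24" 1080p: 92 PPI
# 27" 1080p: 82 PPI
# 27" 1440p: 109 PPI
# 32" 1440p: 92 PPI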
 
94% P3 coverage is fairly good, but 300 nits is just normal monitor brightness; it does state it's only a DisplayHDR 400 monitor, however.
 
Depends on how dark your environment is. In a dark room you can get away with 300-400 nits; if it's daytime you'll probably need some blackout blinds, though.

Not really. Half the point of HDR is having super bright colors. It helps distinguish between colors and widens the difference between whites and blacks. Yeah, sitting close isn't smart with a bright TV, but this certainly won't look nearly as good as a TV with proper HDR, as it'll be limited in its brightness output. I'm not saying it will be bad, but I am saying it won't be worth the extra expense this monitor will bring.
 
This is not correct.
The above Asus 32" 1440p has ~92 pixels per inch (PPI), roughly the same density as a 24" FHD monitor.
This is much higher than a 27" 1080p monitor, which has only ~82 PPI.

Of course, the pixel density of the Asus 32" is lower than a 27" 1440p (109 PPI), but it's not as bad as a 27" FHD.


You over-analysed it. Have you tried 1080p at 27"? It's a mess, and the same goes for 1440p at 32".

Anything over 24" needs to be 1440p, and anything over 27" needs to be either 4K or 21:9 1440p, otherwise it just looks too blocky.
 
Not really. Half the point of HDR is having super bright colors. It helps distinguish between colors and widens the difference between whites and blacks. Yeah, sitting close isn't smart with a bright TV, but this certainly won't look nearly as good as a TV with proper HDR, as it'll be limited in its brightness output. I'm not saying it will be bad, but I am saying it won't be worth the extra expense this monitor will bring.

1. A high dynamic range is just that. You can either reach that range by allowing standard-ish brightness monitors to go darker (OLEDs generally take this approach, but local dimming techniques have made it prolific on other panel types too), or by taking normal-brightness monitors and allowing them to get brighter. Technically both approaches can offer an accurate high dynamic range, as long as the environment suits them.

2. Luminance (nits) is not a measurement of perceived brightness (by which I just mean brightness, given it is inherently perceived and not a fundamental physical property). Wide colour gamut displays look brighter than ones with smaller gamuts, and gamut massively impacts perceived brightness at a given luminance. A wider colour gamut display like this will look significantly brighter at 400 nits than a standard-or-narrower gamut display at 400 nits.

Comparing the luminance of displays with vastly different spectral profiles is like comparing the clock speeds of processors with vastly different architectures (Pretty pointless from an end user perspective).

Of course, something similar could be said for the debate above, where anyone sat more than a meter away from their screen would probably think Dice is making things up, and anyone sat within 75cm would find what he says to be exactly correct. That 25cm genuinely makes a world of difference to the average 20/20 human, as from a little beyond a meter even a 1080p 27" display has little more to offer from resolution bumps.
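
To put rough numbers on that distance point, here's a quick Python sketch. It assumes the usual rule of thumb that 20/20 vision resolves roughly 60 pixels per degree; the panel and distances are just the ones discussed above:

from math import hypot, radians, tan

def pixels_per_degree(width_px, height_px, diagonal_in, distance_m):
    # How many pixels fit into one degree of visual angle at this viewing distance.
    ppi = hypot(width_px, height_px) / diagonal_in
    distance_in = distance_m / 0.0254
    return ppi * distance_in * tan(radians(1))

for dist in (1.0, 0.75):
    ppd = pixels_per_degree(2560, 1440, 32, dist)
    print(f'32" 1440p at {dist:.2f} m: {ppd:.0f} pixels per degree')

# ~63 PPD at 1 m (just above the ~60 PPD 20/20 rule of thumb),
# ~47 PPD at 0.75 m (well below it, so individual pixels start to matter)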
 
1. A high dynamic range is just that. You can either reach that range by allowing standard-ish brightness monitors to go darker (OLEDs generally take this approach, but local dimming techniques have made it prolific on other panel types too), or by taking normal-brightness monitors and allowing them to get brighter. Technically both approaches can offer an accurate high dynamic range, as long as the environment suits them.

Yes, but this monitor is not capable of OLED-level blacks, so that's irrelevant. The contrast ratio is only stated at 3000:1; a decent HDR monitor is around 20,000:1.
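
To put that contrast gap in perspective, here's a tiny Python comparison of the two quoted ratios expressed in photographic stops (doublings of light):

from math import log2

for name, ratio in [("This monitor (stated)", 3000), ("Decent HDR monitor", 20000)]:
    # A contrast ratio of N:1 corresponds to log2(N) stops of simultaneous range.
    print(f"{name}: {ratio}:1 is about {log2(ratio):.1f} stops")

# ~11.6 stops vs ~14.3 stops - nearly three extra stops of simultaneous range.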

2. Luminance (nits) is not a measurement of perceived brightness (by which I just mean brightness, given it is inherently perceived and not a fundamental physical property). Wide colour gamut displays look brighter than ones with smaller gamuts, and gamut massively impacts perceived brightness at a given luminance. A wider colour gamut display like this will look significantly brighter at 400 nits than a standard-or-narrower gamut display at 400 nits.

Comparing the luminance of displays with vastly different spectral profiles is like comparing the clock speeds of processors with vastly different architectures (Pretty pointless from an end user perspective).

But it doesn't work like this in games. If you have a monitor with low peak-nit capability, you either have to set the paper-white nits value very low to see any dynamic range, meaning it will look darker than SDR, or you keep it high and maintain good physical brightness but experience clipping almost immediately in the high range. This basically renders the WCG useless, as it will all be tone-mapped to white before the display receives the data. A WCG doesn't necessarily increase perceived brightness either; in fact, it can have the opposite effect, as a more saturated colour is generally perceived to be darker than a desaturated one of equivalent value. You can test this by playing around with HSV colours. What a WCG will give you is a richer image.
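
As a purely illustrative sketch of that paper-white trade-off (the numbers and the hard linear clip are simplifications made up for illustration, not how any particular game engine actually tone-maps):

def displayed_nits(scene_relative, paper_white, peak):
    # Map a linear scene value (1.0 = paper white) to display luminance, clipping at the panel's peak.
    return min(scene_relative * paper_white, peak)

peak = 400                           # roughly what a DisplayHDR 400 panel sustains
scene = [0.5, 1.0, 2.0, 4.0, 8.0]    # shadows, paper white, and increasingly bright highlights

for paper_white in (100, 200, 300):
    out = [round(displayed_nits(s, paper_white, peak)) for s in scene]
    print(f"paper white {paper_white} nits -> {out}")

# paper white 100 nits -> [50, 100, 200, 400, 400]   dim overall, a little highlight headroom
# paper white 200 nits -> [100, 200, 400, 400, 400]  brighter, but highlights clip at 2x paper white
# paper white 300 nits -> [150, 300, 400, 400, 400]  SDR-like brightness, almost no headroom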

This monitor will probably work better with SDR signals where the display itself attempts to create a WCG with a fake gamut expansion at a decent brightness level.
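
If it helps, here's a crude software analogue of that kind of "fake" gamut expansion: just stretching the saturation of SDR colours. Real monitors do this in their own colour processing against the panel's native primaries, so treat this purely as a sketch of the idea:

import colorsys

def fake_gamut_expand(rgb, boost=1.3):
    # Boost the saturation of a normalised sRGB triple, clamped to [0, 1].
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    return colorsys.hsv_to_rgb(h, min(s * boost, 1.0), v)

muted_red = (0.70, 0.35, 0.35)
print(fake_gamut_expand(muted_red))   # roughly (0.7, 0.245, 0.245): a noticeably more saturated red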
 