AMD is NOT limiting colour depth on HDMI 2.0

Contrary to what some online publications may be telling you, AMD is NOT limiting HDR colour depth when using HDMI 2.0; the HDMI standard itself is the limiting factor.



Read more on HDR and the HDMI standard.
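For anyone who wants to sanity-check the numbers, here is a rough back-of-the-envelope calculation. It is only a sketch: it counts active pixel data (real HDMI timings add blanking overhead on top), and it takes HDMI 2.0's usable rate as its 18 Gb/s link rate after 8b/10b encoding, roughly 14.4 Gb/s.

Code:
# Rough check: which 4K60 formats fit in HDMI 2.0's usable bandwidth?
# Active pixel data only -- real timings add blanking overhead on top of this.
HDMI_2_0_DATA_RATE = 18.0e9 * 8 / 10   # 18 Gb/s link rate, 8b/10b encoding -> ~14.4 Gb/s

def video_data_rate(width, height, refresh_hz, bit_depth, chroma="4:4:4"):
    """Bits per second of active pixel data for a given video mode."""
    # Chroma subsampling reduces the average number of samples per pixel.
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * refresh_hz * bit_depth * samples_per_pixel

for depth, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (12, "4:2:2")]:
    rate = video_data_rate(3840, 2160, 60, depth, chroma)
    verdict = "fits" if rate <= HDMI_2_0_DATA_RATE else "does NOT fit"
    print(f"4K60 {depth}-bit {chroma}: {rate / 1e9:5.1f} Gb/s -> {verdict}")

10-bit 4:4:4 at 4K60 already exceeds the link even before blanking overhead is added, which is the limit the article is talking about; dropping to 4:2:2 chroma brings HDR10 and Dolby Vision back under it on any GPU.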
 
Didn't realise we had HDR-capable monitors out yet?

HDR is a very strange standard; right now it is only really supported on high-end £1000+ TVs.

Right now the HDR10 standard is 10-bit colour with 4:2:2 chroma sub-sampling, and Dolby Vision is 12-bit colour with 4:2:2 chroma.

The main thing for HDR is high-end panels which can take full advantage of the extra colour gamut. This is especially true with modern OLED displays and Samsung's new quantum dot displays.
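For scale, the jump in bit depth is bigger than it sounds; a quick bit of arithmetic (three channels per pixel) shows how many colours each depth can represent:

Code:
# Distinct colours for a given per-channel bit depth (3 channels per pixel).
for depth in (8, 10, 12):
    levels = 2 ** depth
    print(f"{depth}-bit: {levels} levels per channel, {levels ** 3:,} colours")

That works out to roughly 16.7 million colours at 8-bit, 1.07 billion at 10-bit and 68.7 billion at 12-bit.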
 
Yeah, I really don't know why people even care about HDMI. Why haven't we moved on to DP? It's been faster for years, already offers HDR and Adaptive Sync, and can still carry audio. There's not much more that 99% of users will need. Adapters work natively, so if everyone moved on it wouldn't take that long to switch, and we would get a larger feature set that companies can take advantage of. Considering how TV companies like to market the next big thing, you'd really wonder why they haven't raced to implement DP.
 
Give me the choice to buy a 4K 144Hz IPS 32" HDR Freesync monitor and I'll be all over that like white on rice.
 
Give me the choice to buy a 4K 144Hz IPS 32" HDR Freesync monitor and I'll be all over that like white on rice.

You'd better be prepared to wait for that, as DP 1.4 won't even cover it.


DP 1.3 should be enough for 100-120ish Hz 4K, with DP 1.4 being enough for 4K 96Hz with HDR according to AMD.



Thankfully DP 1.4 should be faster to implement than 1.3, as it uses a visually lossless compression method to almost double the usable bandwidth.

DP 1.4 can be added to existing hardware through software/firmware updates, which is why Polaris and Pascal GPUs are said to already support it.


The main problem with HDR is that it needs some super high-end displays to support it properly. Any low-end TV that supports it will likely only support it partially (deeper blacks but no enhanced colour, or a backlight that isn't good enough).
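A rough sketch of where those DP figures come from (active pixel data only, assuming HBR3's 32.4 Gb/s raw rate and 8b/10b encoding; blanking overhead eats a further chunk of the margin, which is roughly why AMD quotes ~96Hz rather than 120Hz for 10-bit HDR):

Code:
# Rough 4K data rates against DisplayPort HBR3 (the link rate used by DP 1.3/1.4).
# HBR3: 8.1 Gb/s per lane x 4 lanes = 32.4 Gb/s raw, ~25.9 Gb/s after 8b/10b encoding.
DP13_PAYLOAD = 32.4e9 * 8 / 10

def active_rate(width, height, refresh_hz, bit_depth, samples_per_pixel=3.0):
    return width * height * refresh_hz * bit_depth * samples_per_pixel

for refresh_hz, depth in [(120, 8), (96, 10), (120, 10), (144, 10)]:
    rate = active_rate(3840, 2160, refresh_hz, depth)
    verdict = "fits" if rate <= DP13_PAYLOAD else "needs DSC (DP 1.4) or lower settings"
    print(f"4K {refresh_hz}Hz {depth}-bit 4:4:4: {rate / 1e9:5.1f} Gb/s -> {verdict}")

The DSC compression used in DP 1.4 is specified as "visually lossless" and is commonly quoted at up to roughly 3:1, so "almost double the bandwidth" is, if anything, a conservative way of putting it.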
 
I thought it was always known that HDMI 2.0 didn't have the bandwidth for 10-bit 4:4:4 at 4K 60Hz?

You have to use 8-bit, or compromise on either 60Hz, 4:4:4 or the resolution.
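That's the trade-off exactly. A quick sketch of the compromise options (same caveats as before: active pixels only, roughly 14.4 Gb/s usable on HDMI 2.0):

Code:
# The compromises HDMI 2.0 (~14.4 Gb/s usable) forces for 10-bit colour at 4K.
HDMI_LIMIT = 14.4e9

modes = [
    ("4K60 10-bit 4:4:4 (what you actually want)", 3840, 2160, 60, 10, 3.0),
    ("4K60  8-bit 4:4:4 (drop the bit depth)",     3840, 2160, 60,  8, 3.0),
    ("4K60 10-bit 4:2:2 (drop the chroma)",        3840, 2160, 60, 10, 2.0),
    ("4K30 10-bit 4:4:4 (drop the refresh rate)",  3840, 2160, 30, 10, 3.0),
    ("1440p60 10-bit 4:4:4 (drop the resolution)", 2560, 1440, 60, 10, 3.0),
]
for name, w, h, hz, depth, samples in modes:
    rate = w * h * hz * depth * samples
    print(f"{name}: {rate / 1e9:5.1f} Gb/s -> {'fits' if rate <= HDMI_LIMIT else 'too much'}")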
 
You'd better be prepared to wait for that, as DP 1.4 won't even cover it.


DP 1.3 should be enough for 100-120ish Hz 4K, with DP 1.4 being enough for 4K 96Hz with HDR according to AMD.



Thankfully DP 1.4 should be faster to implement than 1.3, as it uses a visually lossless compression method to almost double the usable bandwidth.

DP 1.4 can be added to existing hardware through software/firmware updates, which is why Polaris and Pascal GPUs are said to already support it.


The main problem with HDR is that it needs some super high-end displays to support it properly. Any low-end TV that supports it will likely only support it partially (deeper blacks but no enhanced colour, or a backlight that isn't good enough).

Ultrawide IPS at 144Hz with HDR and Freesync wouldn't be bad ^_^
 
Ultrawide IPS at 144Hz with HDR and Freesync wouldn't be bad ^_^

3440x1440 still wouldn't hit 144Hz with HDR, so you're gonna have to be patient still :D
It might hit 120Hz IIRC, which would still be awesome. 120Hz is kinda forgotten for some reason; it's still more than enough.
 
The problem with HDR is that we need displays that are accurate enough to show that kind of colour, which is one of the reasons why it is so expensive.
 
The bigger issue is content supporting it. Yeah, sure, it's supported in games, but if no one on the TV/commercial provider side supports it, that really hurts the advancement. Games are pretty much the only things supporting it right now; not much else is, so it's going to take a long time. Which means prices are going to stay high for a long time too.
 
Honestly, having seen HDR in the flesh I don't give two monkeys about it. A 3440x1440 IPS 144Hz Freesync monitor with nice clean looks is on my dream wishlist though :)
 
The bigger issue is content supporting it. Yeah, sure, it's supported in games, but if no one on the TV/commercial provider side supports it, that really hurts the advancement. Games are pretty much the only things supporting it right now; not much else is, so it's going to take a long time. Which means prices are going to stay high for a long time too.

Yeah, on 4K Blu-ray it is only supported on some titles. Then there is the whole HDR10 vs Dolby Vision thing.

It is also true that a TV can be advertised as HDR without actually supporting HDR10, which is a whole flustercluck. The standard needs time to mature, with higher-quality panels becoming mainstream, a new HDMI version that supports full chroma at 4K 60Hz in 10 or 12-bit, and a range of HDR-ready content.
 
Honestly, having seen HDR in the flesh I don't give two monkeys about it. A 3440x1440 IPS 144Hz Freesync monitor with nice clean looks is on my dream wishlist though :)

You probably haven't seen a true HDR panel. There are very, very few of them.

Yeah, on 4K Blu-ray it is only supported on some titles. Then there is the whole HDR10 vs Dolby Vision thing.

It is also true that a TV can be advertised as HDR without actually supporting HDR10, which is a whole flustercluck. The standard needs time to mature, with higher-quality panels becoming mainstream, a new HDMI version that supports full chroma at 4K 60Hz in 10 or 12-bit, and a range of HDR-ready content.

There needs to be one standard. Dolby needs to gtfo and stop slowing down technology. Adoption rates are so slow for this reason. Every company fights tooth and nail. There is a reason we have international standards for things. But still there's always that one company..
 
"We will need to wait for HDMI 2.1 or another future standard before we will get to play HDR content on at 4K with a full chroma colour sampling"

It's already here: DisplayPort 1.3.
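Quick sanity check on that (active pixels only, same rough method as above): 4K60 10-bit with full 4:4:4 chroma is well inside what DP 1.3 can carry.

Code:
# 4K60, 10-bit, full 4:4:4 chroma vs DP 1.3's usable bandwidth (HBR3 after 8b/10b).
needed = 3840 * 2160 * 60 * 10 * 3     # ~14.9 Gb/s of active pixel data
dp13_payload = 32.4e9 * 8 / 10         # ~25.9 Gb/s
print(f"{needed / 1e9:.1f} Gb/s needed vs {dp13_payload / 1e9:.1f} Gb/s available -> fits")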
 
We all know that consumers would be better off without HDMI, with just DP instead. But will it happen...

Monitor makers need to drop VGA (on the cheaper monitors) & DVI for a start and just keep HDMI and DP, even though I like DP way more for a computer.

I look at it as DP = computers & HDMI = TVs.
 
Monitor makers need to drop VGA (on the cheaper monitors) & DVI for a start and just keep HDMI and DP, even though I like DP way more for a computer.

I look at it as DP = computers & HDMI = TVs.

Why on the cheaper monitors? Surely the whole point of a cheaper product is that it's cheaper to use? From a business perspective, VGA is going to be around forever, as everything supports it or can use it. You don't have all these multiple versions of it, and you can convert any signal to output to it easily.

Get rid of it from more expensive screens/monitors though.
 