Apple TV Settings for UHD HDR Movies

Hello everyone!

I’m just wondering which settings are best for watching UHD HDR10 content via Infuse.

Should I select YCbCr or RGB, high or low? And which chroma setting is right: 4:2:0 or 4:4:4?

Of course I want the best picture quality and HDR presentation, but I’m confused by the settings. I’ve got an LG B7, by the way.

YCbCr 4:2:0 is what movies are encoded in. Personally, I use 4K SDR YCbCr 4:2:2 or 4:4:4 (with both match settings ON) on my B7; this “may” help with some gradations, but it is not needed. Use 4:2:0 as the base. If you want to try 4:2:2 or 4:4:4 there is no harm, but if you start to notice anomalies, just go back to 4:2:0.
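
In case it helps to picture what the chroma setting actually changes, here is a toy Python sketch (my own illustration, not how any player actually implements it). 4:2:0 keeps one chroma sample per 2x2 block of pixels, while 4:4:4 keeps chroma at full resolution; luma is untouched either way.

```python
import numpy as np

def subsample_420(cb: np.ndarray, cr: np.ndarray):
    """Toy 4:2:0 chroma subsampling: keep one chroma sample per
    2x2 pixel block by averaging. The luma plane is left alone."""
    def pool(plane):
        h, w = plane.shape
        return plane.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return pool(cb), pool(cr)

# Chroma planes of a tiny 4x4 "frame" (made-up values).
cb = np.arange(16, dtype=float).reshape(4, 4)
cr = cb[::-1]

cb_420, cr_420 = subsample_420(cb, cr)
print(cb_420.shape)  # (2, 2): only a quarter of the chroma samples remain
```

Since the eye is much less sensitive to colour resolution than to brightness, the 4:2:0 that movies are encoded in is normally transparent, which is why going above it is optional.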

Thank you!

But strangely, I can only choose between 4:2:0 and 4:4:4. There is no 4:2:2?

And I run the Apple TV in 4K SDR. Is that okay?


You should choose an HDR format, which is only available if your TV is HDR compatible.
The possible chroma settings are different for each resolution and for HDR vs. SDR.

You shouldn’t really keep 4K HDR as the default setting! If you have HDR on all the time, it can wear out your panel over time. I have mine set to 4K 60fps SDR, chroma 4:4:4, with match dynamic range on and match content on.

With both match settings turned on, HDR10 content is upscaled to 12-bit video when the chroma setting is 4:2:0 or 4:2:2. Can someone confirm this? When I play an HDR10 movie (UHD remux), the picture shows some banding, unless I reset the video settings and choose match dynamic range only, with the Apple TV in 60 Hz SDR.

Match settings are not really the point here. They will only make sure that, for example, if you play a 24p HDR video, the Apple TV’s output switches to 24p HDR, regardless of the format you have selected in the Apple TV’s video settings.

If you leave match settings off and select the format manually, you get the exact same result.
Since all movies and almost all TV shows are in 24p, when you watch one that is HDR with match settings on, the Apple TV will switch to 24p and HDR. (I’ll just assume you are using 4K as the resolution.)

And yes, at the moment, the Apple TV 4K always outputs 4K 24p HDR in 12 bit, be it with 4:2:0 or 4:2:2 chroma subsampling. This does not make sense to me, since they could use 4:4:4 in 10 or 12 bit within the same bandwidth range, so with zero compatibility issues and slightly better quality.
And 12-bit video creates issues for some TVs and receivers, with zero quality benefit for HDR10, which is encoded in only 10 bit anyway.
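
To put rough numbers on that bandwidth claim, here is a back-of-envelope sketch of my own (it ignores HDMI blanking intervals and encoding overhead, so real link rates are higher, but the comparison between the formats holds):

```python
# Active video data rate at 4K 24p for various pixel formats.
WIDTH, HEIGHT, FPS = 3840, 2160, 24

def data_rate_gbps(bit_depth, samples_per_pixel):
    bits = WIDTH * HEIGHT * FPS * bit_depth * samples_per_pixel
    return bits / 1e9

# Samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
for name, spp in [("4:4:4", 3), ("4:2:2", 2), ("4:2:0", 1.5)]:
    for depth in (10, 12):
        print(f"{name} {depth:>2}-bit: {data_rate_gbps(depth, spp):5.1f} Gbit/s")
```

Even 4:4:4 at 12 bit works out to roughly 7 Gbit/s of active video, comfortably below the roughly 14.4 Gbit/s of usable bandwidth HDMI 2.0 provides, which is why a 4:4:4 10-bit option should be feasible at 24p.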

For info, I checked the Apple TV 4K’s output format with an HDFury Linker (which I no longer have).

I posted about this issue there too: HDR not working right

Is Apple aware of this issue? I am not a developer, so I can’t create a ticket for this problem. It’s annoying; maybe they could simply add a toggle for pure HDR10 output. If Panasonic and Oppo can do it, so can Apple. EDIT: I’ve sent a feedback form to Apple using the template on their website.

Is putting the ATV in SDR mode without dynamic switching advisable? In other words, is the ATV good at HDR to SDR conversion?

No, if you have an HDR TV you get a better picture by setting the Apple TV to SDR with dynamic switching enabled. The most recent tvOS beta has problems with HDR to SDR conversion.

Even when the ATV outputs 12-bit video while watching an HDR UHD remux file? I have an LG B6.

I submitted a bug report a few weeks ago, so Apple is very much aware of this.
Whereas, when the first HDR UHD Blu-ray players appeared, there was a lot of traction on this issue (with a successful petition to get 10-bit output on Panasonic UHD Blu-ray players), very few people seem to care today. I suspect the reasons are the following:

  • early adopters of HDR equipment were probably more sensitive to potential image quality issues,
  • the issue seems to only affect some 2015 and 2016 HDR TVs - so anyone with a 2017 or 2018 TV is unaffected,
  • the issue only shows up badly in some specific scenes - I have to say I only realized something was really wrong when watching Blade Runner 2049.

And it is not only the Apple TV 4K that outputs 12 bit when it could output 10 bit:

  • my PS4 Pro also outputs 12 bit in HDR, including for 4K 60p - but since I only use it for games, I am not that annoyed,
  • the supposedly infinitely customizable Nvidia Shield TV has only one option for 4K 24p HDR, and it is 12 bit.

So I hope Apple will do the right thing, which is (as far as my understanding goes) to offer a 10-bit/12-bit switch instead of the current chroma subsampling switch, which could also make the Apple TV 4K’s video settings less convoluted, but I won’t hold my breath.

For now, if you are annoyed by this, there is no clear solution (use 24p with the banding issue, or 60p and suffer from stuttering on your TV - see the quick illustration at the end of this post).
I have found a solution that works without buying a new TV, but it is not cheap and requires a bit of setup: I now have a HDFury Vertex between my Apple TV and my TV, which converts the Apple TV’s output to 10 bit. (The cheaper HDFury can do that too, but due to its software, it works for 24p or 60p, not both with the same settings.)
It seems to work well, but I have only had it since yesterday.
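
For those wondering why 60p output stutters with movies: 24 frames per second do not divide evenly into 60 refreshes, so frames are alternately held for 2 and 3 refreshes (the classic 3:2 pulldown). A quick illustration, again my own sketch:

```python
def pulldown_pattern(source_fps=24, display_hz=60, n_frames=8):
    """How many display refreshes each source frame is held for."""
    repeats, acc, shown = [], 0.0, 0
    step = display_hz / source_fps  # 2.5 refreshes per frame on average
    for _ in range(n_frames):
        acc += step
        hold = int(acc) - shown
        shown += hold
        repeats.append(hold)
    return repeats

print(pulldown_pattern())  # [2, 3, 2, 3, 2, 3, 2, 3]: uneven frame times
```

With 24p output every frame is held for the same amount of time, which is why matching the frame rate looks smoother on pans, at the cost of the banding issue discussed above.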

Thanks for this information. I don’t experience stutter issues after resetting the video settings, with only match dynamic range on in SDR 60 Hz. I can smooth out motion with the TruMotion feature on my B6, and the picture quality is clearer compared to having both match dynamic range and match frame rate on. For now I’ll just leave it like that until Apple fixes this.

Yes. Proper 10-bit to 12-bit conversion is just padding with zeros. I don’t have an issue with my C6 or another LG LED.

Absolutely. And TVs should be able to understand 12 bit and just remove the excess zeros.
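
In code terms, the padding and un-padding look like this (just a sketch of the principle, not anything Apple or LG actually ship):

```python
def pad_10_to_12(sample: int) -> int:
    """Pad a 10-bit sample to 12 bits by appending two zero LSBs.
    This is lossless: the extra bits carry no information."""
    return sample << 2

def strip_12_to_10(sample: int) -> int:
    """A display that only processes 10 bits can drop the padding."""
    return sample >> 2

v = 0b1010110011  # an arbitrary 10-bit value (691)
assert strip_12_to_10(pad_10_to_12(v)) == v  # round-trips exactly
```

Done correctly, this adds no banding whatsoever, which suggests the sets that do show banding are mishandling the 12-bit signal somewhere in their processing.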

Yet some 2015 and 2016 TVs (including mine), and some receivers, have issues with 12-bit input formats :frowning:
According to the HDMI spec, you can only use 12 bit with the better chroma subsampling options, which is why Apple does it.
You can see the HDMI 2.0 supported formats here: https://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#146 .

On a side note, it seems slightly odd that an LG B6 would have banding issues with a 12-bit input format while an LG C6 would not.
@IMWhizzle, do you have a receiver between your Apple TV and your LG B6?

Yes I do, it’s a Marantz 7011. I’ve disabled video conversion on it.

If I were you, and if you haven’t done so yet, I would try plugging the Apple TV directly into your LG B6.
Just as my TV should not have issues with 12-bit video, it is possible your Marantz receiver somehow does something it shouldn’t, even with video conversion turned off.

Yes, but if I do that I will lose the HD audio of movies, so it’s not an option. Something is also wrong with the colors when both match dynamic range and match frame rate are enabled. HDR content is much more vivid and clean (without banding) with the match dynamic range only setting in 60 Hz SDR (or SDR) with 4:2:0 chroma.

Until Apple fixes this I’m using these settings. For now the motion handling on the B6 is fine with TruMotion off and Real Cinema on.

OK, I was just suggesting a test to understand where the issue comes from, not that you permanently lose HD audio and such.
The important thing is that you are happy with your setup :slight_smile:

If the issue is with the receiver, and if Apple does not provide a proper solution, and (one day) you want to remove the limitation, there is a product from HDFury (and maybe from other companies; I have no affiliation with HDFury but have checked their products looking for a solution), the AVR Key, that splits one HDMI input into one HDMI passthrough to your TV plus one HDMI with the full audio for your receiver. So you should get HD audio plus the same video as if your Apple TV were plugged directly into your TV.

Hi guys,

I’m experiencing the same issue with an LG 55B7V. The ATV 4K works like a charm with Dolby Vision (I watched Blade Runner 2049 on the iTunes movie service), but switching to HDR10 results in horrible banding! The same file (Blade Runner 2049) is perfect when streamed with the LG built-in media player and banded when played with Infuse on the ATV 4K. Amazon Prime Video (which uses HDR10 instead of Dolby Vision) on the ATV 4K is also banded, so it’s not an Infuse bug, nor an HDMI cable issue, because as said before, iTunes movies give great picture quality!

I hope Apple is going to fix this… but in my opinion, LG should also look into this.

Is your Apple TV connected directly to the TV, or is something (like a receiver) in between?