Image quality issue (compared to Shield TV)

Hi!

I’ve been using a Shield TV for more than a year now, which has had its fair share of issues with colour space conversion and horrible upscaling.
Though some of it is better now (colour space conversion) or fixed (resolution switching), I just bought an Apple TV 4K because of a few other annoyances with the Shield.

I’ve been trying out Infuse and MrMC, but both have the same issue where red/blue colors don’t look as vivid as on the Shield.
I’m using the same settings (4K 4:2:0), with Colour Space set to “Auto” on the TV. The attached screenshots (Blade Runner 2049) are from MrMC, but it looks the same in Infuse too.
When I set Colour Space to “Native” the colors look closer to the Shield, but way more oversaturated, so I don’t think that’s correct.
I tried 1080p on the ATV and disabled resolution switching on the Shield; the differences are always visible.

Any ideas which player is actually correct, or what could be different? (I have a feeling that the Shield oversaturates colors…)
This only happens with my 1080p Blu-ray rip sources, ranging from 10-30 GB.
4K HDR movies look the same in all players (Kodi, MrMC, Infuse).

TV: Samsung KS7000, same settings on both sources.
Shield TV: Shield Experience 7.0.2 (4K, 4:2:0)
Apple TV: tvOS 12 beta (4K, 4:2:0)

Screenshots: 1_atv_4k.jpg (Apple TV, 4K), 1_shield_1080p.jpg (Shield, 1080p)

Did some more testing with the TV’s native Plex player (screenshots from Infuse and native Plex on “Auto” colour space attached).
Rec. 2020, 4K, “Auto” for Kodi on the Shield looks like “Native” does for the other two players.
I changed the colour space to Rec. 709, 1080p on the Shield, set the TV to “Auto”, and Kodi then looked just like the TV’s native Plex on “Auto”.

So, there are two possible answers:

  • Native Plex and the Shield in Rec. 709 are correct, the Shield’s 709 → 2020 mapping is wrong (though not badly so), and the Apple TV is doing something wrong with 709 sources.
  • Infuse’s 709 handling is correct, the Shield’s 709 → 2020 mapping is wrong (though not badly so), and native Plex and the Shield in Rec. 709 are wrong.

Even the calibration color bars in the settings look washed out. Any ideas?

Screenshots: auto_atv_0.jpg (Infuse on “Auto”), auto_plex_0.jpg (native Plex on “Auto”)

It’s a little hard to compare color output between apps and devices because of the numerous factors at play, but we are pretty confident in the color accuracy provided by Infuse. FWIW, the Shield wouldn’t be the first device we’ve seen to oversaturate colors (intentionally or otherwise).

There is some discussion about YCbCr having issues on the Apple TV in 11.3 and later, which would affect all apps. Some feel switching to RGB High provides a better result, and more info on this can be found here.

I haven’t seen any difference between YCbCr and RGB. However, I’ve tested various other sources (YouTube, photos) on various devices (iPad 2018, Sony XZ1 Compact, IPS PC monitor), and the color saturation on all of them resembled the Apple TV the most. So I guess the answer is the second one: the Samsung TV oversaturates everything by default in Rec. 709.

(Though shouldn’t the Apple TV have the same output if it switches the colour space to 709 properly, as I’ve read it does?)

Thanks for the answer!

YCbCr on the ATV has problems with near blacks up to 5%, and it adds a green tint along the whole greyscale. It’s not something you see easily, but it is evident in dark skin tones.
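
To make the green-tint mechanism concrete, here’s a toy numpy sketch (the chroma offset is a made-up value purely for illustration, not a measured ATV figure). With the BT.709 YCbCr conversion, even a tiny uniform negative offset on Cb and Cr lowers R’ and B’ while raising G’ at every grey step:

```python
import numpy as np

# BT.709 YCbCr -> R'G'B' conversion, normalized signals with Cb/Cr
# centered at 0. The -0.005 chroma offset below is a made-up value,
# purely to illustrate the mechanism, not a measured ATV figure.
def ycbcr709_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return np.stack([np.broadcast_to(r, np.shape(y)),
                     np.broadcast_to(g, np.shape(y)),
                     np.broadcast_to(b, np.shape(y))], axis=-1)

grey_ramp = np.linspace(0.0, 1.0, 11)               # neutral greyscale (Y' only)
neutral = ycbcr709_to_rgb(grey_ramp, 0.0, 0.0)
tinted = ycbcr709_to_rgb(grey_ramp, -0.005, -0.005)

# R' and B' drop while G' rises at every grey step: a uniform green tint
print(tinted[5] - neutral[5])                        # approx. [-0.0079  0.0033 -0.0093]
```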

The output of the Shield is definitively wrong. Compare the area surrounding Gosling’s character in the center: the Shield oversaturates reds to the point where the ground is reduced to a couple of red hues, while on the ATV you can see smooth gradients.

Also, please be aware that you’re likely viewing the screenshots in this thread on a computer display. I am looking at them on an i1-calibrated, true 8-bit display with 10-bit FRC. If you were to look at them on a cheap laptop with a 6-bit panel, the default Windows ICC profile, and default GPU settings, the smooth gradients on the ATV might not be as visible, and the difference in fine detail there might not be visible at all beyond the Shield version looking more vivid.

ATV 4K sends out “fabricated” HDR10 metadata for content played in Infuse and other similar software. It replaces the HDR parameters present in the original content with a generic HDR InfoFrame (P3, D65, max/min mastering display luminance: 1000/0.005 nits, MaxFALL/MaxCLL: 4000/1000 nits). The Shield outputs metadata that matches the content (except for MaxFALL/MaxCLL, which the Shield doesn’t include in the HDR InfoFrame). If you have a display that adjusts its tonemapping based on luminance metadata in the InfoFrame, you WILL see a difference between the outputs of the ATV 4K and the Shield.
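
For reference, here’s a rough sketch of the static fields involved (the field names are mine; the values mirror the generic InfoFrame described above):

```python
from dataclasses import dataclass

# Sketch of the static HDR10 fields carried in the CTA-861.3 Dynamic
# Range and Mastering InfoFrame. Field names are mine; the values
# mirror the generic InfoFrame described in the post above.
@dataclass
class Hdr10StaticMetadata:
    primaries: str              # mastering display colour primaries
    white_point: str
    max_mastering_nits: float   # max mastering display luminance
    min_mastering_nits: float   # min mastering display luminance
    max_fall_nits: int          # maximum frame-average light level
    max_cll_nits: int           # maximum content light level

# What the ATV 4K reportedly sends for Infuse and similar apps,
# regardless of what the content itself carries:
atv4k_generic = Hdr10StaticMetadata(
    primaries="P3", white_point="D65",
    max_mastering_nits=1000.0, min_mastering_nits=0.005,
    max_fall_nits=4000, max_cll_nits=1000,
)
```
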
Based on my tests reading the actual pixel values in the signal, I can say with high confidence that the Shield’s 10-bit output is nearly flawless. There are some very minor LSB errors, which are likely due to differences in rounding between the Shield and the testing instrument.

As far as BT.709 to BT.2020 gamut mapping is concerned, the Shield perfectly follows the ITU-R BT.2087 recommendation. Any difference in color between native BT.709 and mapped BT.2020 output has to be attributed to the display or its settings.
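
For anyone curious what BT.2087 actually prescribes, here is a minimal sketch of its direct-mapping path, assuming normalized signals and the 10-bit OETF constants:

```python
import numpy as np

# Minimal sketch of the BT.2087 direct-mapping path: decode BT.709
# R'G'B' to linear light, convert primaries with the fixed 3x3 matrix,
# then re-encode with the BT.2020 OETF (10-bit constants, identical
# in form to BT.709).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def oetf(l):         # BT.709 / BT.2020 (10-bit) opto-electronic transfer
    return np.where(l < 0.018, 4.5 * l, 1.099 * np.power(l, 0.45) - 0.099)

def oetf_inverse(v):
    return np.where(v < 0.081, v / 4.5, np.power((v + 0.099) / 1.099, 1 / 0.45))

def bt709_to_bt2020(rgb_prime):
    linear = oetf_inverse(np.asarray(rgb_prime, dtype=float))
    return oetf(linear @ M_709_TO_2020.T)

print(bt709_to_bt2020([1.0, 0.0, 0.0]))  # pure 709 red, expressed in 2020
```
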

Which UHD titles have you used to verify this? I’m asking because the HDR metadata that’s present in many titles is not actually correct, i.e. during mastering they just set some average values, or use a heuristic to derive them, e.g. by sampling every 10th frame or similar.

Also, many UHD players and TV sets ignore those values anyway and instead choose to “optimize” PQ on the fly, for better or worse, acknowledging that the master isn’t perfect anyway, so there’s no harm in applying their own heuristic by analyzing the actual picture content instead of relying on HDR metadata. IIRC the very first UHD discs had generic HDR metadata embedded, so they wouldn’t look “good” on a high-end TV, which I guess was the primary driver of manufacturers choosing to ignore the values.

My comments are based on test patterns; I wasn’t making inferences from UHD titles. HDR10 metadata is static. The only two values that come from file analysis are MaxFALL and MaxCLL; the rest are based on the display used for grading. Not much can be done about titles that have “bad” HDR10 metadata. You can only apply your own artistic intent to such titles and make them look “good”.
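
For clarity, this is roughly how those two values are derived from the picture itself (a sketch per CTA-861.3; `frames_nits` is an assumed iterable of frames already decoded to linear nits):

```python
import numpy as np

# Sketch of how the two content-derived HDR10 values are computed
# (per CTA-861.3). `frames_nits` is an assumed iterable of (H, W, 3)
# arrays already decoded to linear light in nits.
def max_cll_fall(frames_nits):
    max_cll = max_fall = 0.0
    for frame in frames_nits:
        max_rgb = frame.max(axis=-1)             # per-pixel max(R, G, B)
        max_cll = max(max_cll, float(max_rgb.max()))    # brightest pixel anywhere
        max_fall = max(max_fall, float(max_rgb.mean())) # brightest frame average
    return max_cll, max_fall
```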

Did you verify the HDR metadata sent from the ATV with an HDFury device?

Yes, but that’s only one of several test instruments I have used. I personally own several field HDMI analyzers and also have access to industry reference-grade HDMI analyzers and waveform monitors.

Here is a little bit more info regarding the HDR metadata issue weks05 mentioned.

Did you check the PQ of ATV+Infuse? If so, what say you?

Does the ATV also send the generic values when playing back iTunes 4K titles? If so, I’d figure that Apple is very likely not re-encoding all titles (from the masters they surely have access to) to these generic values, and that TV sets are very likely ignoring the values anyway.

It sends the actual content metadata with iTunes, Netflix, Amazon Video, etc. With Netflix, it’s even dynamic. The values are different from those seen on retail discs. Many of the 2018 models seem to take these parameters into account for tonemapping.

Not recently, but I have made some comparisons in the past. I have also commented on the YCbCr issue that mrrobotoplus mentioned. Except for the 2%/98% black/white clipping, the output is fairly bit-accurate. Chroma upsampling and picture upscaling also looked fine to me.

I just took a quick reading of the ATV 4K UI; picture attached.

Look at the green on YCbCr. Also, other calibrators on AVS Forum came up with the same results.

FYI, we’ve managed to resolve issues with HDR & clipping for the upcoming 5.8.2 version. :wink:

That’d imply that there are public APIs that let you set HDR metadata.

Did you just test it for yourself or did you post some pictures somewhere?

It sure does. Maybe only triple-A titles have access to those APIs, and others will come later?