HDR not working right

Ok. I’ll have a check tonight and will report back.

Taking pictures is perhaps a good idea, so I shot the same scenes as robnix for possible comparison.

The photos were taken with an iPhone X; the TV is a Sony KD75X9405C. As said before, Plex & Infuse switch to the HDR setting automatically, MrMC does not.

Description (see the picture names):

  • Plex behaves the same as the “Computers” app from Apple, so I used that
  • Pausing in Plex makes the picture a little darker because of the overlay, so I took one without pausing too
  • MrMC doesn’t switch my TV to HDR, so there is one with the normal movie setting and one where I manually forced the TV into its HDR setting

Feedback please :slight_smile:

To me, your screenshots from Plex without pause and from MrMC with the SDR setting look the same.
The Infuse screenshot clearly shows that the HDR data isn’t displayed correctly.
The MrMC with HDR setting looks very weird to me :open_mouth:

On my side:
In Infuse, there is no HDR whatever the setting (forced SDR or HDR). I don’t use Plex, so I cannot test that.
In MrMC: I have to manually enable HDR in the TV settings, whatever the Apple TV setting (forced HDR or SDR). But when set manually on the TV, HDR works as expected.

Edit: Infuse is clearly forcing HDR-to-SDR conversion whatever the settings on the TV or Apple TV. It’s like it detects that the TV is HDR capable, because it does switch my TV into HDR mode, but the picture itself gets converted to SDR. Forcing the TV into HDR10 mode (instead of auto mode) doesn’t change anything.

Just to confirm: I have just checked, and MrMC is switching to HDR without any problems for me.

If I play an HDR file, I can see on the info page of my Denon AVR that it switches correctly to BT.2020 YCbCr 24p. What I can’t see is whether the HDR metadata is transferred too, but the TV says HDR is on, so I would say it is correct.

What is wrong is the Apple TV’s content match, because it automatically uses 4:2:2 chroma instead of 4:2:0, even if I don’t run the speed test. The info page shows “— > —” for color depth, where 10-bit would be correct. I have to admit that I don’t see a difference, so I don’t care, but this is something the Apple TV is doing wrong. :wink:

Unfortunately, 4:2:0 at 4K 24p is not part of the standard HDMI spec. So the Apple TV switching to 4:2:2 when frame-rate matching is on is expected and correct.
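
For anyone curious about the arithmetic behind that, here is a rough TMDS-bandwidth sketch. The 600 MHz HDMI 2.0 limit and the CTA-861 pixel clocks are standard figures; the packing model is deliberately simplified, so treat this as an illustration rather than a spec quote:

```python
# Rough TMDS bandwidth sketch for HDMI 2.0 (illustrative, simplified).
# HDMI 2.0 tops out at a 600 MHz TMDS character rate (18 Gbps over 3 lanes).
HDMI20_MAX_TMDS_MHZ = 600

def tmds_character_rate_mhz(pixel_clock_mhz, bits_per_component, chroma):
    """Approximate TMDS character rate for a given video format.

    4:2:2 on HDMI is packed into 24 bits/pixel up to 12-bit depth, so it
    runs at the plain 8-bit character rate. 4:2:0 halves the rate, but the
    spec only defines it for 2160p50/60-class timings.
    """
    if chroma == "4:2:2":
        factor = 1.0                            # packed like 8-bit RGB
    elif chroma == "4:2:0":
        factor = (bits_per_component / 8) * 0.5  # half the chroma samples
    else:                                        # "4:4:4" or RGB
        factor = bits_per_component / 8
    return pixel_clock_mhz * factor

# CTA-861 pixel clocks: 2160p24 = 297 MHz, 2160p60 = 594 MHz.
print(tmds_character_rate_mhz(297, 10, "4:2:2"))  # 297.0  -> 24p fits easily
print(tmds_character_rate_mhz(594, 10, "4:4:4"))  # 742.5  -> over the limit
print(tmds_character_rate_mhz(594, 10, "4:2:0"))  # 371.25 -> why 4:2:0 exists
```

In short, 2160p24 fits comfortably at 4:2:2 with 10-bit, while 4:2:0 only earns its keep at the 50/60 Hz timings the spec defines it for.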

You think 24p is “the problem”? Well, I am just wondering why the Denon can’t show the color depth. I don’t have the problem if I play UHDs (also 24p) from my Samsung player: it shows 10-bit in normal mode and 12-bit as enhanced color. I don’t know what Apple is doing here that just “— > —” comes up. If I force the Apple TV desktop (no movie playing) to HDR at 60 Hz with 4:2:0 chroma, it shows the correct 10-bit. If I change to 4:2:2 chroma, it’s “— > —” again.

I will ask Denon what this means :wink:

Quick update.

We’re tracking a few potential leads which could help us resolve some issues with HDR that are present with certain video and panel combos.

More news to come soon.

Thank you again James, I know we all appreciate your efforts towards this!

Just to add my part.

Here I have an HDR video that plays:

  • perfectly fine on my (terrible) Philips Android TV’s media player,
  • perfectly fine on the Apple TV 4K using iTunes homesharing and the “Computers” app (the player has terrible performance they do not seem to care about, and skips frames, but the images and their colors are perfectly good),
  • terribly on Infuse (I tried the original 5.6 release as well as the latest 5.6.4 beta): the dark parts of the picture are overblown. I also tried switching the Apple TV’s video settings to SDR, or to HDR with 4:2:0, but the image was still wrong.

I have uploaded a 1 minute .m4v sample called DHDRXE.m4v.
If you look between 0’10" and 0’20", you can see a lot of bad artifacts in the dark areas. At 0’18" for example, the boy’s hair is totally wrong with Infuse (and all right with iTunes homesharing on the same Apple TV 4K, or with my TV’s native player).

Hope this helps find what’s wrong!

Another quick update.

We’ve done an enormous amount of testing in the past week, and feel we’ve isolated the few cases where this can happen. This doesn’t seem to be totally specific to Infuse, as a few similar reports with Netflix have also been floating around.

Nevertheless this is something we feel we can improve, and a number of fixes and improvements are currently being implemented. We hope to have these available in 5.6.5 (5.6.4 is already pending review at Apple), and more news will be available soon.

Rest assured, we’ll ultimately get this resolved so HDR works great no matter what video you are playing or what screen you are viewing it on.

Thanks for your patience and understanding.

Thanks James for your update!

The sample I sent, from the video which has issues in Infuse, has pretty high maximum light levels.
So my wild guess would be that Infuse does not take these high levels correctly into account.
And it is a good thing that the Apple TV 4K’s native player handles it well.

Keep up the excellent work!

Here’s the file HDR10 metadata from MediaInfo
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : R: x=0.680000 y=0.320000, G: x=0.265000 y=0.690000, B: x=0.150000 y=0.060000, White point: x=0.312700 y=0.329000
Mastering display luminance : min: 0.0001 cd/m2, max: 4000.0000 cd/m2
Maximum Content Light Level : 4000 cd/m2
Maximum Frame-Average Light Level : 1000 cd/m2
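
Those mastering values sit near the very top of the PQ curve. As a hedged illustration (the constants come from SMPTE ST 2084; the interpretation is my own), here is the PQ inverse EOTF showing where those luminance levels land. A MaxCLL of 4000 cd/m² encodes to roughly 90% of the signal range, so any display that cannot actually reach 4000 nits has to tone map, and different tone-mapping choices will produce visibly different pictures:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in cd/m2 -> [0, 1] signal.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (0..10000 cd/m2) to a PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

for nits in (100, 1000, 4000, 10000):
    code10 = round(pq_encode(nits) * 1023)   # nearest full-range 10-bit code
    print(f"{nits:>5} nits -> PQ {pq_encode(nits):.4f} (10-bit ~{code10})")
```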

James, thanks for all the hard work and the constant updates!

Can someone please correct me if I am wrong?

My understanding is that, if you play an HDR10 video to an HDR10 TV, all the player has to do (I am in no way implying this is easy!) is:

  • to decompress the video (which is deterministic: there is only one way to do it right),
  • to pass the video data and the HDR10 metadata to the TV, which is equally deterministic.

So, if the TV’s settings are the same, the players should produce exactly the same picture on one given TV, whatever its brand and model are.
In particular, if playing from the same Apple TV (therefore to the same TV and through the same HDMI input, with the same TV video settings), Infuse and the Apple TV’s native media player should produce the exact same images.

Therefore, if Infuse and the native Apple TV’s media player play the same video differently, one or the other is doing something wrong with the video data or metadata.
I understand that this could be more or less visible depending on the content, TV model and settings.

So, since one HDR10 video plays well on the Apple TV 4K to an HDR10 TV using the native player, but differently (and badly) with Infuse,
am I wrong in thinking that this implies Infuse is in some way altering the video data and/or metadata, when it should just pass them through to the TV?
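
To illustrate that point with a toy example (my own numbers, not Infuse’s actual pipeline): any intermediate PQ-to-SDR step necessarily rewrites the code values, so two players can only produce bit-identical pictures if both pass the HDR10 stream through untouched. A minimal sketch, assuming a deliberately naive clip-at-100-nits conversion:

```python
# Toy demonstration: converting PQ samples to SDR gamma is lossy, so a player
# that inserts such a step cannot match a passthrough player bit-for-bit.
# PQ constants from SMPTE ST 2084; the "SDR conversion" is a deliberately
# naive clip at 100 cd/m2 with a 2.4 gamma, purely for illustration.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_decode(signal: float) -> float:
    """PQ signal in [0, 1] -> absolute luminance in cd/m2."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

def naive_sdr_convert(signal: float) -> float:
    """Hypothetical lossy step: clip to 100 cd/m2, re-encode with 2.4 gamma."""
    nits = min(pq_decode(signal), 100.0)
    return (nits / 100.0) ** (1 / 2.4)

original = [i / 1023 for i in range(0, 1024, 64)]     # sample 10-bit PQ codes
converted = [naive_sdr_convert(s) for s in original]

# Every sample except pure black has moved: the streams can no longer match.
changed = sum(1 for a, b in zip(original, converted) if abs(a - b) > 1e-6)
print(f"{changed} of {len(original)} sample values altered by the conversion")
```

Of course a real converter would be smarter than a hard clip, but the conclusion is the same: once any conversion stage is in the path, exact agreement with a passthrough player is impossible.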

Thanks in advance. Just trying to help :slight_smile:

I was evaluating Infuse 5 Pro on my Apple TV 4K. I am especially interested in the playback of H.265 HDR content. When playing back a UHD Blu-ray remux (I own the UHD Blu-ray as well as the iTunes version), I noticed that the highlights were noticeably dull compared to the actual UHD Blu-ray or the iTunes version played back on the same device. The same remux played on an NVIDIA Shield using Plex looked exactly like the source and the iTunes version. I also checked other HDR titles, and they all show the same issue of dull HDR highlights. I checked whether the Apple TV switches to HDR mode (match content set to on), and it does, but somehow it seems Infuse is clipping or tone mapping the image. Which is not cool, or should at least be an option to opt out of.

Yeah, that’s pretty much what I’m seeing. HDR definitely seems ‘reined in’ compared to playback on MrMC.

Same here. The image often seems a lot dimmer than when I play it using the TV’s media player.

MrMC isn’t sending the HDR data correctly either, I checked again this morning after I saw there was an update.

It seems to be ok for me. Maybe we have different issues?

Have you compared them to the same image using a UHD player?