I found a strange picture quality problem when using the Apple TV 4K with Infuse. There seems to be some sort of image processing in the Apple TV that causes weird artifacts.
It is visible for example in special test patterns (I am using the patterns from “Diversified”). Attached picture 1 shows the test pattern correctly displayed via a Blu-ray player (ts file via network). Pictures 2 and 3 show the same pattern as displayed by the Apple TV 4K. Note the dark stripes in the red, green and blue color gradients.
The MrMC player generates the same artifacts. It seems to be an internal processing problem that is independent of the software player.
What can this be? Is it possible for the Infuse developers to turn off this internal processing?
I tested it and get the same strange results with MrMC and Infuse.
As it is an HDR file, it could have something to do with the HDR flag!
The reason I think so: playing this file with Plex on the ATV 4K results in non-HDR playback, and it is fault-free!
It plays in Cinema Home mode on my TV, and even after manually switching the TV to HDR the fault doesn’t appear!
@james: perhaps it helps you a little bit by finding the solution for infuse.
HDR watching is like a banana… it ripens at the customer’s end.
I’m also seeing this on an LG C7 / B7. Using the TV’s own player when playing the file from a USB stick does not show the stripes in the RGB bars, only when using Infuse. This might be the same issue as in this thread: HDR not working right.
Hopefully it will get addressed in the upcoming version 5.6.7, which has HDR-related fixes. I will test it again when it’s released. Thanks for providing the sample file!
I observe the same thing here, but something else is off too. Looking at the brightness pattern on my C7, it shows information well below reference black, although this doesn’t seem to translate to actual content. One interesting thing: when viewing the contrast pattern there is no clipping at all. Detail is shown up to code 940, which corresponds to 10,000 nits; that is a waste, since no content gets above 4,000 nits. But considering the awkward behavior of the brightness pattern, I am not sure whether we can rely on these patterns at all. If we can, there is a ton of work ahead for the Firecore team. But they are looking into it, so I hope that will translate into a working solution.
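For anyone double-checking those numbers: the mapping from 10-bit narrow-range code values to nits is defined by the SMPTE ST 2084 (PQ) EOTF, and code 940 does indeed sit at the 10,000-nit peak. A minimal sketch (plain Python; the constants come straight from the ST 2084 definition):

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF, to show what the 10-bit
# narrow-range code values in the test patterns correspond to.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_code_to_nits(code: int) -> float:
    """Map a 10-bit narrow-range (64..940) code value to luminance in nits."""
    n = max(0.0, (code - 64) / 876)         # normalize to a 0..1 signal
    p = n ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000 * y                        # PQ nominal peak is 10,000 nits

print(round(pq_code_to_nits(940)))  # 10000 -> nominal PQ peak
print(pq_code_to_nits(64))          # 0.0   -> reference black
```

Running it also shows the curve passing 4,000 nits around code 856, which is roughly where the brightest-graded content tops out today.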
I don’t think it is an issue with Infuse, more with the tvOS codecs/drivers. I tested it with Infuse and Kodi, and on my Bravia TV. With the TV’s internal player it’s just fine. With Infuse I get the same noise as the OP, while with Kodi it looks fine, but Kodi doesn’t switch the TV to HDR (I use Range Match). With the ATV 4K switched to always output 4K HDR, Kodi shows the noise too.
Yeah, I belong to AppleSeed, so I have just opened a bug report. I also added the 24 fps demo video from another thread that has the bars flashing instead of playing smoothly. Hopefully it will get fixed soon…
This is a very serious problem and is all in the hands of Apple.
The Vegas scenes in Blade Runner 2049 look just awful and have very severe banding (see attached screenshot).
I am talking about Blade Runner 2049 in 4K HDR10, purchased on the iTunes Store!
And I can confirm that the test pattern looks awful on the Apple TV with any app you can use, including the Apple-developed “Computers” app.
Here is the test pattern remuxed so that you can add it to iTunes and then play it on the Apple TV 4K using Home Sharing: Dropbox - Error
Yes, the “Computers” app skips frames. I submitted a bug report to Apple about that several months ago, but they do not seem to care. Still, you can see that the rendering is all wrong.
I have no idea whether this is specific to HDR10 or also happens with Dolby Vision video.
I confirm this. I’ve seen awful banding, like in your picture from Blade Runner 2049, in a Norwegian TV series too. The scene showed a lit-up night sky. It should have been beautiful, but it looked weird. This is just not good enough. Not at all.
No it is not.
The media from the UHD Blu-ray have the same issues when played on the Apple TV with the “Computers” app, Infuse or MrMC.
They play without issues on my Android TV’s media player, or in Kodi on the same TV.
I made another test, using Netflix, which is available in 4K HDR on my Android TV, Apple TV 4K and PS4 Pro.
I used episode 1 of Godless, around 32’25", where there is a blue sky.
I used the exact same image settings, and connected the Apple TV 4K and PS4 Pro to the same HDMI input of the TV. (I also tried changing every single TV image setting but that did not eliminate the issue)
The image with the TV’s media player is perfect, the one from the PS4 Pro has some light banding, the one from the Apple TV 4K has terrible banding.
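As a rough intuition for why banding severity differs between devices: it tracks how many quantization levels a gradient survives with; fewer levels mean wider, more visible bands. A toy illustration (plain Python, hypothetical bit depths, nothing to do with the actual pipelines of these devices):

```python
# Count how many distinct output levels a smooth 0..1 ramp keeps after
# quantization to a given bit depth; fewer levels = coarser, more
# visible banding. Purely illustrative.
def visible_steps(bit_depth: int, samples: int = 1000) -> int:
    levels = 2 ** bit_depth
    return len({round(i / (samples - 1) * (levels - 1)) for i in range(samples)})

print(visible_steps(10))  # all 1000 samples stay distinct: smooth gradient
print(visible_steps(6))   # only 64 levels left: clearly visible bands
```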
The Apple TV 4K is set to output 2160p HDR10, with frame rate and dynamic range matching (so in that case it is 2160p24 HDR10, since my TV supports HDR10 but not Dolby Vision), with 4:2:2 chroma subsampling (using 4:2:0 chroma subsampling did not make any improvement).
With the Apple TV 4K set to output 1080p HDR, the image is perfect. It also appears very correct in 2160p SDR.
@Bigbertha which are your Apple TV’s output settings? I am about to send you my test file in PM.
I found a temporary workaround for the severe banding (but the test patterns still do not look right).
After resetting video settings on the Apple TV, it works, and I no longer have severe banding, be it the outdoor Vegas scene of BR2049 or the Netflix Godless episode I previously tested.
The default video settings for the Apple TV 4K after a reset are, for my TV, 2160p60 HDR10 with 4:2:0 chroma subsampling.
If I just choose 4:2:2 chroma subsampling (which seems to work and is supposed to be supported by my TV), the severe banding is back. And it stays even if I switch back to 4:2:0 chroma subsampling! That is obviously a bug on the Apple TV’s side. Luckily it goes away if I reset the video settings.
If I choose 2160p24 (or p25 or p30) HDR (or if I let frame rate matching switch to 2160p24), the severe banding is back. But if I switch back to 2160p60 or 2160p50 it goes away.
Luckily my TV can play 2160p24 content at 60 Hz without any stutter.
My wild guess is that at 2160p24, p25 or p30 HDR10, the Apple TV tries to push 12-bit video (because it can do so without chroma subsampling, according to my understanding of the HDMI 2.0 standard https://www.hdmi.org/manufacturer/hdmi_2_0/hdmi_2_0_faq.aspx#146 ) and that some TVs handle that badly.
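That guess is at least consistent with the raw bandwidth numbers. A back-of-the-envelope sketch (plain Python; the 18 Gbit/s link rate and 8b/10b TMDS coding are the usual HDMI 2.0 figures, and the pixel rates include blanking):

```python
# Rough check of the bandwidth argument above. HDMI 2.0 carries at most
# 18 Gbit/s raw; with 8b/10b TMDS coding that leaves ~14.4 Gbit/s of
# video payload. Pixel rates: 2160p24 ~ 297 Mpx/s, 2160p60 ~ 594 Mpx/s.
MAX_PAYLOAD_GBPS = 18 * 8 / 10  # 14.4

def fits_hdmi20(mpixels_per_s: float, bits_per_pixel: int) -> bool:
    """True if the mode's video payload fits in HDMI 2.0 (sketch, not spec)."""
    return mpixels_per_s * bits_per_pixel / 1000 <= MAX_PAYLOAD_GBPS

print(fits_hdmi20(297, 36))  # 2160p24 RGB 12-bit:    True  (~10.7 Gbit/s)
print(fits_hdmi20(594, 36))  # 2160p60 RGB 12-bit:    False (~21.4 Gbit/s)
print(fits_hdmi20(594, 15))  # 2160p60 4:2:0 10-bit:  True  (~8.9 Gbit/s)
```

So at 24/25/30 Hz the link has headroom for 12-bit RGB, while at 60 Hz it does not, which would match the behavior described above.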
I read in a review that mine had issues with UHD Blu-Ray players outputting 12-bit video: severe banding.