You completely missed the point. This is not about playing 4K content on non-4K displays; it’s about HDR-to-SDR conversion. And it’s not something I demanded: it’s a feature Firecore announced on their own blog as new in 5.6. It does not work as advertised.
I’ve discovered issues with 4K HDR color tone on a 4K HDR display; I think that is more important than playing files not created for use on 1080p TVs.
You’re using a 1080p display. So, yes…this is about playing 4K content on non-4K displays. As such, this is also about playing HDR content on displays only capable of SDR.
You do realize there are many 4K screens out there that do not support HDR at all?
I’m really wondering why this thread is constantly being hijacked by folks who’d rather discuss whether the use case makes sense to them.
If the use-case does not make sense to you, please move on.
Please provide a sample of content you are using on your 1080p display which utilizes HDR, but is NOT also 4K.
Are you just not interested in reading what’s actually written in posts you’re replying to? Let me repeat:
There are many 4K screens out there that do not support HDR. For this purpose, and some more, Infuse supports HDR to SDR tone mapping (if it weren’t buggy).
Again, why do you feel the need to hijack this thread?
You are the only one suggesting there is an issue…and you don’t even have a 4K display to test the issue with. You are not only testing HDR content on a display incapable of HDR, you are testing 4K HDR content on a display incapable of both 4K and HDR. Do you have any evidence to support the idea that this feature is not working for those with non-HDR 4K displays?
Provide the brand and model of the display being used. Provide the technical data of a file being tested.
I’m testing the feature exactly as advertised by Firecore here:
“Not only can HDR videos be played in their full glory on an HDR screen, we’ve also baked in dynamic range conversion so you’ll get great color even when using older, non-HDR devices.”
And many weeks ago I sent James a sample to replicate this issue, and many weeks ago he acknowledged that tone mapping is not working as intended.
Will you now stop hijacking this thread with irrelevant posts?
Provide the brand and model of the display being used. Provide the technical data of a file being tested.
I uploaded a sample to Firecore many weeks ago; it’s from my Mad Max: Fury Road retail disc, 2160p24 HDR 10-bit BT.2020. Same problem with Blade Runner, 2160p24 HDR 10-bit BT.2020, and Star Trek Beyond, 2160p24 HDR 10-bit BT.2020. Of course, when someone claimed they did not see a problem with Blade Runner in SDR and I asked which SDR display that was, nothing came back.
TVs where tone mapping is not correct and which I have personally tested:
Sony KDL-65W855C
Samsung J6299
Panasonic TX-49ESW504S
Panasonic 55ETW60
LG 55LH604V
Of course I know I’ve wasted another 5 minutes of my life, because you’ll keep doing what you’ve done so far: derailing the thread by claiming that none of this is an issue, that none of the above proves anything to you, that the problem surely lies on my end and with the SDR displays of a wide range of manufacturers, or that this issue is of no consequence because no one on this planet would try to watch an HDR movie on a non-HDR display, despite Firecore explicitly advertising this exact scenario on their own blog.
Why am I wasting my time …
I found your problem (/sarcasm). You’re trying to test 4K HDR content on a 1080p display. The verbiage they actually used does not support the kind of activity you are seeking:
Infuse 5.6 - A fresh new look + HDR | Firecore
Not only can HDR videos be played in their full glory on an HDR screen, we’ve also baked in dynamic range conversion so you’ll get great color even when using older, non-HDR devices.
It doesn’t say anything about playing 4K HDR content on a 1080p display. You are assuming that the mention of “older, non-HDR devices” applies to your scenario.
Only you can answer that. Perhaps you should upgrade one of those displays and try again. You appear to be the only one encountering this perceived problem.
I think that refers to non-HDR but 4K-capable displays; it’s pointless to use 4K files on a 1080p display, period.
I am quite surprised that so many people on this site still confuse display resolution (e.g. 4K/UHD vs 1080p/HD) with color rendering (e.g. HDR/10-bit vs SDR/10-bit/8-bit).
Those two aspects are totally different and unlinked, despite what vendors and manufacturers may say. One could perfectly well have a 1080p HDR display, as long as the internal processing and the panel operate at 10 bits. For example, EIZO manufactured 1920x1200 photo displays with 10-bit internal processing and 10-bit panels years before HDR and 4K appeared (the issue back then was that most video cards only output an 8-bit signal, but that is another subject).
So you are not crazy; I too am waiting for Infuse to correctly handle HDR-to-SDR tone mapping, so I can enjoy better color mapping on my 10-bit-but-only-HD video projector.
As already mentioned, there is no technical link between HDR/SDR and UHD/HD, only a commercial one.
If Firecore states that they provide HDR-to-SDR tone mapping, it is irrelevant to tie that feature to UHD vs. HD.
Thus tone mapping should work exactly the same on a UHD display as on an HD display, as long as the internal processing is performed at 10 bits (see the sketch below).
I’m not even sure that all UHD panels sold as “HDR” capable can actually do 10 bits…
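To illustrate why the operation is resolution-independent, here is a minimal per-pixel sketch in Python. It decodes the PQ (SMPTE ST 2084) signal to absolute luminance, compresses it with a simple Reinhard-style curve, and re-encodes for an SDR display. The Reinhard curve and the `sdr_white_nits` value are illustrative assumptions; Infuse’s actual tone-mapping operator is not public.

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """Decode a non-linear PQ signal (0..1) to absolute luminance in nits."""
    p = np.power(np.clip(e, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

def tone_map_to_sdr(e, sdr_white_nits=100.0):
    """HDR -> SDR: PQ decode, Reinhard-style roll-off, gamma-2.4 re-encode.
    The curve and sdr_white_nits are illustrative, not Infuse's actual operator."""
    l = pq_to_nits(e) / sdr_white_nits   # luminance relative to SDR reference white
    l_sdr = l / (1.0 + l)                # simple Reinhard compression into 0..1
    return np.power(l_sdr, 1.0 / 2.4)    # approximate BT.1886 display encoding

# The math is purely per-pixel, so it behaves identically on UHD and HD frames:
print(tone_map_to_sdr(np.random.rand(2160, 3840)).shape)
print(tone_map_to_sdr(np.random.rand(1080, 1920)).shape)
```

Nothing in there depends on frame size, which is the whole point: the display’s resolution is irrelevant to the dynamic-range conversion.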
Again, no, simply because you’ll then benefit from the higher bitrate of the UHD file compressed with HEVC compared to the HD one. Both are lossy compression algorithms, but at an equal file size, HEVC will be far better than MPEG-4.
For the same file size you’ll get better quality using HEVC, even when downscaling to HD (downscaling being far better and easier than upscaling); the result should therefore beat an average or low-quality HD encoding.
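To put rough numbers on the “same file size” claim, here is a back-of-the-envelope calculation. The 8 GB / two-hour film is a made-up example, and the 2x HEVC-vs-MPEG-4 efficiency factor is a commonly cited rule of thumb, not a measured value:

```python
# Hypothetical two-hour film in an 8 GB file (illustrative numbers only).
file_size_gb = 8
duration_s = 2 * 3600

bitrate_mbps = file_size_gb * 8 * 1000 / duration_s  # ~8.9 Mbit/s
mpeg4_equiv_mbps = bitrate_mbps * 2                  # rule-of-thumb 2x factor

print(f"HEVC bitrate:                      {bitrate_mbps:.1f} Mbit/s")
print(f"MPEG-4 needed for similar quality: ~{mpeg4_equiv_mbps:.1f} Mbit/s")
```

In other words, at equal size the HEVC encode carries roughly the quality an MPEG-4 encode would need twice the bitrate to match, and downscaling preserves most of that advantage.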
Absolutely no one is confusing resolution and color rendering. No one. There are zero 1080p displays capable of HDR. Zero. There are zero 10-bit 1080p consumer television displays. Zero. HDR and 10-bit are tied together. The person in question (the ONLY person complaining about this) is attempting to play a file that is entirely incompatible with his display in more ways than one. Period. End of story.
…with the displays he’s using (he listed them), he won’t be able to distinguish any gain in quality.
Just perform a quick search for “10-bit 1080p panel” and you’ll see that this assertion is incorrect.
Some of those panels existed long before HDR. Not cheap nor accessible to everyone at the time, but still.
Those panels would benefit from correct HDR-to-SDR mapping, because they are simply not capable of full HDR due to their low peak brightness in nits (and probably color gamut too, but that is another problem).
In one direction only: HDR needs 10 bits.
With 10 bits you can do HDR or SDR or HDR-to-SDR tone mapping.
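A quick way to see what those 10 bits buy you is a minimal sketch comparing how many distinct levels a smooth gradient gets at each bit depth:

```python
import numpy as np

# A smooth 0..1 ramp quantized at 8 and at 10 bits.
ramp = np.linspace(0.0, 1.0, 100000)
levels_8 = np.unique(np.round(ramp * 255)).size    # 256 distinct steps
levels_10 = np.unique(np.round(ramp * 1023)).size  # 1024 distinct steps
print(levels_8, levels_10)  # -> 256 1024
```

Four times as many steps is the headroom that lets the same 10-bit pipeline carry either HDR or a cleanly tone-mapped SDR image with less visible banding.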
That is another problem…
…and if you do the same search you’ll see that those are specialized computer monitors. Once again, I will attempt to make this simple. The person in question is trying to watch a 4K HDR file on an incompatible 1080p display. Absolutely nothing about any updates to Infuse has made this possible, nor was it ever suggested. The complaint lodged by that person is completely invalid. He is NOT using a 10-bit television display. He is NOT using a 4K display. He is NOT using a display capable of HDR.
After a 5-minute search: Sony KDL-40W3000.
10-bit processing / 10-bit panel.
Released in 2008, and not the only one, either from Sony or from other manufacturers.
So perhaps his TVs are not adequate, but the purpose of this thread is the capability Firecore offers: displaying UHD/HDR properly on HD or UHD SDR devices after tone mapping (so, by construction, on a non-HDR-capable display, whether it is an HD or a UHD device).
Unfortunately, I think we have reached an irrational point here, so I’ll leave it at that.
The fact is, I understand what he expects, and I think it is as relevant as any other specific request, given that this is something Firecore itself offers.
HDR ≠ 10-bit. The Apple TV outputs just 8-bit for Dolby Vision, for example, and it is still an HDR image. BT.2020 vs. Rec.709 is the question here.
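To make the BT.2020-vs-Rec.709 point concrete, here is a minimal sketch of just the gamut-conversion step, using the standard linear-light BT.2020 → BT.709 primaries matrix. The hard clip at the end is a naive stand-in for proper gamut mapping, which a real converter would handle more gracefully:

```python
import numpy as np

# Standard linear-light BT.2020 -> BT.709 primaries conversion matrix.
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def bt2020_to_bt709(rgb_linear):
    """Convert linear BT.2020 RGB (..., 3) to linear BT.709 RGB.
    Hard-clipping out-of-gamut values is a naive simplification."""
    return np.clip(rgb_linear @ BT2020_TO_BT709.T, 0.0, 1.0)

# A fully saturated BT.2020 red lies outside BT.709 and gets clipped:
print(bt2020_to_bt709(np.array([1.0, 0.0, 0.0])))
```

Note that this matrix cares about neither bit depth nor resolution; the colorimetry is a separate axis from both.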
This is wrong. This HDR feature came with 5.6, and for me it is working very nicely. Perhaps not every color is 100% accurate, but it’s better than without the feature.
Firecore description: “Not only can HDR videos be played in their full glory on an HDR screen, we’ve also baked in dynamic range conversion so you’ll get great color even when using older, non-HDR devices.”