Perfect Apple TV Sound / Video Setup

One of the big issues is setting up the Apple TV incorrectly!

The perfect settings (you can also read about these online at Apple):

Small Apple TV 4K Special: Optimal picture and sound settings

It’s time again for a little special in which I would like to briefly explain the settings on the Apple TV 4K.

For me, the Apple TV 4K is currently the best streaming device, delivering the best picture and sound quality that can be achieved with video streaming over the Internet.

It should be clear to everyone, and needs no further explanation, that physical media still offers better picture and sound.

I am a long-time user of all the Linux derivatives such as Neutrino, Enigma, TitanNit and Enigma2, as well as Android and its offshoots such as FireOS. Besides various Linux satellite receivers, I also used Amazon Fire TV hardware since Gen. 1, and from the beginning fought its problems with frame rate matching… Back then I invested a lot of time in feedback to Amazon support and explained why 24p is so important. At some point, though, you lose patience with that Android-heavy piece of software.

Whether Fire Stick or Fire TV 4K: when the Apple TV 4K came out and the Amazon Prime app FINALLY became available on Apple as well, there was no reason left to stick with the Fire TV.

Since then I have been using three Apple TV 4K units: the current Apple TV 4K (6th generation, 64 GB storage) in the basement home theater, an Apple TV 4K (5th generation, 32 GB storage) in the family living room, and another Apple TV 4K (5th generation, 64 GB) that comes along when traveling and on vacation. The travel Apple TV 4K, by the way, gets its data via a Wi-Fi hotspot on the iPhone.

Unfortunately, the Apple TV 4K has no built-in web browser, so free hotel Wi-Fi is unusable because the captive-portal login cannot be completed.

Netflix, Amazon Prime, Disney+ and Apple TV+ can all be used easily on the Apple TV 4K, with the best picture and sound quality.

During initial setup you have to pay attention to a few points, but then you have an almost perfect streaming player!

Exactly that, streaming, and nothing else, is all the device needs to do. I don’t care that Kodi forks such as MrMC or Infuse can’t output Atmos sound with a TrueHD core from a NAS. For such gimmicks I have, at least in my case, a proper home cinema HTPC with madVR (and active HSTM), Kodi (with Python support…) and MPC-BE or a DS player…


It is best to fix the resolution at 4K/50 Hz/SDR, since the menu is displayed in SDR anyway, as is all the standard Full HD material from the various providers.

Then activate automatic SDR/HDR switching and automatic frame rate matching in the settings. This lets the Apple TV 4K switch automatically between the dynamic ranges SDR, HDR10 and Dolby Vision as well as the frame rates 23.976 Hz, 24.000 Hz, 50 Hz and 60 Hz, depending on the provider’s source material.
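A quick aside on why both 23.976 Hz and 24.000 Hz appear in that list: the “fractional” film rate is an NTSC-era convention and is really the ratio 24000/1001. A minimal sketch (my own arithmetic, not from Apple’s documentation):

```python
from fractions import Fraction

# The fractional film rate that frame-rate matching switches to is
# not exactly 23.976 Hz but 24000/1001 Hz (NTSC-era convention).
film_ntsc = Fraction(24000, 1001)
print(float(film_ntsc))          # ~23.976

# 60 Hz video material is likewise often 60000/1001 (~59.94 Hz).
video_ntsc = Fraction(60000, 1001)
print(float(video_ntsc))         # ~59.94
```

This is why forcing the output to a fixed rate causes judder on mismatched material, and why automatic frame rate matching matters.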

So you usually don’t have to worry about anything else.

However, if the auto-update function is enabled, the above settings may be reset to defaults after an update is installed. This is a bug that is also known to Apple, so keep an eye on it.


The maximum sound quality when streaming, regardless of provider, is lossy compressed Dolby Digital Plus (bitstream): at most a 7.1 channel bed, possibly carrying Dolby Atmos metadata.

The Apple TV 4K decodes the DD+ bitstream and then converts everything internally to 7.1 linear PCM. Contrary to what is often incorrectly reported, this conversion to PCM has no disadvantages.

Important: If Dolby Atmos is enabled in the Apple TV 4K menu, the sound output format changes in the case of “DD+ with Atmos metadata” in the source material:

A special signal format called “Dolby MAT” is then output. The AV receiver usually displays “Dolby Atmos / PCM” or similar.

Attention: “Dolby MAT” does not contain a 7.1 channel bed, so no upmixer such as DTS Neural:X or Auro-Matic can be engaged! Upmixers need a proper channel bed to apply their algorithms to; with Dolby MAT this is impossible, because such a bed simply does not exist.

So if you want to use the upmixers on Atmos sources as well, “Dolby Atmos” must be deactivated again in the menu!

By the way, Dolby Digital Plus is a lossy compressed encoded bitstream, but the Apple TV never outputs it as such; it is ALWAYS decoded internally to 7.1 LPCM, a step that usually happens only in the AV receiver. Only for Atmos in the DD+ container is Dolby MAT output.

So the AV receiver does not know in which format the sound was originally delivered, whether as DD+ or DD 5.1. The AVR always displays multichannel PCM or Atmos/PCM.


If the source material contains Dolby Digital Plus 7.1 as an audio track, the Apple TV 4K decodes it internally to multichannel PCM 7.1.

If the source is available in Dolby Digital 5.1, it is accordingly decoded to PCM 5.1.

With stereo sound or Dolby Digital 2.0, accordingly to PCM 2.0.

So how many channels end up in the multichannel PCM output always depends on the source material.

But there is one exception:

If instead you let the Apple TV 4K convert the sound format because you have activated Dolby Digital 5.1 output, it first decodes and converts to PCM and then, in a second step, re-encodes back to Dolby Digital 5.1!

Native source material in 7.1 Dolby Digital Plus (including encoded Atmos) would thus be converted to 7.1 LPCM and then re-encoded to Dolby Digital 5.1, losing the two rear surround channels; the Atmos height information is discarded as well.

The resulting DD 5.1 signal then has little to do with the original signal. This double conversion is therefore the worst possible option.
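The channel loss in that forced-conversion chain can be traced with a toy model. This is purely my own illustrative sketch of the channel counts, not Apple’s actual pipeline; the function names are hypothetical:

```python
# Hypothetical model: trace how many discrete channels survive each
# step when "Dolby Digital 5.1" output is forced on the Apple TV 4K.

def decode_to_lpcm(source_channels: int) -> int:
    """Decoding the DD+ bitstream to LPCM keeps the channel count."""
    return source_channels

def reencode_dd51(lpcm_channels: int) -> int:
    """Re-encoding to Dolby Digital 5.1 caps the bed at 6 channels
    (5.1); Atmos height metadata is dropped entirely."""
    return min(lpcm_channels, 6)

source = 8                      # DD+ 7.1 bed = 8 discrete channels
lpcm = decode_to_lpcm(source)   # step 1: decode, nothing lost yet
forced = reencode_dd51(lpcm)    # step 2: re-encode, 2 channels gone
print(lpcm, forced)             # 8 6
```

With the default multichannel PCM output, only step 1 happens and all 8 channels reach the AVR, which is why forcing DD 5.1 is the worst choice.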


On the TV (or AVR), the “HDMI signal format” should be changed from the default “Standard” to “Optimized”!

The HDMI signal format sets the possible data bandwidth for the HDMI inputs.

“Standard” corresponds to 10 Gbit/s, which limits the input to a maximum of 4K/24p/HDR!

Only with the “Optimized” setting does the input get the full 18 Gbit/s bandwidth, which also enables playback and processing of signal types such as 4K/BT.2020/60p/4:2:2/HDR 10-bit and 4K/BT.2020/60p/4:4:4 8-bit.
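A back-of-the-envelope calculation (my own arithmetic, not from the post) shows why 4K/60p/4:4:4 8-bit only fits in the 18 Gbit/s mode: HDMI transmits the full raster including blanking (4400 × 2250 pixels for a 3840 × 2160 image), and TMDS 8b/10b coding adds 25 % line overhead.

```python
# Why 4K/60p/4:4:4 8-bit needs the full 18 Gbit/s HDMI 2.0 bandwidth.
pixel_clock_hz = 4400 * 2250 * 60   # full raster incl. blanking = 594 MHz
tmds_channels = 3                   # three TMDS data lanes (R/G/B or Y/Cb/Cr)
bits_per_symbol = 10                # 8 data bits -> 10 line bits (8b/10b)

total_gbit_s = pixel_clock_hz * tmds_channels * bits_per_symbol / 1e9
print(round(total_gbit_s, 2))       # 17.82 -> exceeds 10 Gbit/s "Standard"
```

At 17.82 Gbit/s the signal is well above the 10 Gbit/s “Standard” limit, so without “Optimized” the input falls back to 4K/24p-class formats.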

Video Settings advice

Petition: True lossless Dolby Atmos support on the Apple TV 4K

There is a ton of info and many other ways to support the requests for improvements in a currently running thread here.

I moved your post here because it fits into the Apple device troubleshooting help.

The other thread was for a specific problem that has been marked as solved and that thread will be locked after a period.

For Europe

For the United States (and other countries that use 60 Hz for AC mains), 60 Hz should be used.


It makes no difference if it is set to 50 Hz. If a movie is broadcast in, e.g., 60 Hz and your TV supports 60 Hz, the ATV switches your TV to the corresponding signal (the Apple link also notes that this only works if your TV supports it).

Switching isn’t seamless, and absolutely nothing uses 50 Hz in North America… Argentina, Bolivia and a few Caribbean islands (British possessions) are the only exceptions in all of the Americas. As such, there’s no reason for most users in the western hemisphere to set this to anything but 60 Hz.

It was a valid point worth making to the billion or so of us living in ‘the new world’.