Apple TV 4K – Why does 5.1 CH on HDMI audio extractor trigger Lossless playback?


Nov 29, 2021
With the help of you kind people on this forum, I now have Apple Music streaming with my Apple TV 4K using an HDMI Audio Extractor to get an optical audio feed into a DAC before playing through my analogue amplifier and speakers.

1. I knew that the Apple TV up-samples all digital audio to 48 kHz, but I was puzzled that the Music app didn’t show the Lossless icon on a song until I switched the HDMI audio extractor from 2 CH to 5.1 CH. Please can someone help me understand why that might be?
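As an aside on the 48 kHz point: CD-quality material is 44.1 kHz, so converting it to 48 kHz means interpolating by a rational factor rather than passing samples through bit-for-bit, even when the source file itself is lossless. A minimal sketch of that ratio (the numbers are just the two standard sample rates):

```python
from math import gcd

# 44.1 kHz (CD) source converted to the Apple TV's fixed 48 kHz output.
src, dst = 44_100, 48_000

# Reduce 48000/44100 to lowest terms: the converter must interpolate by
# `up` and decimate by `down`, so every output sample is computed, not copied.
g = gcd(src, dst)
up, down = dst // g, src // g
print(f"resample ratio: interpolate by {up}, decimate by {down}")  # 160 / 147
```

Because 160/147 is not an integer ratio, none of the original 44.1 kHz samples survive unchanged, which is why a 48 kHz output can carry a lossless *source* without itself being bit-perfect.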

2. There seems to be a lot of controversy over ‘bit perfect’ streaming and up-sampling. I know my own ears should be the judge, but does the up-sampling improve or degrade the audio? I ask because I remember that the Arcam CD192 CD player I have (but no longer use) boasted about its up-sampling capability, from the manual:

'The CD192 incorporates the latest audio up-sampling technology to convert the 44.1kHz data from the CD up to 192kHz in 24-bit precision using the Analog Devices AD1896 sample rate converter.

The 192kHz audio data is converted into the analogue domain using four Wolfson Micro WM8740 24-bit stereo DACs. Left and right output channels are made using two stereo DAC chips per channel, both operating in dual mono configuration per channel. The resulting four mono signals (two for each channel) are summed together to help improve dynamic range, linearity and therefore distortion.

Due to the higher sample rates, the digital filters roll off less aggressively than for standard DACs, and the analogue filters have a wider bandwidth resulting in less phase distortion, particularly for the higher frequencies.'

I don't really understand the last sentence, but obviously they thought up-sampling was a good thing back then.
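The dual-mono summing the manual describes can at least be sketched numerically: two converters carry the same signal but contribute independent noise, so averaging their outputs keeps the signal at full level while the uncorrelated noise only grows by √2, lowering the noise floor by about 3 dB. The sample rate, tone frequency and noise level below are made-up illustration values, not CD192 measurements:

```python
import math
import random

random.seed(0)
N = 100_000

# A clean 440 Hz tone at a 48 kHz sample rate (illustrative values).
signal = [math.sin(2 * math.pi * 440 * n / 48_000) for n in range(N)]

def dac_output(sig, noise_rms=0.01):
    # Each "DAC" adds its own independent Gaussian noise to the signal.
    return [s + random.gauss(0, noise_rms) for s in sig]

dac_a = dac_output(signal)
dac_b = dac_output(signal)
# Sum the two mono outputs (averaged here to keep the signal at unit level).
summed = [(a + b) / 2 for a, b in zip(dac_a, dac_b)]

def noise_rms(out):
    # RMS of the residual after subtracting the known clean signal.
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(out, signal)) / N)

gain_db = 20 * math.log10(noise_rms(dac_a) / noise_rms(summed))
print(f"noise floor improvement from summing two DACs: {gain_db:.2f} dB")
```

The printed figure comes out close to the theoretical 20·log₁₀(√2) ≈ 3.01 dB, which is the "improve dynamic range" benefit the manual is claiming for its paired DAC chips.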

