insider9 said:
It's great to quote specs and measurements, but could you tell me how jitter of 50 picoseconds sounds? How much of an impact does it have on sound quality? What if it was only 5 picoseconds? What differences would you hear? Secondly, how would it show up in acoustic measurements?
Until we understand how specific measurements translate to what we hear, it's probably safe to say they're about as informative as the values of the capacitors in the power supply.
Going back to my analogy where the waveform is plotted out on a piece of graph paper, with the Y values being the sample values and the X values assumed to be equal increments (1, 2, 3, 4, 5 and so on), it is possible to calculate how much an X value has to be in error to be equivalent to the Y value being in error by 1/(65536*2), in other words the resolution of the analogue output being reduced from 16 bits to 15 bits. This seems a good place to start for the point at which jitter matters.
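To put a rough number on that, here is a back-of-envelope slew-rate estimate in Python. It assumes a full-scale sine and its worst-case (steepest) slope, so it's a sketch rather than the formal derivation:

import math

def worst_case_jitter(freq_hz, bits=16, error_lsb=0.5):
    # Timing error (seconds) whose worst-case amplitude error on a
    # full-scale sine equals error_lsb least-significant bits.
    amplitude = 1.0                                 # full-scale sine, peak = A
    lsb = 2.0 * amplitude / 2**bits                 # one LSB over the 2A peak-to-peak range
    max_slope = 2 * math.pi * freq_hz * amplitude   # steepest slope of A*sin(2*pi*f*t)
    return error_lsb * lsb / max_slope              # error ~ slope * tau, so tau = error / slope

for f in (10_000, 20_000):
    print(f"{f/1000:.0f} kHz: {worst_case_jitter(f)*1e12:.0f} ps for half an LSB at 16 bits")

That comes out at a couple of hundred picoseconds at 10 kHz and roughly half that at 20 kHz; the exact figures depend on which error criterion and signal level you assume.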
Unfortunately there is more to it than that. Jitter is a random process, and the distribution of the jitter seems to matter. I will try to find the link to the study, but entirely random jitter is far less detectable than jitter that is linked to the signal in some way. S/PDIF in particular can be subject to 'code-correlated jitter', where the density of 1s vs 0s in the data bitstream makes the recovered clock edges land slightly early or late. The jitter is then linked to the music and apparently much easier for a listener to detect.
As for measurement, jitter is tricky to measure directly without some very fancy gear, so most approaches measure the effects of jitter rather than the jitter itself. Jitter behaves a bit like modulation, in that the instantaneous frequency of the jittered signal is shifted by it. So if you play a pure sine wave of a given frequency through a DAC and look at the output on a spectrum analyser, any frequency that shows up in the output that isn't the original sine wave is the result of jitter (amongst other things).
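If you want to see the effect without the fancy gear, here is a minimal Python simulation sketch. The sample rate, test tone and jitter values are assumed, and the jitter is deliberately huge so the effect is easy to spot: a tone is sampled with a randomly jittered clock and with a signal-linked wobble, and the spectrum is inspected. The random case mostly raises the noise floor, while the correlated case puts discrete sidebands around the tone, which is why it stands out on an analyser and, apparently, to the ear.

import numpy as np

fs = 192_000          # sample rate (Hz), assumed
f0 = 10_000           # test tone (Hz)
n = 1 << 16           # number of samples / FFT length
t = np.arange(n) / fs # ideal sample instants

rng = np.random.default_rng(0)
jitter_rms = 10e-9    # 10 ns RMS, far bigger than any real DAC, purely for illustration

random_jitter = rng.normal(0.0, jitter_rms, n)
correlated_jitter = jitter_rms * np.sqrt(2) * np.sin(2 * np.pi * 1_000 * t)  # 1 kHz signal-linked wobble

freqs = np.fft.rfftfreq(n, 1 / fs)
for name, dt in (("random", random_jitter), ("correlated", correlated_jitter)):
    x = np.sin(2 * np.pi * f0 * (t + dt))                        # tone sampled at the wrong instants
    db = 20 * np.log10(np.abs(np.fft.rfft(x * np.hanning(n))) / n + 1e-20)  # dB re. arbitrary reference
    mask = np.abs(freqs - f0) > 200                              # ignore the tone and its window skirt
    print(f"{name:10s}: worst spur {db[mask].max():6.1f} dB at {freqs[mask][np.argmax(db[mask])]:7.0f} Hz")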
If you do the maths, then about 100 ps of clock error at 10 kHz, or 20 ps of clock error at 20 kHz, is equivalent to half an LSB at 16-bit resolution.
I didn't pay the fee, but apparently this paper has the derivations:
http://www.aes.org/e-lib/browse.cfm?elib=6111
This is the point at which jitter begins to affect the overall resolution of the system; that is not the same as the point at which it becomes detectable. Other studies suggest 250 ns is the threshold of detection, but as I mentioned above, it depends on what type of jitter you are talking about. It also depends on which jitter numbers you use - it's a distribution, so are you quoting peak, RMS, average.....
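As a footnote on that last point, here is a toy Python example of why the quoted number depends on the statistic. It assumes a Gaussian clock-error distribution with a 50 ps RMS (insider9's figure), purely for illustration:

import numpy as np

rng = np.random.default_rng(1)
jitter = rng.normal(0.0, 50e-12, 1_000_000)   # 50 ps RMS Gaussian clock error, assumed

rms = np.sqrt(np.mean(jitter**2))
mean_abs = np.mean(np.abs(jitter))            # 'average' jitter, about 0.8x RMS for a Gaussian
peak = np.max(np.abs(jitter))                 # observed peak, grows with record length

print(f"RMS {rms*1e12:.0f} ps, mean {mean_abs*1e12:.0f} ps, peak {peak*1e12:.0f} ps")

The same clock gives you roughly 50 ps, 40 ps or a few hundred ps depending on which of those you quote.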