Optical Cable or Digital Coax

NJB

New member
Nov 28, 2008
I know that people love cable topics (:wall:), so here is a question for all you experts.

I want to connect my iPod dock to my DAC. I have a choice of fibre optic cable or digital coax. The distance between the components is measured in fractions of a meter. Clearly, the signal that I wish to transfer is digital and so I just have to get a stream of '0' and '1' from one place to another. Any errors will cause the DAC to apply error correction and thus 'make up' the music based upon its algorithms.

With digital coax I might have signal losses at the connectors and in the cable. These are less of a problem (provided that the DAC can still distinguish between the '0' and '1') than any VSWR issues, which could lead to bitstream errors and error correction in the DAC. On a short cable the propagation losses do not really need to be factored in, but the effect of external interference does, and so effective screening is important.

With fibre optic I have to convert the signal to a light stream and decode it at the other end. This introduces a new process which will have its own error correction circuitry, but the VSWR risk is much reduced. Again, propagation losses are trivial, and this time interference is unlikely.

So, I suspect that a reasonable quality coax will provide a more accurate data stream at the DAC as the optical conversion process carries the most risk of creating errors. Any views?
 

cheeseboy

New member
Jul 17, 2012
NJB said:
The distance between the components is measured in fractions of a meter.

Given that statement, I'd be surprised if you got any errors through either choice, short of a faulty cable.

Why not pick up a cheapy optical and coax cable and see if you hear anything, then take it from there? Without being in your house and listening to your system, I feel it's a bit of a shot in the dark to say one or the other. To be honest, I'd be surprised if you heard any difference between them.
 

the record spot

Guest
You're making a problem where none exists in reality. Buy Fisual off the web and plug them in. If the cable is a duffer, the signal won't pass. Otherwise the issue is pretty academic, and not an audible one.
 

andyjm

New member
Jul 20, 2012
NJB said:
So, I suspect that a reasonable quality coax will provide a more accurate data stream at the DAC as the optical conversion process carries the most risk of creating errors. Any views?

In answer to your questions.

S/PDIF does not have error correction, but does have error detection. As far as I am aware, most DACs just ignore frames with incorrect parity.
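
To make that concrete, here is a minimal sketch of the parity rule in Python (my own illustrative names, assuming the usual subframe layout in which time slots 4-31 carry even parity):

def parity_ok(subframe_bits):
    # subframe_bits: 28 ints (0/1) covering slots 4-31 of a subframe -
    # 24 audio/aux bits, validity, user, channel status, and the parity
    # bit itself. The sender sets the parity bit so the group holds an
    # even number of ones; any single flipped bit breaks that.
    return sum(subframe_bits) % 2 == 0

def receive(frames):
    # Frames that fail parity are simply dropped, not repaired - there
    # is no redundancy from which to rebuild the original sample.
    return [f for f in frames if parity_ok(f)]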

Over a short distance, both Coax and Optical will be 100% error free.

The light conversion process for Optical does not have error correction or detection.

With correctly terminated Coax, VSWR issues are negligible.

Having said all that, if you want to choose between them, the arguments are:

Optical provides galvanic isolation, but is more likely to introduce jitter than Coax.

Coax may carry ground-plane noise through the screen into the DAC, but has better jitter characteristics than Optical.

You pays your money, and you takes your choice.
 

NJB

New member
Nov 28, 2008
So, I have spent some time today running back-to-back comparisons. The back story, just so you don't think that I am a complete cheapskate, is that I bought an external DAC but have no proper cables. I have a real cheapo optical cable which cost £3.99 off the web some years ago, and a digital coax that was £5 from the local electrical shop. The afternoon was interesting. Both cables sounded far too bright at the start, but the hi-fi was cold and it takes a while to mellow. From then on, it was a no-brainer. The optical link is far and away the better connection. There is more bass, better balance and more detail. I spent the time trying to believe that they sounded the same, as my brain says they ought to be identical, but they are clearly not. How much is due to the cheap cables, and how much to the optical and coax circuits in the iPod dock and DAC, remains a mystery.

I have just ordered some better connections. I am in Switzerland, and the Swiss do not seem quite as hi-fi-centric as the UK, so I have ended up going for some German Inakustik cables that are still a rarity in the UK but popular over here. They have to be better than the ones that I have, and I may need to repeat my comparison once I have a better coax link.
 

unsleepable

New member
Dec 25, 2013
It would be great if you posted the result after trying a better coaxial cable. I would have thought that the differences are due to the electrical and optical S/PDIF implementations.

What devices are you connecting?
 

NJB

New member
Nov 28, 2008
I am connecting a Pure i20 iPod Dock to a Beresford Bushmaster 2 DAC.

Unsleepable, your comments match my thoughts. It is probably more to do with the electronics at either end than with the cables. However, that is a touchy subject on this forum, because the world is split down the middle on whether digital cable performance matters at all. What I can say, stating the obvious really, is that the optical solution converts the digital stream to light and back, which gives two more chances to introduce errors. Also, if the DAC does not try to reconstruct faulty data and just ignores it, as has been mentioned earlier in this thread, then timing errors etc. become a problem, and the fidelity of the bitstream from the i20 dock is critical.
 

Thompsonuxb

New member
Feb 19, 2012
You know, it's a pity you cannot go up in budget on both cables, coax and optical. I say this having tried both. My cheapo optical cable came free with an Xbox 360 Elite; my preference is coax, which gives more body to the sound through my amp's DAC. The coax cables I tried ranged from cheapo freebies up to £50, and the best sounding cost £25.

Anyhoo, let us know how you get on.
 

the record spot

Guest
Said it before, I'll say it again: Fisual. Great build, cheap as chips, excellent performance. I've got a QED optical cable that was about £20. You couldn't tell them apart. Seriously guys, wake up... :O
 

unsleepable

New member
Dec 25, 2013
NJB said:
... if the DAC does not try to reconstruct faulty data and just ignores it, as has been mentioned earlier in this thread, then timing errors etc. become a problem, and the fidelity of the bitstream from the i20 dock is critical.

I believe the DAC cannot reconstruct data coming from an S/PDIF port if it gets corrupted; there isn't enough redundant information for that. There is a single parity bit that allows the receiver to detect single-bit errors, but that's all. I think andyjm summarized the pros and cons very well in his earlier post.

I also agree with him that errors are very unlikely in this kind of point-to-point link. When errors do occur, they won't normally be confined to single frames, and so they should be clearly audible.

And for some reason it is generally assumed that jitter affects TOSLINK more than coaxial connections. I am not saying that this is not true, but honestly, I don't know why. The receiver's dependence on the clock of the source to reconstruct the audio for real-time playback is exactly the same whether the link is electrical or optical.
 

andyjm

New member
Jul 20, 2012
unsleepable said:
And for some reason it is generally assumed that jitter affects TOSLINK more than coaxial connections. I am not saying that this is not true, but honestly, I don't know why. The receiver's dependence on the clock of the source to reconstruct the audio for real-time playback is exactly the same whether the link is electrical or optical.

S/PDIF is sent as a self-clocking bitstream; the clock is embedded in the data. This is done using a technique called 'biphase mark encoding' (a close relative of Manchester encoding). In very simple terms, the bitstream changes state at the end of every bit, but also changes state in the middle of every '1'. One implication is that '0's effectively have half the frequency of '1's.
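
A toy encoder makes the rule easy to see. This is just a sketch of the coding scheme as described above, with two output levels per bit cell:

def bmc_encode(bits, level=0):
    # Biphase mark: a transition at every bit-cell boundary, plus an
    # extra mid-cell transition for a '1'. Two half-cells per bit.
    out = []
    for bit in bits:
        level ^= 1           # boundary transition, every bit
        out.append(level)
        if bit:
            level ^= 1       # mid-cell transition, '1's only
        out.append(level)
    return out

print(bmc_encode([0, 0, 1, 1]))   # [1, 1, 0, 0, 1, 0, 1, 0]

Note how the '0's hold their level for a whole cell while the '1's toggle every half cell - the double-frequency effect described above.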

Every system is band limited, and in a band-limited system the detection point of a '0' can be different from the detection point of a '1', given the double-frequency nature of the encoding. This can lead to jitter being introduced into the recovered clock. For those interested:

http://audioworkshop.org/downloads/AES_EBU_SPDIF_DIGITAL_INTERFACEaes93.pdf

Anyway, the upshot is that TOSLINK implementations in audio equipment generally have worse frequency response than coax (of the data link, not of the resulting audio) because of the additional conversion steps, and this worse frequency response translates into additional jitter on the recovered clock.

To make matters worse, this jitter is 'code correlated', which means it is related to the programme signal in some way and is not an entirely random process. Correlated jitter has been found to be more easily detected than random jitter.

The truth is that the S/PDIF interface was good for its time, but it is now past its sell-by date. The clock in a DAC needs to be next to the D2A converter chip, and that means the DAC needs flow control on the incoming data. Async USB, or some other protocol with flow control, is the way to go.
 

DocG

Well-known member
May 1, 2012
andyjm said:
The truth is that the S/PDIF interface was good for its time, but it is now past its sell-by date. The clock in a DAC needs to be next to the D2A converter chip, and that means the DAC needs flow control on the incoming data. Async USB, or some other protocol with flow control, is the way to go.

Hi Andy,

I read that the I2S signal, as used inside a CDP between the transport electronics and the DAC chip, is superior to S/PDIF because it sends the clock separately. Could that give the CDP an edge over a transport + DAC combination? Could I2S be a way forward? Some DACs (PS Audio, Wyred4Sound, ...) have an I2S input over HDMI.
 

andyjm

New member
Jul 20, 2012
DocG said:
I read that the I2S signal, as used inside a CDP between the transport electronics and the DAC chip, is superior to S/PDIF because it sends the clock separately. Could that give the CDP an edge over a transport + DAC combination? Could I2S be a way forward? Some DACs (PS Audio, Wyred4Sound, ...) have an I2S input over HDMI.

In an ideal world, the clock used to drive the D2A chip in a DAC would be completely stable, with no jitter.

Putting a high-quality clock anywhere except right next to the D2A chip is just asking for trouble. Putting the clock in a different box, embedding it in the data, and then sending it down an S/PDIF link is not a recipe for success. Pretty much anything is better than that.

So there are good reasons why a CDP could be better than a separate DAC if it is able to manage the clock better (via I2S or whatever), in the same way that a streamer with a built-in DAC could be better than a separate DAC.

If the desire is to have a separate DAC, the clock should be in the DAC and not in the transport (be it CD, streamer or whatever). This means the DAC needs to be able to control the rate at which data is fed to it, and async USB is the easy and obvious way to do that.
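
As a sketch of the idea (purely illustrative, not any real USB audio driver API), the DAC drains a buffer at its own clock and tells the source how much to send next:

from collections import deque

class DacBuffer:
    def __init__(self, capacity=4096):
        self.buf = deque()
        self.capacity = capacity

    def feedback(self):
        # The flow-control message: how many samples the source
        # should send in its next packet.
        return self.capacity - len(self.buf)

    def fill(self, samples):
        # The source honours the feedback, so the buffer cannot overrun.
        self.buf.extend(samples[:self.feedback()])

    def tick(self):
        # Called once per period of the DAC's local, low-jitter clock;
        # an empty buffer (underrun) just plays silence.
        return self.buf.popleft() if self.buf else 0

Because the conversion rate is set entirely by the DAC's local clock, the source's timing no longer matters; it only has to keep the buffer topped up.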
 

davedotco

New member
Apr 24, 2013
Interesting and informative as ever, Andy.

None of this is exactly new, though. Many will remember the Audio Alchemy DAC-in-the-Box of the early 90s; the next model up the range included an I2S interconnect and a reclocking device to sit between it and the transport, i.e. transport to processor via S/PDIF, signal 'reclocked' and transferred to the DAC via a 3-pin (not XLR) I2S cable. That was at least 20 years ago!

There is also the issue of impedance: standard RCA digital interconnects do not meet the correct 75 ohm impedance, which in theory can lead to increased jitter. Just another blind spot that the industry has for certain aspects of digital audio.
 

busb

Well-known member
Jun 14, 2011
My M-DAC has a clock output for any transport capable of syncing to it. The advantage of a separate clock is that every transition is synced, whereas clock recovery only works well with very short runs of ones or zeros (I'm trying to remember my datacomms theory, but seem to recall the ideas of RZ, NRZ and complementary data streams that use both positive- and negative-going pulses - must refresh my knowledge).

Some engineers have stated that moderate jitter is inaudible - I don't have sufficient knowledge at this juncture.
 

cheeseboy

New member
Jul 17, 2012
busb said:
My M-DAC has a clock output for any transport capable of syncing to it. The advantage of a separate clock is that every transition is synced, whereas clock recovery only works well with very short runs of ones or zeros (I'm trying to remember my datacomms theory, but seem to recall the ideas of RZ, NRZ and complementary data streams that use both positive- and negative-going pulses - must refresh my knowledge).

Some engineers have stated that moderate jitter is inaudible - I don't have sufficient knowledge at this juncture.

Master clocks are only really needed in complex setups. As far as computers and digital audio go, playback is one of the least complex tasks a computer has to do, so worrying about jitter in a case like that is like worrying that a tsunami is coming because there are a few spots of rain. There's a really good article on master clocks and suchlike here: http://www.soundonsound.com/sos/jun10/articles/masterclocks.htm
 

andyjm

New member
Jul 20, 2012
So, a 'master clock' is used in a studio environment where multiple digital devices need to stay in step with each other. While it may be (and hopefully is) a very accurate clock, its purpose isn't primarily jitter reduction; it is about ensuring overall studio synchronisation.

A DAC with a 'wordclock' or similar output offers a very crude form of flow control. The idea here is that the DAC has a high-quality clock located adjacent to the D2A converter chip to control the conversion rate (which is a good thing). The problem is that the source (streamer, CD player, PC or whatever) may send data a bit too fast or a bit too slow, resulting in buffer over-runs or under-runs in the DAC. To control this, the DAC sends a pulse out of the wordclock output for every sample word, to keep the sending device in step.

The problem with this sort of wordclock interface is that there is no real standard, and it needs another connector from the DAC to the source. Much better to use an existing, widely available, cheap, bi-directional interface with flow control built in: USB.
 

andyjm

New member
Jul 20, 2012
cheeseboy said:
busb said:
Some engineers have stated that moderate jitter is inaudible - I don't have sufficient knowledge at this juncture.

As far as computers and digital audio go, playback is one of the least complex tasks a computer has to do, so worrying about jitter in a case like that is like worrying that a tsunami is coming because there are a few spots of rain.

Jitter for beginners.

There are posters on this site who fret about 16 bits vs 24 bits of resolution, and to be fair, the mental image of a sine wave split into little steps is quite compelling. More steps is good, because it leads to a more accurate representation of the original signal.

So picture that sine wave on a piece of graph paper. To sample the sine wave, each time there is a tick on the X axis you read off the height of the wave and make a note of it. Obviously, getting the height measurement right is key, and this is where the number of bits of resolution comes in. But what if the ticks on the X axis aren't quite in the right place, some a little to the left of where they should be and some a little to the right? Then you will measure the height of the sine wave in slightly the wrong place, and you will record slightly the wrong height.

This is what jitter does: it means either the original sample is taken in the wrong place (during recording) or the right sample is played back in the wrong place (during playback). It is just as bad as having the wrong sample value, and with some smart maths a given level of jitter can be translated into a reduction in signal-to-noise ratio, or equally a reduction in the effective number of bits of resolution of the system.
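
For the numerically minded, here is a rough simulation of that maths (figures are illustrative): sample a 10 kHz sine at jittered instants and compare the measured error power with the usual rule of thumb, SNR = -20*log10(2*pi*f*sigma):

import math, random

def jitter_snr_db(f_sig, sigma):
    # Theoretical SNR limit for a full-scale sine of frequency f_sig
    # sampled with RMS timing jitter sigma.
    return -20 * math.log10(2 * math.pi * f_sig * sigma)

def simulate(f_sig=10e3, fs=44.1e3, sigma=1e-9, n=100_000):
    # Take samples at ticks that are each slightly in the wrong place.
    sig_pow = err_pow = 0.0
    for k in range(n):
        t = k / fs
        ideal = math.sin(2 * math.pi * f_sig * t)
        real = math.sin(2 * math.pi * f_sig * (t + random.gauss(0, sigma)))
        sig_pow += ideal * ideal
        err_pow += (real - ideal) ** 2
    return 10 * math.log10(sig_pow / err_pow)

print(jitter_snr_db(10e3, 1e-9))   # ~84 dB
print(simulate())                  # the simulation lands in the same region

One nanosecond of RMS jitter on a 10 kHz tone limits you to roughly 84 dB - closer to 14-bit than 16-bit performance.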

For those who have bothered to follow this far, it gets more complicated. If you use a dud ruler to measure the height of the wave at each sample point, you will get the height wrong, but consistently wrong. Jitter is more like measuring the height of the wave when you are drunk: you may get the right height, but you may be a bit off. This is a random (or quasi-random) process, and its distribution can significantly affect how noticeable it is.

So don't dismiss jitter. It is tougher to grasp, but it is just as important as the number of bits of resolution, dynamic range and frequency response. Gross levels of jitter are definitely audible, but there are many arguments about the level at which it really becomes audible. There are many studies on the net on this subject, some of them many years old and, in my opinion, of limited use. It is certainly true that decently designed modern equipment has levels of jitter that most engineers agree are inaudible, but then most engineers also agree that 24 bits and a 96 kHz sample rate are a waste of time.
 

busb

Well-known member
Jun 14, 2011
andyjm said:
So don't dismiss jitter. It is tougher to grasp, but it is just as important as the number of bits of resolution, dynamic range and frequency response. Gross levels of jitter are definitely audible, but there are many arguments about the level at which it really becomes audible.

Just how much jitter causes a deterioration in sound can be measured objectively in the digital domain - introduce jitter (similar to FM modulation) up to the point where the bitstream ceases to be bit-perfect. That seems a viable measurement?
 

andyjm

New member
Jul 20, 2012
busb said:
Just how much jitter causes a deterioration in sound can be measured objectively in the digital domain - introduce jitter (similar to FM modulation) up to the point where the bitstream ceases to be bit-perfect. That seems a viable measurement?

16/44.1 has a bit rate of approx 1.4 Mb/s, which gives a bit period of about 700 ns. Although S/PDIF doesn't work quite like this, if the sample clock were jittered by more than half a bit period (350 ns), this could lead to bit errors.

Modern digital audio gear measures with jitter in the sub-ns range. You have to be careful, as jitter is a random process and many manufacturers aren't clear whether they are quoting peak jitter or RMS jitter. The Squeezebox Transporter has the lowest jitter of any device I have seen (in a spec) at 15 ps RMS; I recall the Sonos Connect (or whatever it is called now) comes in at around 200 ps RMS.
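
Putting those numbers side by side (back-of-envelope only; the real S/PDIF line rate is higher because of subframe overhead and the biphase mark coding):

bitrate = 44_100 * 16 * 2        # payload view of 16/44.1 stereo: ~1.41 Mb/s
bit_period = 1 / bitrate         # ~709 ns
threshold = bit_period / 2       # ~354 ns of jitter before bit errors

for name, jitter in [("Transporter", 15e-12), ("Sonos", 200e-12)]:
    print(name, round(threshold / jitter), "x below the bit-error threshold")
# Transporter: ~23,600x of margin; Sonos: ~1,770x.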

None of these numbers would lead to bit errors. As I have posted before, for short runs either coax or optical S/PDIF is 100% error free.

As for when jitter becomes audible, there are studies out there that imply anywhere between hundreds of ns and hundreds of ps.
 
