Digital cable debates. Have we been barking up the wrong tree?

idc

Well-known member
EDIT - I enjoyed the hours of reading and checking that went into this thread, but on posting I realised it is wrong to make people read for hours to understand my thoughts. So I have made this summary.....

It is not just about 1s and 0s and some saying they have heard differences despite the 'science'. We need to look at the effects of jitter, which is variation in the timing of the sending of the 1s and 0s, to find a variable that can account for both sides of the argument being right or wrong, because both camps have been barking up the wrong tree.

The detail as to why.......

So often digital cable debates are dominated by two camps and descend into circular arguments. One states that it is just 1s and 0s and the other claims they can hear a difference.

I cannot reconcile numerous credible reports from music publications such as What Hifi and forum members who say they can hear a difference, with those equally credible people who say they hear no difference and credible arguments that show there is no difference. So I have been reading, and this is what I have come up with.

I am not a scientist. My only experience of different digital cables is the various USB cables I have connected my laptops to DACs with. I cannot say I have noticed any difference. I have with analogue cables. I do listen analytically, I have good hearing, and I can pick out different codecs and bit rates used in music files. I have tester tracks that I am very familiar with to listen for differences. So I am open minded to both sides' arguments.

The main argument that digital cables are the same is that it is just 1s and 0s. For example, you do not have any issues sending a document to a printer down a USB cable. The reason for that is error correction ensures the correct 1s and 0s are sent. If there is an error, data is resent or it breaks up, and you would really notice that happening. If the signal does not break up, error correction is working fine and the cable has no effect on this. The same applies to both audio and visual digital data transmission. Here I concentrate on audio. I have read nothing that suggests somehow 1s can become 0s or vice versa in a cable. Again, error correction stops that. So what are we left with that can be variable?

A word that kept coming up whilst I was reading is jitter. From an online dictionary jitter is “a flicker or fluctuation in a transmission signal or display image. The term is used in several ways, but it always refers to some offset of time and space from the norm. For example, in a network transmission, jitter would be a bit arriving either ahead or behind a standard clock cycle or, more generally, the variable arrival of packets”.
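To make that definition concrete, here is a toy sketch of my own (not a model of any real interface): ideal clock edges should arrive at exact multiples of the clock period, and jitter is each edge's small random offset from that grid. The 44.1 kHz rate and the 250 ps figure are purely illustrative.

```python
import random

# Toy illustration: ideal clock edges arrive every `period` seconds;
# jitter shifts each edge by a small random offset from that ideal grid.
def edge_times(n_edges, period, jitter_std, seed=0):
    rng = random.Random(seed)
    return [i * period + rng.gauss(0.0, jitter_std) for i in range(n_edges)]

period = 1 / 44_100                                 # one edge per CD-rate sample
edges = edge_times(5, period, jitter_std=250e-12)   # 250 ps RMS jitter (illustrative)

for i, t in enumerate(edges):
    offset = t - i * period        # how far this edge is from the ideal clock
    print(f"edge {i}: {offset * 1e12:+.1f} ps from ideal")
```

A receiver that latched data on these slightly early or late edges would sample the waveform at slightly wrong moments, which is the mechanism discussed below.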

The timing of the data sent is not important to print something. It just needs to be bit perfect and arrive intact. But timing is very important for music. Furthermore, with music there are very high rates of data being sent that need to be converted to an analogue signal to produce sound in real time, perfectly timed.

To deal with the timing of data transmission, 'clocks' are used. The clock is a crystal oscillator, where vibration is used to measure time. When data is transmitted, a clock signal is also sent. According to Remy Fouree in Stereophile, the clock signal is 'embedded in the pulse edges'. So you could say it is 1s and 0s and a tick that is actually transmitted. Any component involved in digital data transmission has a clock. A DAC has one, including the DAC in your iPod and CD player. For there to be no jitter these clocks have to be incredibly accurate and, if more than one is being used at a time, totally in time with each other.

But the clocks themselves can cause jitter, as can any part of the route a digital signal takes, from the moment an ADC converts the analogue signal to digital at recording to the point it is converted back to analogue.

So what is the audio effect of jitter? John Swenson from Computer Audio Asylum describes the effects as a "loss of inner detail" and "a flatter sound", with an "improvement in bass articulation" once jitter is reduced. Cambridge Audio states that jitter is a distortion which is audible, as the ear picks out timing errors and needs timing to place information correctly. So jitter "blurs the signal". Steve Nugent from Empirical Audio describes jitter as like looking through old, dirty glass compared with a modern, clean pane. Elias Gwinn from Benchmark also attributes degraded sound quality to jitter. A paper by Gannon Kashiwa describes it as 'noise', like an added voltage on an analogue signal. Discussions on various forums also credit jitter with an overall effect on sound quality, as opposed to it having a sound of its own.
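Kashiwa's description of jitter as noise like an added voltage can be illustrated with a toy calculation of my own (all numbers illustrative): if samples of a sine wave are taken at slightly wrong times, the resulting amplitude error behaves like a small added noise floor.

```python
import math
import random

# Sketch: compare a sine sampled at ideal times with the same sine sampled
# at jittered times; the RMS difference is the "noise" that jitter adds.
def sample_error_rms(freq, rate, jitter_std, n=10_000, seed=1):
    rng = random.Random(seed)
    err2 = 0.0
    for i in range(n):
        t_ideal = i / rate
        t_jit = t_ideal + rng.gauss(0.0, jitter_std)
        err2 += (math.sin(2 * math.pi * freq * t_jit)
                 - math.sin(2 * math.pi * freq * t_ideal)) ** 2
    return math.sqrt(err2 / n)

# A 10 kHz tone at a 44.1 kHz sample rate with 1 ns RMS jitter (illustrative):
e = sample_error_rms(10_000, 44_100, 1e-9)
print(f"RMS error: {e:.2e} (about {-20 * math.log10(e):.0f} dB below full scale)")
```

The error grows with both the jitter and the signal frequency, which fits the engineers' point that jitter has no sound of its own, only an effect on the signal.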

There is an argument that jitter is not audible below a certain rate, with details from Hydrogen Audio and numerous linked research papers, but there is no definite agreement on what that level is. There are also reports that jitter does not necessarily affect sound quality detrimentally, such as tests by John Swenson. He reports that some audio products which have higher jitter than others "sound better". Gannon Kashiwa points to a lack of consistency between the theory and actual experience. Steve Nugent points out that such study is still in its infancy. But there is an acceptance that jitter and timing errors, no matter how small, are very likely to be detrimental to audio.

How can or do cables affect jitter? Elias Gwinn describes a cable’s effect on jitter as

“Different cables have different impedances (resistance, capacitance, and inductance), which forms a filter that attenuates high-frequencies. When a digital clock changes states (from a '1' to a '0', for example), it ideally changes instantaneously, but a lack of high-frequency capability causes the transition to be slower.
Also, noise creates inaccuracies during the transition. The most common culprit of noise is signal reflection. Signal reflection is minimized by proper impedance termination and the proper characteristic impedance of the cable. However, poor cable shielding will also allow EMI noise to impurify the signal”
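Gwinn's point about slower transitions can be sketched with a simple single-pole RC model (an assumption of mine; a real cable is more complex): the lower the bandwidth, the later and shallower the edge's crossing of the receiver's decision threshold, and the more any noise voltage translates into timing error.

```python
import math

# Toy model: treat the cable as a single-pole RC low-pass driven by a
# clean logic step. The receiver decides '0' vs '1' when the voltage
# crosses a halfway threshold.
def threshold_crossing(tau, threshold=0.5):
    # Step response v(t) = 1 - exp(-t/tau); solve v(t) = threshold for t.
    return -tau * math.log(1.0 - threshold)

def slope_at_crossing(tau, threshold=0.5):
    # dv/dt at the crossing; a shallower slope means more timing error
    # per volt of noise riding on the edge.
    return (1.0 - threshold) / tau

for bandwidth_mhz in (100.0, 10.0):    # illustrative bandwidths
    tau = 1.0 / (2 * math.pi * bandwidth_mhz * 1e6)
    noise = 0.01                       # 1% of the signal swing as noise
    print(f"{bandwidth_mhz:5.0f} MHz: crossing at "
          f"{threshold_crossing(tau) * 1e9:6.2f} ns, timing error per 1% "
          f"noise ~ {noise / slope_at_crossing(tau) * 1e12:6.1f} ps")
```

Cutting the bandwidth by a factor of ten makes the crossing ten times later and the same noise voltage worth ten times more picoseconds of jitter, which is the mechanism Gwinn describes.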

Steve Nugent describes cable effects as not adding to jitter, but cables can slow the signal transitions (the difference between 1s and 0s) which makes it harder for the DAC receiving the data to get the timing correct.

TNT Audio refer to “Line Induced Jitter” where bandwidth limits make transitions slower than ideal and “Interfering-noise induced Jitter” such as the effect of EMI on cables.

Remy Fouree refers to "electrical noise", "bandwidth limitation" and "mismatch between transmission and receiver impedance". He goes on to say that any two cables with the same impedance, no matter the cost, will have the same effect on jitter in the same setup. It appears to be down to luck whether you find the $3 or the $300 cable first.
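The impedance-mismatch point rests on a standard transmission-line result: the fraction of an incident edge reflected at a termination is (Z_load - Z_line) / (Z_load + Z_line). A tiny sketch (75 ohms is the nominal S/PDIF coax impedance; the mismatch values are illustrative):

```python
# Standard transmission-line reflection coefficient at a termination.
def reflection_coefficient(z_load, z_line):
    # Fraction of the incident edge reflected back down the cable.
    return (z_load - z_line) / (z_load + z_line)

# S/PDIF coax is nominally 75 ohm; compare a matched and two mismatched loads:
for z_load in (75.0, 50.0, 110.0):
    g = reflection_coefficient(z_load, 75.0)
    print(f"load {z_load:5.1f} ohm on a 75 ohm line: "
          f"{abs(g) * 100:4.1f}% of the edge reflected")
```

A matched load reflects nothing, which is why Fouree can say any two cables of the correct impedance behave alike in the same setup, regardless of price.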

So I do not have a conclusive answer to the digital cable debates. But there is enough evidence to say that the real discussion about digital cables should be about jitter and its effect on data transmission.

In particular, with cables it should be about how much the cable affects the transmission of data, which in turn affects timing and so causes jitter. A perfect cable will have no effect on the jitter contained in the data signal sent and should be a perfect match between sender and receiver.

Here comes the potential irony.

It is debatable whether zero jitter is possible, but if you have a very low jitter transport with a very accurate clock sending a low jitter signal to another accurate clock in a very good DAC, the effect of a cable could go either way. It may have no effect, as it too causes no jitter. Or it may be the cause of a lot of jitter. Different systems will produce different amounts of jitter. Some will possibly even find that jitter improves the sound for them.

Benchmark are confident enough in their DAC’s ability to deal with jitter that they say it does not matter what transport or cable you use. So, as DACs get better at rejecting jitter, the case for buying a ‘better’ digital cable or claiming different cables have different effects weakens.

That helps to explain why some people hear differences in cables and others do not, and as knowledge of jitter and its effects improves, potentially both sides are correct! Though to prove that we would need a massive test of each person's susceptibility to noticing jitter and of how much jitter is present in their system, and in particular in the cable used.

Sources - I have not linked to any of the sources I have used as they are primarily found through other forums or may be copyrighted (I have not quoted from any source that is clearly marked copyrighted). If you google terms like ‘jitter‘, ‘what does jitter sound like?’ and the names and companies I have referred to you should be able to track them down. The quote from Benchmark’s Elias Gwinn was an answer to a question I put to him on a forum where I have also communicated with Empirical Audio‘s Steve Nugent. I actually have no idea who any of these people really are, but they are referred to a lot in the reading I have done, so I assume they know their stuff enough to satisfy those who want proof and science to the debate.
 

daveh75

Well-known member
Shame on you,Ian.
 
A

Anonymous

Guest
One correction: digital does not necessarily imply error correction and resending data. AFAIK USB audio and SPDIF, as well as digital video through DVI and HDMI, are not error corrected, and rely on a sufficiently high SNR that the coded 1s and 0s can still be recovered without error at the other end. Video signals especially are sensitive to degradation in long or badly shielded cables, leading to visible artifacts.

TCP/IP streaming is error corrected, so if there is a failure packets are resent, possibly causing dropouts if this happens too frequently (eg a bad wireless connection).

The jitter problem is different for SPDIF and USB. If a computer generates the SPDIF signal, the amount of jitter depends on the quality of the audio card/circuitry; the DAC is then slave to the rhythm. With USB, only an approximate clock can be recovered from the data chunks that arrive every ms (there is no pulse signal in the protocol/interface). The signal has to be reclocked in the DAC, but still the DAC's clock and the computer sending the data (using its clock) can get slightly out of sync, requiring very slick methods for the USB>DAC circuitry to adjust it occasionally (I posted earlier how Japanese engineers solved this: http://community.whathifi.com/forums/post/309078.aspx). I believe these adjustments due to having two clocks in the system are infrequent and not audible, and the remaining jitter is the standard jitter of the crystal generating the clock in the DAC. At least MF believes that this jitter (for the VDAC) is extremely low.
 

idc

Well-known member
Thanks for the link Pete. That thread got me started on what has been a very interesting read. It was too technical for me at first; I understand it now after much reading.

Hopefully the opening post is a very non-technical explanation of how digital cables and their role in the transmission of digital data are not just about 'error correction' and '1s and 0s'. My argument is that jitter has been overlooked by many and it could well be the reason why digital cables can affect sound.

A key point was the one made by Remy Fouree in Stereophile that, with the correct impedance, it does not matter how much a cable costs. That alone explains why some report differences and others do not, and why some report that a more expensive cable is worth buying and some do not. It is their luck as to whether the cable they have works within their system.

The other key point is that as jitter is dealt with more and more, the role of cables will become less and less.
 
A

Anonymous

Guest
Or try connecting your CD player to your DAC with a cheap + very long RCA cable and listen to the jitter.
 
A

Anonymous

Guest
I'm super non-technical, so is there a short answer to this? I use a USB connection from my laptop to my soundcard (Serato Scratch D/A converter), then to the preamp (mixer).
Is the type of USB cable I use going to make a difference, or not?
If I ask at a computer shop, the answer is no, cables make no difference. But as mentioned in the original post, timing of data transmission isn't so important in most computer applications - just accuracy. Whereas it's critical for my setup (not only playing music, but matching two tracks, with timecode information from the vinyl transmitted to the PC via the soundcard, then back to the soundcard as a music file).
I ask as there have been a couple of occasions where the whole thing has gone completely *** up in the middle of playing a track - it just judders and stops - but only on one channel, and only for a couple of seconds. This has happened while playing out, which is obviously not ideal - and only after playing for an extended period (as in several hours).
So is it the programme/computer (which I'll say I can't change), or is it possibly in the cable?
Apologies that this is not about 'proper' or conventional hifi, but I think it's a good forum to discuss anything to do with sound quality.
And in case you don't know how Serato works in terms of how it's set up, or are at all interested, take a look at someone's home-made explanation at http://www.youtube.com/watch?v=InsEDIGYlo4 or go to serato.com. (Yes, they all seem to use rubbish cables in their demo setups. I upgraded to Atlas Equators and it made a massive difference.)
 

Dan Turner

New member
Well done IDC for a well researched and reasoned contribution to the debate. This is what we need, when all too often we end up with people trying to steam-roller their point of view across by insulting anyone that disagrees with them.
 

fatboyslimfast

Well-known member
No - Silverbirch are brighter than normal (copper?) birches.

Apologies for that, couldn't resist

Anyway, it was a really interesting read, and does go some way to clarifying why some digital cables - in the right situations - can sound different.
 

idc

Well-known member
Dom Tych:I'm super non-technical, so is there a short answer to this? I use a USB connection from my laptop to my soundcard (Serato Scratch D/A converter), then to the preamp (mixer).

Is the type of USB cable I use going to make a difference, or not?

If I ask at a computer shop, the answer is no, cables make no difference. But as mentioned in the original post, timing of data transmission isn't so important in most computer applications - just accuracy. Whereas it's critical for my setup (not only playing music, but matching two tracks, with timecode information from the vinyl transmitted to the PC via the soundcard, then back to the soundcard as a music file.

I say that the answer to that is.........maybe, depending on the jitter in your computer, the construction of your cable and the DAC. The thing about jitter is that it is everywhere, at all stages. So you may have a jitter-free cable, but would never know, because in a way it is not important. If your DAC (or soundcard) copes very well with jitter and reduces it to below an audible level, so what? You may even prefer the sound with jitter!

The computer shop are right that it makes no difference. In their applications for a USB cable it doesn't make a difference, as error correction and getting the data there are far more important. Even if they are dealing with a sound application, so what if there is a bit of jitter lowering the sound quality? They will never listen out for that issue.

Dom Tych:

I ask as there have been a couple of occasions where the whole thing has gone completely *** up in the middle playing a track - it just judders and stops - but only on one channel, and only for a couple of seconds. This has happened while playing out which is obviously not ideal - and only after playing for an extended period (as in several hours).

So is it the programme/computer (which I'll say I can't change), or is it possibly in the cable?

Apologies that this is not about 'proper' or conventional hifi, but I think it's a good forum to discuss anything to do with sound quality.

And in case you don't know how Serato works in terms of how it's set up, or are at all interested, take a look at somoene's home-made explanation at http://www.youtube.com/watch?v=InsEDIGYlo4 or go to serato.com.
(yes they all seem to use rubbish cables in their demo setups. I upgraded to Atlas Equators and it made a massive difference)

I would say it is not the cable per se, though you should check your cables and all connections as something may be faulty or loose. That is because dropouts and faults are not really attributable to jitter; something has gone wrong with the transmission of the data as a whole. So you do need to look at the sender (computer) and receiver (soundcard).
 

Xanderzdad

New member
Thanks for a useful and constructive take on the whole 'heretical' debate around cables. I think you are onto something.

I would also say that people's ears are incredibly complex instruments and a lot of the debate stems from there. We all 'like' different things and hearing is subjective. How often do you notice a part of a music track that nobody else was aware of, and yet to you it was something that 'made' the song?
 

idc

Well-known member
It has been a labour of love, spurred on when my laptop refused to link to the forum for a couple of days, so I had the incentive to write up all of the stuff I had saved and read.

Xanderzdad, I agree with your point about the ears, and a lot of my reading and googling was to find out if jitter is audible and, if so, what it sounds like. Hearing background detail that before was unclear, or not there at all, has to be seen as an improvement in sound.

There was a lot of 'opinion' and downright nonsense, but the message from the sources quoted above (ie the engineers, as opposed to reviewers and general punters) is that you do not hear jitter, you hear its effects. There was nothing to suggest removal of jitter would increase background detail or deepen bass or smooth treble or anything like that. It did seem to me that such changes would be more down to the system as a whole and not necessarily related to jitter. If anything, removal of jitter would just make the detail, bass and treble sound different, and probably better.

It is that elusiveness of a sound, and the difficulty of quantifying it in terms of sound quality, that means I am not saying for one moment that jitter is the answer to the digital cable debates. However, we know how much jitter is there, as it can be measured, and it would appear to be a bad thing, as very high jitter levels will make music sound wrong.
 
A

Anonymous

Guest
idc: I would say it is not the cable per se, though you should check your cables and all connections as something may be faulty or loose. [...] So you do need to look at the sender (computer) and receiver (soundcard).

cheers for that. I'll go out and buy a new 'standard' cable (and give my printer back the cable i'd pilfered).

d
 

idc

Well-known member
Before this post drifts off, please remember it and feel free to link to it should cable debates in the future descend into the round-and-round debates of old. It will at least give a new dimension to such debates.
 
