[quote user="bloney"]If I remember correctly Hamming code can recover a certain number of lost bits at, I presume, some overhead rather than just ask for the data to be re-sent. There's obviously lots of algorithms out there in the data transmission world for error correction that you'll certainly understand better than me, but I'd be interested to know if/how it's done in audio.[/quote]
I very much doubt I know audio better than you. You raise a very good point: it's certainly possible to encode some redundancy in the signal, so that retransmission becomes unnecessary.
A simple retransmission protocol might go: "data data data checksum". If the checksum doesn't match some simple function of "data data data", you ask for a retransmission. A more sophisticated protocol might send "data data data data data data checksum", where the checksum would not only show that one of the data elements had been corrupted in transmission, but would also let you calculate, from the combination of the checksum byte and the surviving data, which byte had been corrupted and what it would have had to be for the checksum to match (strictly, locating an unknown error takes a few check bits rather than one, which is exactly what a Hamming code provides; a single parity byte is enough when you already know which element is missing). For a well-known real-world example, that's what happens in a RAID5 drive array, where any one of (say) four hard drives can fail and yet the whole array can be rebuilt from what's left.
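To make that concrete, here's a toy sketch of the RAID5-style parity idea: one extra XOR byte per stripe lets you rebuild any single lost byte exactly, provided you know which one is missing (the failed-drive case). It's purely illustrative Python: the byte values and function names are made up, and I'm not claiming HDMI or any real RAID controller is written like this.

[code]
# Toy sketch: one parity byte per stripe of data bytes. If we know WHICH
# byte was lost (an "erasure", like a failed drive), XOR-ing the survivors
# with the parity recovers it exactly. Illustrative only.

from functools import reduce

def make_parity(data):
    """Parity byte = XOR of all data bytes."""
    return reduce(lambda a, b: a ^ b, data, 0)

def rebuild(data, parity, missing_index):
    """Recover the byte at missing_index from the survivors plus the parity."""
    survivors = [b for i, b in enumerate(data) if i != missing_index]
    return reduce(lambda a, b: a ^ b, survivors, parity)

stripe = [0x41, 0x42, 0x43, 0x44]      # four "drives" holding A, B, C, D
p = make_parity(stripe)                # stored on the parity drive

lost = 2                               # pretend drive 2 has failed
recovered = rebuild(stripe, p, lost)   # rebuilt from the other three + parity
assert recovered == stripe[lost]       # 0x43 comes back bit-for-bit
[/code]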
That said, whether we're talking about a simple checksum and retransmission, as I originally suggested, or about a certain level of redundancy in the data transmission that avoids the need for retransmission, either scheme produces a bit-for-bit perfect replication of the data. We are NOT talking about "guessing" what the corrupted data might have been, along the lines of "data1 missing-data data3; oh well, let's assume missing-data is half-way between data1 and data3"; that would indeed give a degraded picture. I admit, I don't know that HDMI uses a scheme like the ones you or I suggested rather than the crappy "guessing" scheme I just described, but I'm confident the last scheme is not what is done, simply because it would be crap, because the better methods are very well known, and because the better methods would be easier to implement than the poor one.
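Since bloney mentioned Hamming codes specifically, here's a minimal Hamming(7,4) sketch showing that "bit-for-bit perfect" point: a single flipped bit is located and flipped back, so the receiver ends up with exactly the bits that were sent rather than an estimate. Again this is just illustrative Python with made-up names, and I'm not claiming HDMI uses this particular code.

[code]
# Toy Hamming(7,4): 4 data bits protected by 3 parity bits. Any single
# flipped bit can be located and corrected, so the decoded data is
# identical to what was sent -- no guessing, no retransmission.

def encode(d):
    """d = [d1, d2, d3, d4] -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # checks codeword positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # checks positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # checks positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means "no error detected"
    if error_pos:
        c[error_pos - 1] ^= 1         # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]

sent = [1, 0, 1, 1]
word = encode(sent)
word[5] ^= 1                          # corrupt one bit in transit
assert decode(word) == sent           # exact data recovered
[/code]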
In case anyone misunderstands me, I am not saying higher quality HDMI cable has no value. Any signal attenuates over distance, and it attenuates faster (i.e. over a shorter distance) with a worse quality cable. I'm just saying that over a given distance your cable is either working well or obviously broken. A high quality cable will work over a longer distance before it appears broken than a poor quality one will.
I've every respect for the What Hi-Fi editorial team (I wouldn't be here otherwise), but I'm really struggling to reconcile what I know about the principles of digital signal transmission with claims that high quality digital cables are as important as high quality analogue cables, regardless of distance. Until I hear a supporting rationale that at least addresses the (pretty bog-standard orthodox) theory of signalling I've set out above, the more convincing hypothesis to me is that subjective claims that "the more expensive cable looked better to me" are merely examples of the placebo effect. In saying this I don't mean to be rude to the WHFSV team, or to others who maintain there is a difference in performance, but please understand that (without further information) their claim amounts to an accusation that the HDMI specification committee must have been idiots, unaware of well-known principles of digital signalling. You just don't get on a spec committee if you're an idiot (well, maybe a DRM committee).