Hi,
I am a bit confused about the issue of the quality of digital cables (optical, coax, HDMI).
I don't understand how the quality can differ. Digital transfer uses a set of discrete bits (1s and 0s). The various protocols and standards define how the data is carried over these cables to ensure there is no change in the data going in one end and coming out the other, through data redundancy and error correction algorithms. These algorithms are there to catch interference, and if there is any, the redundancy in the protocol lets the interface at the receiving end correct it.
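To show the sort of thing I'm picturing, here's a rough sketch of a checksum scheme (purely illustrative, not any particular cable standard): the sender appends a CRC, the receiver recomputes it, and a mismatch flags corrupted bits.

```python
# Hypothetical sketch of error detection over a digital link,
# not any real cable protocol (S/PDIF, HDMI, etc.).
import zlib

def send(payload: bytes) -> bytes:
    # Append a CRC-32 of the payload so the receiver can verify integrity.
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc

def receive(frame: bytes) -> bytes:
    # Recompute the CRC; a mismatch means bits were corrupted in transit.
    payload, crc = frame[:-4], frame[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") != crc:
        raise ValueError("bit error detected in transit")
    return payload

frame = send(b"\x01\x00\x01\x01")   # the data is just discrete bits/bytes
assert receive(frame) == b"\x01\x00\x01\x01"
```

My assumption is that something along these lines means the cable either delivers the exact same bits or the error gets caught and handled.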
Can someone educate me on where my understanding is going wrong?
Thank you