Digital Cable Confusion

tigetspill

New member
Aug 15, 2009
Hi,

I am a bit confused about the issue of the quality of digital cables (optical, coax, HDMI).

I don't understand how the quality can differ. Digital transfer uses a set of discrete bits (1s and 0s). The various protocols and standards define how the data is carried over these cables to ensure there is no change in the data going in one end and coming out the other, through data redundancy and error correction algorithms. These algorithms exist to guard against interference, and if there is any, the redundancy in the protocols allows it to be corrected by the interface at the receiving end.
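
To illustrate what I mean by redundancy allowing correction, here is a toy sketch (my own illustration in Python, not what any actual cable standard uses) of a Hamming(7,4) code: three parity bits are added to every four data bits, so the receiving end can both detect and repair a single flipped bit.

    def hamming_encode(d):
        # d is a list of 4 data bits; add 3 parity bits (positions 1, 2, 4)
        p1 = d[0] ^ d[1] ^ d[3]
        p2 = d[0] ^ d[2] ^ d[3]
        p3 = d[1] ^ d[2] ^ d[3]
        return [p1, p2, d[0], p3, d[1], d[2], d[3]]

    def hamming_decode(c):
        # Recompute the parities; the syndrome points at any flipped bit
        s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
        s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
        s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            c[syndrome - 1] ^= 1          # correct the single-bit error
        return [c[2], c[4], c[5], c[6]]   # recover the 4 data bits

    codeword = hamming_encode([1, 0, 1, 1])
    codeword[4] ^= 1                      # simulate interference flipping a bit
    print(hamming_decode(codeword))       # -> [1, 0, 1, 1], error corrected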

Can someone educate me on where my understanding is going wrong?

Thank you
 

Anonymous

Guest
Join the club - the queue starts around Jupiter (on a good day)
 

Anonymous

Guest
I could be wrong, but I believe the merits of differing cable quality have possibly been discussed before. I'm sure if you do a search for cables you'll see what I mean.

I suggest in your next post you politely ask about the merits of power cables.
 

professorhat

Well-known member
Dec 28, 2007
I'm going to offer this - yes, a lot of digital information transfers use protocols which guarantee that any data lost or corrupted during sending is resent, ensuring it is received exactly as it was sent.

Unfortunately for audio / video data sent via digital cables, the information is not wanted a few seconds after transmission; it's wanted as close to instantly as can be imagined. So if information is lost or corrupted on the way, the amplifier or TV (or whatever else we're talking about) doesn't have the luxury of saying "Sorry, didn't quite get that, can you send it again?" Instead it has to make it up.

And hence, better cables = less loss / corruption = less error correction (or guesswork) = better picture / sound.
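
To make "make it up" concrete, here's a toy concealment sketch (purely my own illustration in Python, not any real amplifier's algorithm): a corrupted audio sample is replaced with a guess interpolated from its neighbours, because there is no time to ask for a resend.

    def conceal(samples, bad_index):
        # Guess the mangled sample by averaging its neighbours
        samples[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) // 2
        return samples

    received = [100, 104, 9999, 112, 115]   # 9999 = sample corrupted in transit
    print(conceal(received, 2))             # -> [100, 104, 108, 112, 115]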

Probably the 50th time I've posted this, but I look forward to reading why I'm wrong for the 50th time.
 

Anonymous

Guest
Not quite right, prof - there's error correction.

http://www.xilinx.com/support/documentation/white_papers/wp270.pdf - also Google MPEG-4.
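
For instance, here's a toy sketch of packet-level FEC (illustrative Python only, not the actual scheme in that white paper): one extra packet carrying the XOR of the data packets is sent up front, so any single lost packet can be rebuilt with no resend at all.

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    packets = [b"AUDI", b"O-FR", b"AME1"]     # three data packets
    parity = b"\x00" * 4                      # extra FEC packet: XOR of all three
    for p in packets:
        parity = xor_bytes(parity, p)

    # Pretend packet 1 is lost in transit; rebuild it from the survivors.
    rebuilt = xor_bytes(xor_bytes(packets[0], packets[2]), parity)
    print(rebuilt == packets[1])              # -> True, recovered without a resend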
 

tigetspill

New member
Aug 15, 2009
OK, I have heard the real-time / instantaneous transmission argument before, but that also confuses me. There is no need for it to be instantaneous. All that is required is that it is perfectly in sequence when displayed. The picture on the TV could be shown several seconds after it is read from the source (Blu-ray, DVD, Sky box etc.) or transmitted to the display, and it may be buffered and processed by the TV (error correction / picture enhancement etc.).

In fact it would be a HUGE design flaw if it were required to be "instantaneous" - how would it cope with delays in disc reads from a Blu-ray or DVD? I would guess there is buffering in the electronics at both source and display.
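
Something like this toy Python sketch is what I have in mind (completely made up, not how any real TV is built): frames go into a FIFO a few seconds deep, so a slow disc read only shrinks the queue rather than freezing the picture.

    from collections import deque

    FPS = 25
    buffer = deque()

    def read_from_source(n):
        return f"frame-{n}"                 # stand-in for reading the disc

    for n in range(3 * FPS):                # pre-fill ~3 seconds before playback
        buffer.append(read_from_source(n))

    # Playback pops from the front while the source keeps refilling the back;
    # a brief read delay just shortens the queue instead of stalling the screen.
    print(buffer.popleft(), "|", len(buffer), "frames still buffered")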

PS. I am NOT saying you are wrong - I am genuinely trying to understand.

Also, I do appreciate that the best way to test this out is to try it. BUT I would find this difficult, as I would have to spend time re-plugging, and I don't believe I could remember the subtleties across those few minutes. Maybe that is the answer - if I can't do that, maybe it doesn't matter to me.
 

professorhat

Well-known member
Dec 28, 2007
one off: Not quite right, prof - there's error correction.

I know - that's kind of the basis of my point, i.e. less error correction required = better picture and sound.

As for it not being necessary for it to be instantaneous, it's a nice theory, but look at the outcry when one Blu-ray player takes 20 seconds longer to load a disc than another. So imagine adding another 20-30 seconds of waiting while a big enough buffer was built up to ensure the entire 2-3 hour movie could play, with any lost data resent without pausing the film while this process occurred (and I'm not even sure 30 seconds would be enough of a buffer). I can't see it being a popular feature!

Besides that, the buffer would need to be built into the TV, and the Blu-ray player would have to know the TV had such a buffer. Nothing works like that at the moment, so it would require TV and Blu-ray manufacturers to work together on a standard ensuring every Blu-ray player made from now on would work with every TV made from now on - again, an unlikely scenario.
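
As a back-of-the-envelope check on that 30-second figure (assuming a roughly 40 Mbit/s Blu-ray A/V stream, which is my own illustrative number):

    bitrate_mbit_s = 40                 # assumed Blu-ray bitrate, for illustration
    buffer_seconds = 30
    print(bitrate_mbit_s * buffer_seconds / 8, "MB")   # -> 150.0 MB of buffer memory

That's a lot of memory for a TV to dedicate to a feature nobody has asked for.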
 

Anonymous

Guest
Still wrong - look at the MPEG-4 standard and FEC in digital transmission.
 
