HDMI leads...

Anonymous

Guest
Hmm. Remember that with digital video (Blu Ray and DVD anyway) the bits read off the disc aren't the same as the ones that are output from the HDMI socket. The data on the disc is compressed and has to be decoded. After decoding there's pretty much always some further processing as well (noise reduction, colourspace conversion, deinterlacing, scaling...). There's plenty of opportunity there for things to go wrong.
 
Anonymous

Guest
johna11:Taking the argument to its logical conclusion, I would have thought that there should actually be no difference in Blu Ray players either. They are reading a digital signal and delivering it to a digital output. It is not until analog conversion takes place that the opportunity would arrive for performance differences.

I guess that may possibly be true with 1080/24p where I think the idea is that the image is transferred to screen with little or no interpretation, but with all other modes there must be plenty of scope for differing circuitry and firmware to produce discernably different results.
 
Anonymous

Guest
smithdom:
Fandangio:The reason this isn't a problem is that the receiving circuitry can easily discriminate as to what should be a 1 or 0. The problem only occurs when the source signal is so badly degraded that this cannot be done accurately. Typically over short cable lengths this is never a problem but issues may occur over longer runs.

In your opinion, Fandangio,
1. Is there instrumentation that can accurately measure errors introduced by an HDMI cable?
2. If the signal into an HDMI cable could be compared to the signal emerging at the other end, and was found to be identical, would you expect to be able to discern any difference in performance between two HDMI cables that faithfully transmit the signal?
3. If the signal emerging was not identical, in other words if errors were being introduced by the cable, how would you expect this to manifest itself in the video and audio performance?

Hi Smithdom,

This is exactly what I would suggest: put a known test signal through a number of cables with a standard receiver circuit and see what comes out.

In this manner the number of errors (if any) can be calculated.

As for the effect of these errors, this is where my knowledge falls short; as errors are free to appear in any part of the signal, I would guess that there may be a range of effects.

What is possible to define, though, is the performance of the cable itself, and this should be done to potentially stop people spending a huge amount on a branded cable when a cheap and cheerful one may be just as effective. From experience I honestly feel that a performance difference would be negligible unless longer leads are being used.
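Fandangio's proposed bench test (a known pattern in, compare what emerges) can be sketched in a few lines. This is purely illustrative: the channel model and flip probabilities below are assumptions for the sake of the sketch, not measurements of any real cable.

```python
import random

def transmit(bits, flip_probability):
    """Model a cable as a channel that flips each bit with some probability."""
    return [b ^ 1 if random.random() < flip_probability else b for b in bits]

def bit_error_rate(sent, received):
    """Compare the known test pattern with what emerged at the far end."""
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

random.seed(0)
pattern = [random.randint(0, 1) for _ in range(100_000)]  # known test signal

# Two hypothetical cables: a good short run, and a long lossy one.
short_run = transmit(pattern, flip_probability=0.0)
long_run = transmit(pattern, flip_probability=1e-3)

print(bit_error_rate(pattern, short_run))  # 0.0, i.e. bit-perfect
print(bit_error_rate(pattern, long_run))   # roughly 1e-3
```

The point of the exercise is exactly the one made above: the cable's contribution reduces to a single measurable number, the bit error rate, with no room for subjective qualities.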
 

Tonya

Well-known member
Well, as an audio engineer, I feel I should throw my hat in the ring again here, referring to the original poster.
First off, I love this place: it provides a heady mixture of entertainment, knowledge and moments of hilarity while scoring relatively low on the troll factor.
The comments at the start of this post were both intelligent and well put, let down only by the "backhander" remark (sic).

Now to business.
When it comes to analogue signals such as line outs, speaker signals, antennae, etc, cable quality is paramount. Poorly made cables are usually poorly shielded, which simply invites interference and induction problems.
Anyone with good hearing can clearly tell the difference between using bell wire and Monster Cable for your speaker runs.
However, when using digital connections, as long as there is a good solid electrical connection and the cable run does not exceed the specification limit, I've yet to notice any difference in "digital quality" so to speak.
Again, as long as the digital cable is well screened and not damaged, then everything should be fine.

Digital processing is a complex subject and is the topic of many a debate; only last week I was involved in a heated discussion around Blu-ray and CD players.
Yes they are all essentially computers reading pits off a disc but there are vast differences in quality there too.
Superior electronics and mechanics in players will extract the original data more accurately from the disc and therefore less error correction will be applied, resulting in a truer audio/video reproduction.
In my home entertainment system, I use solidly built £20 HDMI cables and when I had the opportunity to borrow a really expensive set, saw no difference whatsoever.
And I'm uber-critical.

Does a turntable supported by legs that are floating in a bath of mercury really sound better than a well engineered transcription deck on a solid base costing a fraction of the price?
If the difference was even discernible, would it be worth the several-thousand-pound premium?
Did you really hear the difference between M. Jackson's gold edition CDs and the standard silver ones?

To sum up, in my experience, cables costing hundreds of pounds per metre offer little or no advantage over your average well built cable, especially when you consider what's connected to both ends of it.
The components that come directly before and after the cable will affect the outcome to a much greater degree.

But as people have commented previously, you pays your money and you takes your choice.

For the record, What HiFi is essential, no, required reading in my profession and we've always found it offers a well researched and balanced source of news and information. Keep up the good work guys and don't feed the trolls! ;-)

Flame on . . . .
 

aliEnRIK

New member
Tonya:
Well as an audio engineer,


However, when using digital connections, as long as there is a good solid electrical connection and the cable run does not exceed the specification limit, I've yet to notice any difference in "digital quality" so to speak.
Again, as long as the digital cable is well screened and not damaged, then everything should be fine. ............


For the record, What HiFi is essential, no, required reading in my profession and we've always found it offers a well researched and balanced source of news and information. Keep up the good work guys and don't feed the trolls! ;-)



So if I'm understanding this correctly: you trust WHF reviews, but don't trust their HDMI reviews, as the cables SHOULD all be the same anyway?
 
Anonymous

Guest
I read and trust the reviewers on WHF, but like any review they are subject to the reviewer's own prejudices and preconceived ideas to some degree. Really though, as I've tried to state, this is a measurable quality and opinion should not come into this argument at all.
 

aliEnRIK

New member
Fandangio:
Really though, as I've tried to state, this is a measurable quality and opinion should not come into this argument at all.

True. But what EXACTLY should they measure, and in what way? The 'eye' test is seriously flawed to me as it doesn't really prove anything.

And there's another test to prove that what goes in comes out, but jitter is never taken into consideration (meaning that what comes out the other side may be the same data but not with EXACTLY the right timing).

EMI and RFI never seem to be taken into account, and neither do voltages or the equipment used (meaning what produces the signal in the first place, the voltages used to transmit said signal, and the equipment at the other end receiving said signal).

If, for example, a 2 metre cable is tested and it passes with 1080p, what has it proven REALLY? For starters (and I've yet to find this out for sure), I've been informed that upscaled DVDs actually use MORE bandwidth than a true 1080p signal. Meaning errors could well be produced simply by upscaling.

What if EMI was present? Would the cable perform as well then? What about RFI? If the DVD player is soaked in RFI then it could possibly affect the signal within the HDMI cable itself (making for more errors). If the DVD player doesn't produce the correct voltages then the signal is weaker than standard to begin with, which in turn, if the TV is poor at sorting out the errors that have been created (or struggles to read the actual 1s and 0s due to incorrect voltages), means we do actually come out with an answer: NOT all 2 metre or under HDMI cables are going to perform the same!

And of course there's the 'cliff limit'. At some point we all know EVERY digital cable will hit the cliff and simply fail altogether. But what about BEFORE the cliff limit? If it fails, it fails because the voltages have dropped to such a level that they can no longer be read. So THAT must in turn mean that as you go down the line more and more errors are produced until it wipes the lot out at the cliff edge.

There should be a LOT more to these tests than simply firing information down an HDMI cable in a test lab under perfect conditions.
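For what it's worth, the "errors ramp up before the cliff" intuition is what a simple noise model predicts. A toy sketch, assuming binary signalling in additive Gaussian noise (a big simplification of a real HDMI link, where the amplitudes and noise level here are made-up numbers):

```python
import math

def bit_error_rate(signal_amplitude, noise_sigma=0.1):
    """Probability a receiver misreads a bit, for binary signalling in
    Gaussian noise: Q(A/sigma) = 0.5 * erfc(A / (sigma * sqrt(2)))."""
    return 0.5 * math.erfc(signal_amplitude / (noise_sigma * math.sqrt(2)))

# Attenuate the signal progressively, as a longer or worse cable would.
for amplitude in [1.0, 0.6, 0.4, 0.3, 0.2]:
    print(f"amplitude {amplitude:.1f}: BER ~ {bit_error_rate(amplitude):.2e}")
```

The output shows error rates that are vanishingly small at full amplitude, then climbing steeply as the signal weakens, which is why a link looks perfect for a long while and then seems to fail "all at once" at the cliff edge.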
 
Anonymous

Guest
aliEnRIK:There should be a LOT more to these tests than simply firing information down an HDMI cable in a test lab under perfect conditions

Maybe, maybe not. Clare refers to the test method used as ABX testing, but her description of it doesn't seem to quite tally with the description on Wikipedia. I didn't know anything about ABX testing before this debate, but it does seem to be a rigorous way to determine if individuals can reliably distinguish between products. Since much of this debate centres on whether it is worth paying a lot for an upmarket HDMI cable, an ABX test seems an appropriate way to settle the argument. However, the way the results are presented, as described by Wikipedia, is quite different to the florid prose we read in the WHF reviews. Maybe there is room for florid prose as well as the statistics emerging from an ABX test?
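The statistics emerging from an ABX test are simple: each trial is a 50/50 guess if the listener genuinely can't tell the cables apart, so you ask how likely the observed score is under pure guessing. A minimal sketch (the trial counts are hypothetical examples):

```python
from math import comb

def abx_p_value(correct, trials):
    """Chance of scoring at least this many correct in an ABX test by
    guessing alone (one-sided binomial test, p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(abx_p_value(12, 16))  # ~0.038: unlikely to be pure guessing
print(abx_p_value(9, 16))   # ~0.40: entirely consistent with guessing
```

That one number is what would replace the florid prose: either listeners beat chance by a convincing margin or they don't.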
 

aliEnRIK

New member
smithdom:
aliEnRIK:There should be a LOT more to these tests than simply firing information down an HDMI cable in a test lab under perfect conditions

Maybe, maybe not. Clare refers to the test method used as ABX testing, but her description of it doesn't seem to quite tally with the description on Wikipedia. I didn't know anything about ABX testing before this debate, but it does seem to be a rigorous way to determine if individuals can reliably distinguish between products. Since much of this debate centres on whether it is worth paying a lot for an upmarket HDMI cable, an ABX test seems an appropriate way to settle the argument. However, the way the results are presented, as described by Wikipedia, is quite different to the florid prose we read in the WHF reviews. Maybe there is room for florid prose as well as the statistics emerging from an ABX test?

As far as I'm concerned, ABX blind tests for hi-fi and AV are 'generally' useless.

Lookie HERE
 
Anonymous

Guest
In trying to measure the performance of the cable you should simply fire a test signal down the cable and see what comes out, as it is only the cable that should be measured.

On the topic of interference, digital signals are inherently more able to cope with problems. Unless you're using the same supply when mowing the lawn (electrical motors are really noisy), there is not much equipment that is going to cause interference at levels that will impact the transmitted signal. There seems to be a lot of smoke generated when talking about such issues.

Another good article found here...

http://pcworld.about.com/magazine/2309p111id121777.htm
 

aliEnRIK

New member
Fandangio:
In trying to measure the performance of the cable you should simply fire a test signal down the cable and see what comes out, as it is only the cable that should be measured.

On the topic of interference, digital signals are inherently more able to cope with problems. Unless you're using the same supply when mowing the lawn (electrical motors are really noisy), there is not much equipment that is going to cause interference at levels that will impact the transmitted signal. There seems to be a lot of smoke generated when talking about such issues.

Another good article found here...

http://pcworld.about.com/magazine/2309p111id121777.htm

Let's say 100 HDMI cables are tested. The bandwidth is increased until each fails and the results are recorded. That would determine the measurable difference between them. Would you agree?

But no one does.

All they do is pass a 1080p signal down (or whatever) and it either 'passes' or 'fails'. That doesn't mean the cables are the same, just that in THAT experiment they didn't 'fail'.
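The sweep-to-failure test being proposed here is easy to picture in code. A sketch with invented numbers (the failure points and the ~4.5 Gbps figure for 1080p are illustrative assumptions, not test data):

```python
def sweep_to_failure(fails_above_gbps, step=0.25):
    """Raise the test rate until the cable errors; return the highest
    rate that still passed, i.e. the cable's headroom in Gbps."""
    rate = 0.0
    while rate + step <= fails_above_gbps:
        rate += step
    return rate

# Made-up failure points for three hypothetical cables.
cables = {"cable A": 5.2, "cable B": 9.8, "cable C": 4.1}
RATE_1080P = 4.5  # approximate 1080p TMDS rate, in Gbps

for name, fail_point in cables.items():
    headroom = sweep_to_failure(fail_point)
    verdict = "passes" if headroom >= RATE_1080P else "fails"
    print(f"{name}: {verdict} 1080p, errors above {headroom} Gbps")
```

Cables A and B would both "pass 1080p" in a simple pass/fail review, yet B has roughly twice the headroom, which is exactly the distinction a bare pass/fail test throws away.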

Let's say 90% of the cables pass the 1080p test. Once you connect those cables into a home AV system it might easily be found that 10% of those fail because the circumstances have changed.

Just today I've been helping a woman with her PS3. Her HDMI lead (1.3 compliant and only 1 metre in length) showed SNOW on the menu screen when set at 1080p. Changed to a 'better' HDMI lead and the snow was gone.

Proof's in the pudding...
 
Anonymous

Guest
aliEnRIK:Just today I've been helping a woman with her PS3. Her HDMI lead (1.3 compliant and only 1 metre in length) showed SNOW on the menu screen when set at 1080p. Changed to a 'better' HDMI lead and the snow was gone.

That is exactly the type of symptom that makes sense to me when a cable is not up to a specific job - unsubtle and obvious. It is not the type of distinction that we typically read about in cable reviews.
 
Anonymous

Guest
aliEnRIK:Fandangio:
In trying to measure the performance of the cable you should simply fire a test signal down the cable and see what comes out, as it is only the cable that should be measured.

On the topic of interference, digital signals are inherently more able to cope with problems. Unless you're using the same supply when mowing the lawn (electrical motors are really noisy), there is not much equipment that is going to cause interference at levels that will impact the transmitted signal. There seems to be a lot of smoke generated when talking about such issues.

Another good article found here...

http://pcworld.about.com/magazine/2309p111id121777.htm

Let's say 100 HDMI cables are tested. The bandwidth is increased until each fails and the results are recorded. That would determine the measurable difference between them. Would you agree?

But no one does.

All they do is pass a 1080p signal down (or whatever) and it either 'passes' or 'fails'. That doesn't mean the cables are the same, just that in THAT experiment they didn't 'fail'.

Let's say 90% of the cables pass the 1080p test. Once you connect those cables into a home AV system it might easily be found that 10% of those fail because the circumstances have changed.

Just today I've been helping a woman with her PS3. Her HDMI lead (1.3 compliant and only 1 metre in length) showed SNOW on the menu screen when set at 1080p. Changed to a 'better' HDMI lead and the snow was gone.

Proof's in the pudding...

I can't comment on your particular case, and you seem to be very polarised in your opinion about cable quality (not saying that's a bad thing, and if you're confident that you have HDMI cable issues I am not saying otherwise).

I have never had such issues, and am unaware of friends etc. having suffered the issues you have encountered (not to say that they don't exist though).

Snow, though, is caused by weak signal strength in analogue systems and would not be apparent in a digital system. Bit errors would not result in snow. I would really be interested in seeing the cable you have had this problem with for testing. The lack of quantifiable, or purely anecdotal, evidence suggests that other factors could be involved.
 

idc

Well-known member
From what I have read so far it would appear that there are good reasons why one HDMI cable can outperform another, but the differences are smaller, and the law of diminishing returns kicks in sooner and with greater effect than with other types of cable.
 
Anonymous

Guest
OK, I am going to set my stall out here. I've been reading some of the HDMI cable reviews and cannot accept some of the attributes being ascribed to cable qualities. From a related article...

Clear, free-flowing images
And it's soon apparent that this cable is special. Images boast amazing clarity and there's hardly any trace of noise. The cable has a firm grip on motion and allows for a fantastic smooth and free-flowing image during the opening chase of Casino Royale.

The Flat also produces a punchy picture that's lively, vivid and a real feast for your eyes.

Sonically, the Flat is anything but. It's pleasingly dynamic, entertaining and one of the most musical HDMI cables that we've come across.

Explosions are taut and well-defined, gunfire and dialogue sounds natural and in general, the cable digs up plenty of detail from movie soundtracks. Overall, then, the Flat is a superb cable and a real contender for one of this year's Awards.

A digital cable can only convey what signal is put into it. If it is working 100%, all you get out is what you put in. The receiver will reconstruct the digital signal and none of the cable's properties are retained; I'll say that again, NONE of the cable's qualities can be retained. So whereas an analogue cable can, and definitely will, add warmth and other properties, a digital signal cannot take on these properties.
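That claim is easy to illustrate: if every bit arrives intact, the received data is byte-for-byte identical to the source, so there is nothing for a cable "character" to live in. A purely illustrative sketch (the frame data is a stand-in, not real video):

```python
import hashlib

frame = bytes(range(256)) * 64  # stand-in for one chunk of video data

# A bit-perfect link: what comes out is exactly what went in,
# regardless of which cable carried it.
received = bytes(frame)
assert hashlib.sha256(received).digest() == hashlib.sha256(frame).digest()

# A single flipped bit, by contrast, is a measurable difference,
# not a change in "warmth" or "punch".
corrupted = bytearray(frame)
corrupted[0] ^= 0x01
assert hashlib.sha256(bytes(corrupted)).digest() != hashlib.sha256(frame).digest()
```

Either the hashes match and the cables are literally indistinguishable at the data level, or they don't and you have a countable error, which is Fandangio's point.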
 

aliEnRIK

New member
Fandangio:

I can't comment on your particular case, and you seem to be very polarised in your opinion about cable quality (not saying that's a bad thing, and if you're confident that you have HDMI cable issues I am not saying otherwise).

I have never had such issues, and am unaware of friends etc. having suffered the issues you have encountered (not to say that they don't exist though).

Snow, though, is caused by weak signal strength in analogue systems and would not be apparent in a digital system. Bit errors would not result in snow. I would really be interested in seeing the cable you have had this problem with for testing. The lack of quantifiable, or purely anecdotal, evidence suggests that other factors could be involved.

First up ~ you're confusing me and my setup with someone else's cables and setup.

Secondly ~ they were 'sparklies' (the effect when it's just about to hit the 'cliff edge' before it fails entirely). Bit errors WILL and DO produce 'sparklies'.

Third ~ the ONLY factor changed was the cable.
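The 'sparklies' claim fits what bit errors look like on screen: an uncorrected flip in a pixel value hits one isolated pixel, and a high-order bit flip sends it bright or dark, quite unlike analogue snow spread across the whole image. A toy simulation (the frame and error rate are invented for illustration):

```python
import random

def add_bit_errors(pixels, error_rate):
    """Flip one random bit in a fraction of 8-bit pixel values. A flip in a
    high-order bit turns a mid-grey pixel bright or dark, an isolated
    'sparkle', rather than adding analogue-style snow everywhere."""
    out = list(pixels)
    for i in range(len(out)):
        if random.random() < error_rate:
            out[i] ^= 1 << random.randint(0, 7)
    return out

random.seed(1)
frame = [128] * 10_000                 # a flat mid-grey frame
noisy = add_bit_errors(frame, 1e-3)

sparkles = [(i, v) for i, v in enumerate(noisy) if v != 128]
print(len(sparkles))   # a handful of isolated wrong pixels
print(sparkles[:3])    # each one a single pixel jumped to a wrong value
```

Almost every pixel survives untouched; the damage is a scattering of individual wrong pixels, which is the 'sparklies' symptom rather than uniform snow.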
 

idc

Well-known member
Thinking out loud again. It seems to me that where one group reports that there cannot be a cable effect, but another group reports there being a cable effect, something is causing the signal to pass better through one cable than another. Could it be the case that those who do not think it matters (including manufacturers) just bang out cheap rubbish that causes the cable to degrade the signal?
 

Tonya

Well-known member
Hi aliEnRIK!
Perhaps you are confusing the situation here. Yes, I do respect the views and reviews of the WHF team, but that doesn't mean I blindly believe in everything they (or any other review lab) say. Any review in any magazine should be treated as a reference and a guideline only; at the end of the day it's the individual who has to live with the purchase, so read all about it but always audition the thing first if possible. For all the negative publicity the Sony W series of LCD monitors caught both here and in other places, I went out and acquired a KDL-52W4500 to test, and it performed so well that I bought one for myself. And when it comes to the actual musicality of products, as in emotional or perceived, I personally think we lost the plot with the move from analogue.
It's amusing when people talk about fine, almost imperceptible nuances in digital music reproduction when what we are listening to is merely a sample (literally) of the original waveform.
All forms of compression introduce huge audio/visual imperfections; take a close look at DVDs and I defy you not to see almost Lego-type artefacts in most of the encoding. Don't get me started on MP3 or MP4!
The cold fact is that such formats can sound great on portable devices, at least when you take into consideration the convenience factor of smaller file sizes.

Again, to subquote myself, "as long as the digital cable is well screened and not damaged, then everything should be fine"

I do make it clear in my posts that I am merely expressing my own personal opinion based on my experiences.
If you can honestly see or hear the difference between a solidly made connection, be it HDMI or 220 volt power cord, and one of these supercables, then by all means spend your money. With the greatest respect to these companies, I merely think far better improvements can be achieved by spending the same amount in other areas of your system.

Don't confuse obviously faulty cables that will generate problems with a good, well made electrically sound cable.
But hearing astounding differences by changing the mains plug?
Sounds like Emperor's New Clothes Syndrome, methinks ;-)
But, whatever makes you happy . . . .
 

idc

Well-known member
Tonya: But hearing astounding differences by changing the mains plug?
Sounds like Emperor's New Clothes Syndrome, methinks ;-)

The language used to try and describe the differences in all tweaks/upgrades, be it cable, stand or otherwise, does tend to be exaggerated. The word astounding gets used a lot. In the end I do not think that any cable is capable of making a truly astounding difference. To get that you would need to upgrade from a 20 year old boombox to a top of the range Cyrus setup, for example. It is such usage of language that helps to polarise these debates. In all honesty, cleaning a dirty TV screen will make more of a difference to the picture quality than any HDMI upgrade. That should put a bit of perspective on what is meant by the difference HDMI and other cables can make.
 
Anonymous

Guest
Despite not believing a single word about digital cables being different from each other, I really would appreciate a double blind test carried out by WHF. Quite what you'd do with your reviews of digital cables if the result was 'they're all the same' might be an issue, of course :)
 

idc

Well-known member
Hi Tetsugaku. For What HiFi's numerous digital cable reviews check here, for HDMI and video check here, and for a comprehensive discussion on the merits, benefits and flaws of blind testing check here. Unless you totally mistrust the views of the review team, in which case what is the experience and evidence that would back that mistrust up? I think you should be a bit more open-minded and ask why it is that so many people do report differences. I would agree with your viewpoint if it was only a tiny minority who say there is a difference, and they could be dismissed as a mad fringe coming out with ridiculous comments, but it is not.
 
Anonymous

Guest
idc:Hi Tetsugaku. For What HiFi's numerous digital cable reviews check here, for HDMI and video check here, and for a comprehensive discussion on the merits, benefits and flaws of blind testing check here. Unless you totally mistrust the views of the review team, in which case what is the experience and evidence that would back that mistrust up? I think you should be a bit more open-minded and ask why it is that so many people do report differences. I would agree with your viewpoint if it was only a tiny minority who say there is a difference, and they could be dismissed as a mad fringe coming out with ridiculous comments, but it is not.

Well, I say what I say because I'm blessed with some understanding of both the technology and the psychology. I use digital cables for my networking and for my day to day job all the time; not once have I ever seen any evidence that would back up the opinion that an expensive HDMI cable would be better than a competently made cable within spec.

I have, however, seen a great deal of studies on medical placebos, self delusion and other psychological issues. To put it bluntly, we believe what we want to believe.

For years and years analogue cables were quite rightly judged on their signal transmission; an analogue signal, however, is not a digital signal, and I cannot help but agree with the original poster that when any review organisation comes out with statements that are scientifically impossible, they are deluding themselves, however honestly they believe their opinions and conclusions.

I trust the WHF review team purely on their reputation. I'm not a hi-fi buff in any sense; I was just brought up to have good taste in musical equipment (Dad's KEFs are older than me at 31). When a trusted review site starts to talk like Mystic Meg they lose credibility, and I can't help feeling that a reticence to conduct an independently adjudicated double blind test is to do with not losing face.

A double blind test would really help settle things. Or can anyone think of any reason why there shouldn't be one?

:)
 
Anonymous

Guest
This debate still going on?

I must say I ummed and ahhed about this, and for my own peace of mind went and bought an expensive cable to try it for myself: 15m, so it wasn't cheap and a bit of a gamble.

What I found was... there's no point telling anyone, because either I can't prove it or it's all in my mind, depending on my conclusion.
 

idc

Well-known member
OK, so here is a theory as to why one HDMI cable can be different from another and so produce a different result. The source of my theory is an advert for a QED HDMI cable in last month's What HiFi (quote your sources): 'In an HDMI cable the signal changes direction around 4.5 billion times a second, so it is fully accepted that errors will occur..... HDMI products use complicated electronic error correction.... simply to eliminate the risk of such errors....'. So QED state they provide 'significantly higher levels of headroom in our HDMI cables - extra tolerance......... to minimise any risk of signal error.'

From previous posts there does appear to be an acceptance that errors can occur, but over long runs of cable. QED are stating that errors occur at all times, and describing the action they take to reduce such errors. Could it be the case that errors do occur over short runs, and do QED's claims make scientific sense?

Del Smith, please share your thoughts: cable, your mind, or both acting together!
 
