Classic Case of Bad Advice

admin_exported

New member
Aug 10, 2019
2,556
4
0
Visit site
Having Sky HD installed tomorrow; hope it's as good as people say it is. When I queried the quality of the free HDMI lead he's going to use, the engineer's words were that "with an HDMI connection there is no difference between a £2.00 lead and a £200.00 lead".

Think I'll just let him install and then change the lead after he's left and it's all up and running!

Still, at least with that level of knowledge there will always be a job for him at Comet, Currys etc. if he ever fancies a career move!
 
A

Anonymous

Guest
Um, he's right, sort of. HDMI is a digital connection, and incorporates error correction for signal degradation, so either the error correction has reconstructed a perfect signal, or, if the degradation is too much, you'll quickly get no signal at all (with possibly a small range in between where you get serious and obvious corruption).

Over a short distance, even a very cheap HDMI lead should give a perfect result. However, over long distances your cheapo cables will quickly get to the point where you've got no signal at all.

So if you're just connecting your Sky box to the back of the telly, expensive HDMI cables are a complete waste of money. On the other hand, if you've a projector 10m or more from your source, and find you've no picture over HDMI, that's when you need to fork out on some Monster.
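
If anyone wants to see why digital degradation is a cliff rather than a slope, here's a toy sketch in Python. The 3x repetition code is purely illustrative - HDMI doesn't work this way internally, it just shows the threshold behaviour:

[code]
import random

def send_bit(bit, p_err):
    # each transmitted copy of the bit flips with probability p_err
    return bit ^ (random.random() < p_err)

def send_with_repetition(bit, p_err, copies=3):
    # toy 3x repetition code: the receiver takes a majority vote
    votes = sum(send_bit(bit, p_err) for _ in range(copies))
    return 1 if votes > copies // 2 else 0

for p_err in (0.001, 0.01, 0.1, 0.3):
    trials = 100_000
    wrong = sum(send_with_repetition(1, p_err) != 1 for _ in range(trials))
    print(f"raw bit error rate {p_err}: residual error rate {wrong / trials:.5f}")
[/code]

At low raw error rates the residual errors all but vanish (perfect picture); push the raw rate up and the correction collapses very quickly (broken picture), with barely any middle ground.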
 

Clare Newsome

New member
Jun 4, 2007
1,657
0
0
Visit site
[quote user="MalcolmH"]

Over a short distance, even a very cheap HDMI lead should give a perfect result.
[/quote]

'Should' being the operative word. In actuality, there are clearly visible differences, even over a 1m length, if you upgrade to a better-quality cable. The good news is you don't have to spend a fortune - a decent QED or Chord HDMI cable at £45 would be an excellent upgrade.
 
A

Anonymous

Guest
Totally agree - traditionally the advice has always been not to skimp on interconnects (read somewhere it should be like 10% of your spend?!), but for the vast majority of people the HDMI connection will just be from their DVD recorder / set-top box to their TV (90% 1m leads?)
 

Clare Newsome

New member
Jun 4, 2007
1,657
0
0
Visit site
Again, Oldphrt, that's your opinion, not based on testing many, many HDMI cables. Why do you seem so determined to stop people even trying to attain a better AV experience?

Surely trying a 1m HDMI cable on money-back guarantee can't be a bad thing for any user to do?
 

Thaiman

New member
Jul 28, 2007
360
2
0
Visit site
Oldphrt, I am quite sceptical about many cables, like yourself, but I have to say, my old friend, that scart leads and HDMI cables do make a lot of difference!
 

Andrew Everard

New member
May 30, 2007
1,878
2
0
Visit site
[quote user="Clare Newsome"]

Again, Oldphrt, that's your opinion, not based on testing many, many HDMI cables. Why do you seem so determined to stop people even trying to attain a better AV experience?

Surely trying a 1m HDMI cable on money-back guarantee can't be a bad thing for any user to do?

[/quote]

Leave 'im, guv - 'e's not worth it!

 
A

Anonymous

Guest
[quote user="Clare Newsome"]

Again, Oldphrt, that's your opinion, not based on testing many, many HDMI cables. Why do you seem so determined to stop people even trying to attain a better AV experience?

Surely trying a 1m HDMI cable on money-back guarantee can't be a bad thing for any user to do?

[/quote]

Who offers a money-back guarantee? I'll give one a whirl!
I can see the point in upgrading HDMI cables (in fact I have) from the standard fare to something a bit better, i.e. shielded etc. I've got a few hundred pounds' worth of analog cables behind my hi-fi rack that clearly do make a difference. But what I can't see is how an HDMI cable can "give better colour" - this is a technical impossibility in the digital domain; it's pretty much all or nothing. A small amount of data loss can be tolerated by a display before the picture cuts out, but it's clearly identified by white flecks ("snow") appearing on the display panel. I work for a large manufacturer of display panels - admittedly computer panels, and I'm only a lowly repair tech guy - but I have spoken to guys in R&D to confirm this.
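
If it helps, here's a rough Python sketch of what I mean (toy numbers, nothing to do with any real panel firmware):

[code]
import random

# a toy "frame": sixteen identical mid-grey pixels (value 128), sent raw
frame = [128] * 16

def transmit(pixels, p_err=0.1):
    # with probability p_err, corrupt one random bit of a pixel
    out = []
    for p in pixels:
        if random.random() < p_err:
            p ^= 1 << random.randrange(8)
        out.append(p)
    return out

print(transmit(frame))
# typical result: mostly 128s plus the odd wildly wrong value (a white
# fleck), never every pixel nudged slightly towards a duller colour
[/code]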

Sorry no offence intended this is just my opinion!
 
A

Anonymous

Guest
I tend to agree with the 'cheap is best' camp - as far as digital interconnects go anyway.

Whether it's optical for your audio or HDMI for your picture, you're sending a digital data stream. If there's a failure in the data stream, this will lead to fairly noticeable corruption of the picture. I can't personally see how it's possible for things like contrast or colour to be affected, since the picture isn't split down the way it used to be in the analog domain - RGB, component or composite leads can all suffer that kind of loss because the analog signal itself degrades.

A degraded digital signal simply leads to artefacts and a loss of picture. It's the same deal when you get hiccups on your digital sky/cable/freeview.

Would be good if a real electronics guru were to pitch in :)

EDIT:

I'd just like to point out that I'm not trying to say there couldn't possibly be any difference between HDMI cables as far as picture quality is concerned, I'm saying I can't see *how* it would be possible given what I know about how digital data streams are transmitted, error correction, etc. As I said, I'm no electronics engineer or digital data compression specialist, so it would be nice if we could hear from someone who *knows* rather than hearing subjective opinion. (I'm aware my own opinion is subjective!)
 
A

Anonymous

Guest
Is there any error correction at the receiving end of one of these cables? If there is, and bits are being lost, this might well cause degradation of picture quality.

I have no posh telly or HT stuff and it's all meaningless to me though!
 
A

Anonymous

Guest
There is error correction, but as someone has pointed out, that would result in artefacts on screen, or a loss of pixels or complete loss of picture - not in 'dull colours' etc. Digital data streams simply don't work that way.

I'm an eternal skeptic, and while I wouldn't go so far as to say that the contents of What Hifi are biased, I would say that it's in What Hifi's interest for there to be a difference between the various types of cable (and everything else). If there were no difference there would be very little to review, after all.

Until someone sticks a £400 HDMI cable in an oscilloscope and analyses the output, and then compares that to the output of a £2 HDMI cable, and tells me there *is* a difference, I'm going to believe there isn't.

As yet, nobody has provided any scientific empirical evidence to support there being any difference between one HDMI cable and another (and the same goes for optical cables - it's a bit of glass after all).

It's rather the same as the evolution vs. religion debate. Nobody has provided me with any scientific evidence to support the existence of God, so as far as I'm concerned, He's not there.

I think if What Hifi put out an issue with some serious scientific trials of equipment and interconnects, they'd sell out in no time - why doesn't this happen? Because it's not in their interest, or in the interests of the manufacturers who keep What Hifi, Home Cinema Choice, and all the others going through advertising and by simply existing and providing them with stuff to review.

I'm not blaming What Hifi here - we have the same situation with What Car?, Evo, all the various car mags, and all sorts of other industries. Much of what we see these days is marketing hype, and advertising budgets - big business does not cohabit effectively with the needs of the consumer.

I'll accept there's a difference in picture and sound quality when comparing various analog interconnects - there's good scientific data to back this up - however the differences aren't as massive as people would like us to believe. With analog however there's some justification for spending a bit more, since different interconnects can produce slightly 'different' sounding output due to the way the signal degrades (or is preserved depending on your point of view). However this does mean that people could subjectively 'prefer' the sound of a cheaper interconnect compared to a more expensive one.

I think a good rule of thumb is that if there's any processing to be done, or if there's any analog circuitry involved (dvd players, cd players, amplifiers, speakers, etc, etc), then it's good to spend a little extra, and take note of the reviews - also listen and see for yourself. If there's no processing, and no analog then you're talking about a pure data stream. It'll either work, or it won't (and it'll be obvious it's not working).
 
A

Anonymous

Guest
It's a bit of glass is it?

Ever tried to look through a stained glass window in a church? You'll probably find that the glass can do very strange things to the view!

I swapped my Cambridge Audio optical cable for an IXOS KHD608 coaxial cable between my DVD player and amp and found a HUGE difference. Previously I'd been of the view that digital data was too distinct to be able to suffer a "sound quality" issue. The truth is that data loss occurs far more than I'd realised.

If there is error correction in HDMI then perhaps a well-made cable over a short distance will be much of a muchness, but the very cheapest may be too lossy for the error correction to cope with. Having experienced the difference a digital cable can make, it would seem logical to buy as well-made a cable as you can within a reasonable budget - if your kit is sufficiently high quality that there is a signal worth maintaining!!
 
A

Anonymous

Guest
Will,

Unless I've misunderstood your reply you're not comparing like for like.

You swapped an optical cable for a coaxial cable.

These are two different technologies with two different data transport methods. The data is processed differently, and will therefore likely produce a different sound. This is a function of how the amplifier processes the signals from two different transports - not a function of the quality of the cable.

It's the same as the argument for HDMI vs. Component video - these are two different technologies and the resulting picture quality depends very much on the processing of the image on both ends of the cable.

Some TVs are better at processing a component video image than they are an HDMI image. It's all about the processing capability of the set - and the same goes for optical vs. coaxial interconnects.

Unless you have directly compared two different optical cables, I'm afraid my argument still stands.

I'm more than willing to be talked round, but this requires a more scientific approach than anyone has thus far demonstrated.
 
A

Anonymous

Guest
[quote user="louisv6"]

There is error correction, but as someone has pointed out, that would result in artefacts on screen, or a loss of pixels or complete loss of picture - not in 'dull colours' etc. Digital data streams simply don't work that way.

[/quote]

Hi Louisv6,

firstly, I should remind you that HT is completely alien to me and I have no experience of it at all! But I disagree with you about the error correction. If there is error correction, it will guess what should be where the data is missing, based on the bits around it. So, if there is error correction, there wouldn't be missing pixels, just maybe not exactly the right ones, shirley?
 
A

Anonymous

Guest
Hi there,

You're right that if there's very minimal data loss, you wouldn't get 'missing' pixels as such - you'd end up with artefacts. With more obvious data loss, you may get missing chunks of screen - but it all depends on what kind of data it is and how it's being processed.

Also, the error correction used isn't exactly rocket science - we're talking about filling in a missing pixel (or sound bit) with *something* in order for the whole to be processable. We're not talking about making sure it's as close to its surrounding bits as possible.

In my experience it's pretty obvious where there's data loss, and the same is true of optical cables.

However, like I said before, I'm no expert, I'm just going on my own experience of digital data handling and transmission (I've worked in IT for many years now), which is why it would be nice to hear from one (an expert, not a data transmission...).

L
 
A

Anonymous

Guest
[quote user="louisv6"]

Will,

Unless I've misunderstood your reply you're not comparing like for like.

You swapped an optical cable for a coaxial cable.

These are two different technologies with two different data transport methods. The data is processed differently, and will therefore likely produce a different sound. This is a function of how the amplifier processes the signals from two different transports - not a function of the quality of the cable.

It's the same as the argument for HDMI vs. Component video - these are two different technologies and the resulting picture quality depends very much on the processing of the image on both ends of the cable.

Some TVs are better at processing a component video image than they are an HDMI image. It's all about the processing capability of the set - and the same goes for optical vs. coaxial interconnects.

Unless you have directly compared two different optical cables, I'm afraid my argument still stands.

I'm more than willing to be talked round, but this requires a more scientific approach than anyone has thus far demonstrated.

[/quote]

Well, I thought that a digital signal was meant to convey information, and that that information was the same data being sent to the DAC to be decoded - not different data, be it via a coaxial route or an optical route. I agree that for an optical cable to function the data stream must be converted to light, collected and converted back from light, but it is the same data.

HDMI is also a digital connection. But component is not. It's analogue. They are not comparable.

Coaxial and optical are comparable, as the data they are carrying IS the same. If an optical cable fails to convey the same detail and resolution of sound, it clearly can't be doing its job very well. So I'm afraid I can't agree with your post above.
 
A

Anonymous

Guest
Will,

I'm aware that component and HDMI cannot be compared directly. The point I was making is that on some TV sets a component connection will look better - highlighting the fact that the way the image is processed is, in that instance, just as important as (if not more important than) the interconnect being used.

My understanding is that the same is true for an optical data stream and a digital coaxial data stream (and yes, I'm aware they're both digital and carry the same data).

With the optical data stream there's still an element of processing going on to convert the light to a format which the DAC can understand, and thus digital optical and digital coaxial aren't necessarily comparable, although you're right that we're dealing with 1s and 0s, so there shouldn't be a difference.

Nevertheless you haven't compared like for like, and neither of us are qualified enough to say that there's definitely no difference between an optical transmission and a digital coaxial one on your particular piece of equipment.

It is in fact well documented that there are differences (albeit subjective ones) in sound quality between the optical connection and the digital coaxial connection on various pieces of equipment, so I'm in no way surprised about your comments.

What I said still stands - until I see empirical evidence that there's a difference between one HDMI cable and another, or a difference between one optical interconnect and another, I won't be convinced.

I understand you're trying to make a point, and for all I know, you might be right, but without any conclusive evidence to convince me, I'm afraid I remain unconvinced.
 
A

Anonymous

Guest
What I've seen so far in the way of responses seems to be a game of one-upmanship - I'm right and you're not, or I know better than you do.

The fact is, I'm not claiming to be an expert, I'm simply saying that if there's a difference between two optical cables, or two HDMI cables, then prove it.

I want a direct, scientific comparison of one cable vs. another of the same type.

People get very excitable about cables, and that's understandable. They don't want to find out they wasted (in some cases) hundreds of pounds on a placebo. That's no reason to take it out on me. Find me proof that I'm wrong, and I'll gladly accept that I am.

I'm not looking to appear more educated, or holier than thou - I'd like to become better educated about this subject, as I freely admit I don't know much - but what I do know conflicts with what I see in marketing and the shops. If I am wrong, I'd like to know why - in scientific terms, not because some bloke down the pub said so.
 
A

Anonymous

Guest
[quote user="bloney"]Is there any error correction at the receiving end of one of these cables? If there is and buts are being lost, this might well cause degredation of picture quality.[/quote]

If that's how HDMI error correction works I'm distinctly unimpressed. In the data networking world (TCP/IP, Ethernet and all that) error correction would be "I've lost a bit, can you send it again please?" not "I've lost a bit, I wonder what it might be?". Assuming HDMI uses the first method, you'll get a perfect bitstream until you get a highly degraded signal, at which point it all goes downhill very fast. So you should never get a slightly sub-optimal picture signal, just a pure one or a badly broken one, when it'll be perfectly obvious you need a better cable (or at least, that you need to fix something).
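
To spell out the first method, here's a minimal Python sketch of a checksum-and-retransmit scheme (purely illustrative - I'm not claiming this is HDMI's actual mechanism):

[code]
def checksum(data):
    # trivially simple checksum: sum of the bytes, modulo 256
    return sum(data) % 256

def receive(packet):
    # the last byte is the sender's checksum of everything before it
    data, chk = packet[:-1], packet[-1]
    if checksum(data) != chk:
        return None        # "I've lost a bit, can you send it again please?"
    return bytes(data)     # bit-perfect data, no guessing involved

good = bytes([10, 20, 30, checksum([10, 20, 30])])
print(receive(good))                      # b'\n\x14\x1e' - accepted as-is
bad = bytes([10, 21, 30, good[-1]])       # one byte damaged in transit
print(receive(bad))                       # None - retransmission requested
[/code]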

Note that this is completely different to analogue; whoever made reference to better quality from upgraded component cables, I couldn't agree more.

Mind you, I've just discovered in another thread that CD players don't work like that (having no proper error correction), even though they're digital devices and essentially similar to CD-ROM drives which do. Go figure.
 
A

Anonymous

Guest
[quote user="MalcolmH"]

[quote user="bloney"]Is there any error correction at the receiving end of one of these cables? If there is and buts are being lost, this might well cause degredation of picture quality.[/quote]

If that's how HDMI error correction works I'm distinctly unimpressed. In the data networking world (TCP/IP, Ethernet and all that) error correction would be "I've lost a bit, can you send it again please?" not "I've lost a bit, I wonder what it might be?". Assuming HDMI uses the first method, you'll get a perfect bitstream until you get a highly degraded signal, at which point it all goes downhill very fast. So you should never get a slightly sub-optimal picture signal, just a pure one or a badly broken one, when it'll be perfectly obvious you need a better cable (or at least, that you need to fix something).

Note that this is completely different to analogue; whoever made reference to better quality from upgraded component cables, I couldn't agree more.

Mind you, I've just discovered in another thread that CD players don't work like that (having no proper error correction), even though they're digital devices and essentially similar to CD-ROM drives which do. Go figure.

[/quote]

Hi Malcolm,

If I remember correctly, Hamming code can recover a certain number of lost bits at, I presume, some overhead, rather than just ask for the data to be re-sent. There are obviously lots of algorithms out there in the data transmission world for error correction that you'll certainly understand better than me, but I'd be interested to know if/how it's done in audio.
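
By way of illustration, here's Hamming(7,4) in a few lines of Python - four data bits, three parity bits, and any single flipped bit recovered exactly (I'm not claiming this particular code is what audio or HDMI kit actually uses):

[code]
def hamming74_encode(d1, d2, d3, d4):
    # three parity bits protect the four data bits
    p1 = d1 ^ d2 ^ d4            # covers codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4            # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4            # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # re-computing the parities spells out the corrupted position
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3   # 0 means no error detected
    if pos:
        c[pos - 1] ^= 1          # flip the bad bit back
    return c[2], c[4], c[5], c[6]

code = hamming74_encode(1, 0, 1, 1)
code[4] ^= 1                     # corrupt one bit "in transit"
print(hamming74_decode(code))    # (1, 0, 1, 1) - recovered exactly
[/code]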

Bloney
 
A

Anonymous

Guest
[quote user="bloney"]If I remember correctly Hamming code can recover a certain number of lost bits at, I presume, some overhead rather than just ask for the data to be re-sent. There's obviously lots of algorithms out there in data transimssion world for error correction that you'll certainly understand better than me, but I'd be interested to know if/how it's done in audio.[/quote]

I very much doubt I know audio better than you. You raise a very good point: it's certainly possible to encode some redundancy in the signal, so that it's unnecessary to retransmit.

A simple retransmission protocol might go: "data data data checksum". If the checksum doesn't match some simple function of "data data data", then ask for a retransmission. However, a more sophisticated protocol might send "data data data data data data checksum", where the checksum would not only show whether one of the data elements had been corrupted in transmission, but would let you calculate, from the combination of the checksum byte and the data, which byte had been corrupted and what it would have had to be for the checksum to match. For a well-known real-world example, that's what happens in RAID5 drive arrays, where any one of (say) four hard drives can fail and yet the whole array can be rebuilt from what's left.
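
The RAID5 trick is easy to show in miniature (toy Python, one byte standing in for each drive):

[code]
# three "drives" of data plus one XOR parity "drive"
blocks = [0b10110100, 0b01011100, 0b11100001]
parity = blocks[0] ^ blocks[1] ^ blocks[2]

lost = 1                                        # pretend drive 1 fails
survivors = [b for i, b in enumerate(blocks) if i != lost]
rebuilt = survivors[0] ^ survivors[1] ^ parity  # XOR everything that's left

assert rebuilt == blocks[lost]                  # bit-for-bit identical
print(f"rebuilt: {rebuilt:08b}")                # 01011100
[/code]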

That said, whether we're talking about a simple checksum and retransmission, as I originally suggested, or about a certain level of redundancy in the data transmission that avoids the need for retransmission, either scheme would create a bit-for-bit perfect replication of the data. We are NOT talking about "guessing" what the corrupted data might have been, along the lines of "data1 missing-data data3; oh well, let's assume missing-data has a value half-way between data1 and data3" - that would indeed give a degraded picture. I admit I don't know that HDMI uses a scheme like you or I suggested, and not the crappy "guessing" scheme I describe last, but I'm confident the guessing scheme is not what is done, simply because it would be crap, because the better methods are very well known, and because the better methods would be easier to implement than the poor one.
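
And here's the "guessing" scheme I'm deriding, in the same toy terms - it produces something plausible, not something correct:

[code]
data = [10, None, 30]               # the middle sample was lost in transit
guess = (data[0] + data[2]) // 2    # "let's assume half-way between"
print(guess)                        # 20 - plausible, but if the real value
                                    # was 97 the receiver would never know
[/code]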

In case anyone misunderstands me, I am not saying a higher-quality HDMI cable has no value. Any signal attenuates over distance, and it attenuates faster with a worse-quality cable. I'm just saying that over a given distance your cable is either working well or obviously broken. A high-quality cable will work over a longer distance before it appears broken than a poor-quality cable will.

I've every respect for the What Hi-Fi editorial team (I wouldn't be here otherwise), but I'm really struggling to reconcile what I know about the principles of digital signal transmission with claims that high-quality digital cables are as important as high-quality analogue cables, regardless of distance. Until I hear a supporting rationale that at least addresses the (pretty bog-standard orthodox) theory of signalling I've set out above, it strikes me as simply a more convincing hypothesis that subjective claims that "the more expensive cable looked better to me" are merely examples of the placebo effect. In saying this I don't mean to be rude to the WHFSV team, or to others who maintain there is a difference in performance, but please understand that (without further information) their claim amounts to an accusation that the HDMI specification committee must have been idiots, unaware of well-known principles of digital signalling. You just don't get on a spec committee if you're an idiot (well, maybe a DRM committee :p ).
 
A

Anonymous

Guest
Malcolm,

A much more eloquent post than I could have mustered, and it makes some very valid points.

We just need to find someone involved in HDMI...

L
 
