HDMI leads?

admin_exported

New member
Aug 10, 2019
This is a very contentious issue, and one that could be put to bed once and for all (I think ;-).
I'm hoping that someone with a better understanding of the subject will correct or confirm my spoutings, but given the correct equipment it would be easy to establish whether an HDMI lead makes any difference to picture quality. On a PC network connection it's very easy to see packets sent and packets lost, and I'm assuming that when packets are lost, error correction kicks in, which could potentially affect picture quality. A further assumption is that the same applies to HDMI? If that IS correct, and different cables are more "lossy" (new word) than others, then surely this could end the argument? Is there any way of testing this?

Hope this makes sense

PS: I'm fence-sitting here and not looking for a fight; the above could be nonsense, so please be gentle...
 

Anonymous

Guest
Personally I don't buy it. I am not saying there ISN'T any difference, but there is no data loss in a working lead. Data loss (or rather, the losing of bits of information) in an HDMI lead would produce very visible problems on the screen, NOT colour, hue, saturation or vague noise differences. So it gets put down to error checking working harder in the receiver of the signal, the TV or AV amp. Well, that sounds crazy to me. Surely a dedicated modern chip will either be able to cope with the error checking or not. I can't believe, in an age where a phone is more powerful than a 10-year-old PC, that small differences in how hard an error-checking circuit needs to work would make ANY difference.

Again, I am not insisting there are no differences, but with all three of the leads I have tried, ranging from £10 to £50, I have found zero obvious difference. If you can see one, then go ahead and buy the expensive lead. Personally I stick to well-constructed, HDMI-compliant cables now, and am very happy.
 

Anonymous

Guest
I can see that the error correction might make a difference, and I'd be prepared to spend more on a cable if 100% of the data reached its destination as opposed to 95% on a rubbish cable.
 

Anonymous

Guest
So you would pay, say, a 500% price increase for a 5% improvement in your HDMI cables, and still not really get any improvement, or very little? I have two Cambridge Azur HDMI cables at £69.99 and two £9.99 HDMI cables from Argos, and I can't see any improvement in the dearer cable. More money wasted on so-called upgrades. A fed-up so-called audiophile, that's me...
 

Anonymous

Guest
A response from the WHF team may clear this up, as I'm in the same boat wondering what HDMI cable to buy.
 

The_Lhc

Well-known member
Oct 16, 2008
garethwd: I can see that the error correction might make a difference, and I'd be prepared to spend more on a cable if 100% of the data reached its destination as opposed to 95% on a rubbish cable.

Taking the comparison with networking: you only lose packets on networks if there is a hardware issue or sometimes, due to network saturation, when many devices are trying to talk across the same physical infrastructure. With HDMI you don't have this; you have one device talking directly to a second device via a dedicated link. I would be astonished if you got lost or dropped "packets" (if that's even how HDMI transmits information, I don't know) in that kind of setup. There's something VERY wrong if you lose information like that.
 

professorhat

Well-known member
Dec 28, 2007
The issue with comparing data loss over a network between two PCs and data loss on HDMI is that they work very differently. As you say, on a network you can look at packet loss statistics and, in the event of a packet being lost or corrupted, the TCP/IP protocol simply requests that the packet be resent. TCP/IP is an extremely robust protocol, developed for the US military to look after communications during World War 3, so it has a lot of error correction built in. This is fine for looking at web pages and the like, as a few milliseconds' latency while packets are resent is not going to be noticed. However, with high-definition video and sound, even a few milliseconds will be very noticeable, so error correction is going to have to work in a very different way.
I've absolutely no idea how it works over HDMI, but one thing is for certain: it's not using packets or the TCP/IP protocol, as that simply wouldn't work. So I'm not sure how easy it would be to measure. I'm not saying it's impossible, just that because it's easy to do on a computer doesn't mean it's easy to do with an HDMI cable.
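
To put rough numbers on that latency point, here's a back-of-envelope sketch (assuming 1080p at 60Hz and 24-bit colour; the figures are illustrative, and blanking/encoding overhead is ignored):

```python
# Back-of-envelope: how much video data a few milliseconds represents.
# Assumes 1080p60 at 24 bits per pixel; ignores blanking intervals and
# TMDS encoding overhead, so the real link rate is higher still.

width, height = 1920, 1080
bits_per_pixel = 24
frames_per_second = 60

data_rate = width * height * frames_per_second * bits_per_pixel  # bits/s
print(f"Raw video rate: {data_rate / 1e9:.2f} Gbit/s")           # ~2.99

# Data that would pile up during a 5 ms TCP-style resend round trip:
resend_delay_s = 0.005
backlog_bits = data_rate * resend_delay_s
print(f"Backlog after 5 ms: {backlog_bits / 1e6:.0f} Mbit")      # ~15

# That is nearly a third of a frame that must be buffered or dropped,
# which is why a real-time video link can't pause to re-request data.
```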
 

Anonymous

Guest
Go here http://www.hdmi.org/ and all will be revealed.

A cable is either HDMI compliant or it's not. You have to be licensed to make a claim of a compliant product, and the website explains the two current HDMI standards and how to identify a cable that meets your needs.

In a nutshell if it's properly made it'll do the job. Everything else is marketing hype.

Note: I'm not saying you might not be able to perceive differences in picture quality, but they will be very, very, very subtle ones.
 

Anonymous

Guest
If the HDMI cable manufacturers cannot come up with a single scientific reason as to why we should buy their cables, and post it for us to see, what exactly are all these arguments based on? Marketing.

They have been asked, asked and asked again, as have all the magazines who choose to conduct reviews on these cables, to produce scientific evidence. They don't. So Chord, QED, VDH: come on, give us your reasons for charging the buying public these ludicrous sums.

My mind was made up long ago. I think the lack of support behind the price tag should make people think more than they maybe do.
 

professorhat

Well-known member
Dec 28, 2007
Well, as I've said a few times, I trust my eyes. I could see a better picture with my Chord HDMI cable when I bought it, so I kept it. If that means I'm deluding myself, so be it - the fact is, I still see a better picture.
Equally, if I couldn't see a better picture but someone was able to scientifically prove to me it was better, I still wouldn't keep the cable - what would be the point if I can't see the difference?
It all seems fairly simple to me, but then I'm a simple man I guess
 

Anonymous

Guest
If you go to the list of approved connector makers on the site I mentioned, you'll not find any of the well-known high street names present. What you will find is the names of the folks who actually make the components that are then made into the various "name" company products. Now, you can call me old-fashioned if you like, but if you buy your connectors from one Chinese firm and your cable from another, and then use a third to make up your product... I'll let you work the rest out.
 

Anonymous

Guest
Given that many people will not read up on the subtleties of this subject before going out and spending their money, I can see why the manufacturers are keeping quiet.

Given also that we all agree that analogue cables can improve with price, and that I think everyone here has at least a lingering doubt in their mind about digital cables no matter what their electrical prowess, it's time we got a clean, universal answer.
 

Anonymous

Guest
Octopo:

Given that many people will not read up on the subtleties of this subject before going out and spending their money, I can see why the manufacturers are keeping quiet.

Given also that we all agree that analogue cables can improve with price, and that I think everyone here has at least a lingering doubt in their mind about digital cables no matter what their electrical prowess, it's time we got a clean, universal answer.

Well, not everyone, Octopo; see my posts from yesterday.

Regarding HDMI, and I'm no expert, but my understanding is that digital data is either a 1 or a 0. If it's not a 1 it's a 0, and if it's not a 0 it's a 1. The signal can never change, thus it's not possible to improve!

I will stick by my motto until proven otherwise: "a cable is a cable".

Please, QED and the like, prove me wrong!
 

Anonymous

Guest
soulstyle:

Regarding HDMI, and I'm no expert, but my understanding is that digital data is either a 1 or a 0. If it's not a 1 it's a 0, and if it's not a 0 it's a 1. The signal can never change, thus it's not possible to improve!

I will stick by my motto until proven otherwise: "a cable is a cable".

Please, QED and the like, prove me wrong!

This is true, but what if the 0/1 doesn't get to its destination? There has to be some form of error correction, which could have an impact on picture quality.
 

jase fox

Well-known member
Apr 24, 2008
professorhat:Well, as I've said a few times, I trust my eyes. I could see a better picture with my Chord HDMI cable when I bought it, so I kept it. If that means I'm deluding myself, so be it - the fact is, I still see a better picture.
Equally, if I couldn't see a better picture but someone was able to scientifically prove to me it was better, I still wouldn't keep the cable - what would be the point if I can't see the difference?
It all seems fairly simple to me, but then I'm a simple man I guess


I'm the same as you on this one, prof hat, as I could see a better picture with my QED Silver ref cable, so it's really down to the individual as to what they can or can't see. My advice would be to try and borrow some cables from your local hi-fi store and try them for yourself, and if you can't see a difference then you could save cash! As an example: my mate was using a Chord HDMI cable for his Blu-ray player, and we'd recently had a discussion on cables and so on. When I went round to watch a film with him, while he went to make a cuppa I secretly swapped his HDMI lead for a £9.99 Vivanco. We then watched the movie (3:10 to Yuma), and when it was over I asked him if he could tell any difference in quality, and he said "no, why?" Then I told him what I'd done. OK, he called me a crafty b*****d! Ha! But to him it answered his question. So anybody's best bet is just to try for yourselves.
 

professorhat

Well-known member
Dec 28, 2007
the_lhc:
garethwd: I can see that the error correction might make a difference, and I'd be prepared to spend more on a cable if 100% of the data reached its destination as opposed to 95% on a rubbish cable.

Taking the comparison with networking: you only lose packets on networks if there is a hardware issue or sometimes, due to network saturation, when many devices are trying to talk across the same physical infrastructure. With HDMI you don't have this; you have one device talking directly to a second device via a dedicated link. I would be astonished if you got lost or dropped "packets" (if that's even how HDMI transmits information, I don't know) in that kind of setup. There's something VERY wrong if you lose information like that.

This simply isn't true - you'll get lost packets at some point even if you just have one computer connected to another via a single Ethernet cable. It just happens, for a variety of reasons. As I've said though, because TCP/IP was designed to withstand sites being taken out by nuclear detonations, it has massive amounts of error control built in, which means this doesn't matter, as the packets can just be resent. Because of the way it works, packets also don't have to arrive in the same order they left, further improving the robustness of the error control - try that with high-definition video or audio and it's not going to look or sound pretty!
HDMI signals cannot possibly work this way, as they have to deliver the signal instantly and in the correct order - there's no room for any delay - so the analogy isn't a sound one, I'm afraid.
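
As a tiny illustration of the reordering point (sequence numbers are how TCP-style protocols tolerate out-of-order arrival; this is a sketch of that idea, not how HDMI works):

```python
# Packets carrying sequence numbers can arrive in any order; the receiver
# reassembles them - but only once the slowest packet has turned up.

packets = [(3, "C"), (1, "A"), (4, "D"), (2, "B")]  # (sequence, payload)
reassembled = "".join(payload for _, payload in sorted(packets))
print(reassembled)  # "ABCD" - correct, but only after the final straggler

# A live video link can't wait for stragglers: frame N must be on screen
# before frame N+1 is due, so the bits have to stream through in order.
```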
 

Tonestar1

Moderator
TCP/IP is connection-oriented and uses windowing, so any packets that arrive with errors are resent. This only works with data which is not (comparatively) sensitive to delay.

I'm not sure what protocol HDMI uses but it is far more likely to be similar to UDP (which sends data really quickly but if the data doesn't arrive at the destination then so be it). I imagine HDMI must work in a similar way as there is no time for error check/request resend the only thing possible would be to dump the errored data. Over a cable on average 1.5 metres long and due to the huge amounts of data being sent (I would work it out but can't be bothered) I would imagine there could be a some loss of traffic. More likely to do with interference from other signals rather than the quality of the conductors though. So it may be possible to see a slight difference. However these should be easily spotted with some BERT(bit error rate test) testing. I don't see why we can't have a supertest with a techie taking some real measurements.
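
For a feel of what a BERT measurement would imply, here's a rough sketch (both numbers are assumptions for illustration; the link rate is the approximate TMDS rate for 1080p60):

```python
# What a given bit error rate (BER) would mean at HDMI-like speeds.
# Both numbers below are assumptions for illustration, not measurements.

link_rate_bps = 4.46e9   # approx. TMDS line rate for 1080p60
assumed_ber = 1e-9       # one bad bit per billion transmitted (assumed)

errors_per_second = link_rate_bps * assumed_ber
print(f"~{errors_per_second:.1f} errored bits per second")  # ~4.5

# Spread across 60 frames a second, that is well under one bad bit per
# frame: an occasional wrong pixel, not a general softening of the image.
# A genuinely bad cable shows up as sparkles and dropouts, i.e. the BER
# collapsing by orders of magnitude, not as subtle shifts in colour.
```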

Also, I can't understand why they didn't use fibre for these connections (price, maybe?), as it would be far less susceptible to electrical interference.
 

Anonymous

Guest
I just want to point out I wasn't making any statements implying that good cables were or weren't a waste of time. I was simply ASKING, maybe not very clearly, whether it was possible to measure data loss in an HDMI cable, and hoping that someone with a better understanding than me would offer some opinions.
 

The_Lhc

Well-known member
Oct 16, 2008
garethwd: soulstyle:

Regarding HDMI, and I'm no expert, but my understanding is that digital data is either a 1 or a 0. If it's not a 1 it's a 0, and if it's not a 0 it's a 1. The signal can never change, thus it's not possible to improve!

I will stick by my motto until proven otherwise: "a cable is a cable".

Please, QED and the like, prove me wrong!

This is true, but what if the 0/1 doesn't get to its destination? There has to be some form of error correction, which could have an impact on picture quality.

Why, when nothing else is using the cable, would the signal NOT get to its destination? You still haven't given a reason for that.

Actually, I can think of something. We think of digital as "1s" and "0s", which logically they are, but electrically they're represented by voltage levels. If, for some reason, a voltage change didn't reach the level required to register as a 1 or 0, then you would lose information - but for the cable to be responsible for that, it would have to be horrifically badly made!
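
A toy model of that idea, with made-up numbers (real HDMI uses differential TMDS signalling, not single-ended 0V/1V levels; this just shows the threshold effect):

```python
import random

# Bits travel as voltage levels; the receiver recovers them by comparing
# each sample against a threshold. All numbers here are invented.

def send_bits(bits, attenuation, noise_sd):
    """Map bits to 0V/1V, attenuate, and add Gaussian noise."""
    return [b * attenuation + random.gauss(0, noise_sd) for b in bits]

def receive_bits(voltages, threshold=0.5):
    """Receiver decision: anything above the threshold counts as a 1."""
    return [1 if v > threshold else 0 for v in voltages]

random.seed(1)
bits = [random.randint(0, 1) for _ in range(100_000)]

for attenuation in (1.0, 0.8, 0.6, 0.52):
    received = receive_bits(send_bits(bits, attenuation, noise_sd=0.02))
    errors = sum(a != b for a, b in zip(bits, received))
    print(f"attenuation {attenuation:.2f}: {errors} errors in {len(bits)} bits")

# Until the '1' level sags close to the 0.5V threshold, the error count
# stays at zero - the cliff-edge behaviour digital links are known for.
# A cable bad enough to cause errors produces obvious glitches, not
# subtle changes in picture character.
```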
 

Anonymous

Guest
Some of you may find this interesting: http://www.hdmi.org/devcon/presentations/2007_DevCon_SiliconImage_English.pdf

The simple fact of life is that the method of error checking and correction, and many other parts of the HDMI standard, are not in the public domain. However, you can infer from what is in the public domain that it's not just a point-and-squirt technology; otherwise you would not need features such as lip sync etc. The above presentation also provides an insight into the level of definition and how the signal is transmitted.

The non-technical among you can continue to debate the merits of your expensive re-branded (and possibly, in some cases, not even compliant) cables.
 

pete321

New member
Aug 20, 2008
If there isn't a difference, why do Monster rate their M1000HD at 14.9Gbps and their 1000EX at 10Gbps, both pretty future-proof for a while? I'm sure a cheap freebie HDMI cable wouldn't be good for those speeds, especially at any length.

The quality of the cable, solder, shielding and terminations must all play a part in bandwidth and signal loss prevention?
 

Anonymous

Guest
There are two standards for current HDMI cables as it says on the HDMI web site.

To pass a 1080p 60Hz signal the cable needs to pass the higher standard. The Monster cable exceeds this standard; however, there are no devices that can exploit this capability.

As I said before Monster don't actually make these cables they buy them, or as a minimum the components from third parties in China.

Yes there are differences in quality of construction but, put simply, a cable either conforms or it does not. There are cheap pirate cables out there that don't meet the standard. Once you get to the branded £20 per cable level this is unlikely to be the case.

Think about it like this: it is possible to pass HDMI over Cat5 and Cat6, over long distances. Now, Cat5 is Cat5 - you don't get the huge variation in cost that you do with HDMI cable, as the market for Cat5 is professional and they don't go in for the hype. Sure, there are differences, and I think most people junk cheap freebie Cat5 and buy more decent stuff for patch leads etc. But there is a point beyond which nobody will go in terms of price, because it's a pro market in the main.
 

pete321

New member
Aug 20, 2008
welshboy:

To pass a 1080p 60Hz signal the cable needs to pass the higher standard. The Monster cable exceeds this standard; however, there are no devices that can exploit this capability.

It also passes audio, and in my set-up that includes stereo music, not just movie soundtracks. In effect it's acting like a digital co-ax in that instance; would you then say that there's no difference between digital co-ax cables?

From the video angle, with the continual upgrades to the HDMI standard, surely it makes sense to buy a good-quality lead that's likely to cope with quite a few of them?
 

Anonymous

Guest
You have slightly simplified the argument about sound, but hey, you said it. Digital standards are a lot more robust than analogue ones, and the HDMI standard is currently one of the most robust of the lot. I'd strongly suggest you read up on the subject and draw your own conclusions.

You have a point about future-proofing, but it's also a guess, if you think about it. Who knows what the standard will be three upgrades down the line? If you've spent £20 on a cable and it becomes redundant, well, it's £20 down the drain, but that's not even a night out. If you've spent £200 on a cable and it becomes redundant - well, that's still a decent night out you could have had.

I think you also need to look at this stuff in context. My CD player didn't cost £200, so there's no way I'm spending silly money on cables.
 

Anonymous

Guest
The HDMI standard defines four high-speed channels: a pixel clock channel and three channels for uncompressed video and audio information, plus two low-speed channels for control data. The total data rate of up to 5 Gbit/s allows the transmission of uncompressed HDTV signals at full high-definition resolution. The recently adopted version 1.3 of the HDMI standard allows for even higher data rates, of up to 10 Gbit/s.

Yes, I copied that! However, there is no real error "correction" on HDMI; it is a bitstream, and if the data does not arrive it is not re-requested as in TCP/IP. It would appear to be fudged, as in CD error correction (read: fill in the gap with an average). However, 10 gigabits per second is well into the Formula 1 of data transmission speeds, and with a non-critical application (i.e. sending/decoding domestic video and audio) the processor and ancillaries required to implement a TCP/IP-style scheme would probably eclipse the cost of most consumers' equipment. The pixel clock signal is the nearest thing to error correction; it should keep everything even and on time on the screen.
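
For what it's worth, the quoted "up to 5 Gbit/s" can be roughly reconstructed from the pixel clock (a sketch: 165 MHz is the maximum single-link pixel clock of early HDMI, and TMDS expands each 8-bit byte to a 10-bit symbol):

```python
# Rough reconstruction of the "up to 5 Gbit/s" figure quoted above.

max_pixel_clock_hz = 165e6   # max single-link pixel clock, HDMI 1.0-1.2
tmds_data_channels = 3       # the fourth channel carries only the clock
bits_per_tmds_symbol = 10    # TMDS encodes each 8-bit byte as 10 bits

line_rate = max_pixel_clock_hz * tmds_data_channels * bits_per_tmds_symbol
print(f"Peak TMDS line rate: {line_rate / 1e9:.2f} Gbit/s")   # 4.95

# For a concrete mode, 1080p60 uses a 148.5 MHz pixel clock:
rate_1080p60 = 148.5e6 * tmds_data_channels * bits_per_tmds_symbol
print(f"1080p60 TMDS rate: {rate_1080p60 / 1e9:.2f} Gbit/s")  # ~4.46
```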

Verdict: actually, in short, it would be very easy to test whether an HDMI cable was doing its job well (i.e. passing on everything actually transmitted through it). Perhaps the WH team can talk to independent labs to test the accuracy of data throughput of HDMI cables and, more importantly, RFI shielding when in close proximity to all your other gear!

One final point: why was the Chord 1.3 HDMI cable (award-winning!) left out of the running in the WH DEC A/V supplement?
 
