Watchdog HDMI Cable Special


Glacialpath

New member
Apr 7, 2010
118
0
0
Visit site
Ro-Tang Clan said:
People are easily conned into this as the general public are still in the analogue mindset in terms of cables. For analogue, you do get a difference in quality with the way the cable is twisted and constructed; however, HDMI is digital. The signal is either there or it isn't. Simple. In terms of picture quality, the only thing you should be concerned about is standards. HDMI 2.0 and 1.4 offer greater bandwidth over HDMI 1.3, which means you can fit more data down the cable, thus increasing colour space, and offer other advantages too.

Like someone else in the thread, I had the Ibra gold HDMI cables bought from Amazon, but replaced them with the Fisual Hollywood Ultimate series (link here) simply because the Fisual cables are better built and lighter, which makes them slot nicely into the back of my AV receiver; unlike the Ibra cables, which sagged down, potentially damaging the internal board.

On the topic of expensive cables, it does make me wonder whether anyone has actually genuinely bought cables like the Audioquest Diamond HDMI cable (link here) or the Carbon Fibre HDMI cable provided by Wireworld (link here). I mean, I'm a sucker for carbon fibre (I own CF spectacles and a CF pen), but that's just a little on the extreme side.

Fundamentally you have just said the better cables make a difference. I for one don't believe the digital signal is changed in any way; just that with poorly shielded (i.e. cheaper) cables, the signal gets added to, reducing the quality of the picture because you are looking at more than just the original signal, which makes the blacks lighter and the picture slightly fuzzy.
 
Glacialpath said:
Fundamentally you have just said the better cables make a difference. I for one don't believe the digital signal is changed in any way; just that with poorly shielded (i.e. cheaper) cables, the signal gets added to, reducing the quality of the picture because you are looking at more than just the original signal, which makes the blacks lighter and the picture slightly fuzzy.

That doesn't make any sense at all. The signal is digital, it does not have different paths for blacks etc. So how can the blacks get lighter? You can get picture loss blocks if the cable is faulty, similar to the picture on Sky TV when there's a storm or snow affecting the signal.
 

Glacialpath

bigboss said:
That doesn't make any sense at all. The signal is digital, it does not have different paths for blacks etc. So how can the blacks get lighter? You can get picture loss blocks if the cable is faulty, similar to the picture on Sky TV when there's a storm or snow affecting the signal.

Hey BigBoss.

I'm not saying there are separate paths for any part of the signal. The cable, of course, is only a relatively short link in the chain. Any electrical current surely will attract outside interference if not shielded properly. So though the core signal doesn't change from any digital source, other elements can get in on the back of the signal, masking its true quality. I don't mean anything like macro blocking, as that would mean a corrupt signal, i.e. as you say, if there is a storm somewhere affecting a Sky broadcast.

Look at it this way: once the signal, whatever it is, has passed through the cable, any other element the poorly shielded cable has absorbed will follow the signal through the rest of the system, thus being your storm.

I know we've all argued about this many a time. I'm just saying I've removed excess cables from people's setups, like old scart leads and such that weren't actually connected any more, and the picture has become more refined and the blacks have become darker.

This is why, in one such debate about cables, I suggested people pile a load of cheap cables on the back of their setup and see if anything happens to the picture or sound.

Much to my wife's and my disappointment, I removed our Wii console from our HT as the cables are poor quality, and our TV/BD picture became clearer.
 

cheeseboy

New member
Jul 17, 2012
245
1
0
Visit site
Glacialpath said:
So though the core signal doesn't change from any digital source, other elements can get in on the back of the signal, masking its true quality.

Nope, digital signals don't work like that. For example, remember back (if you are old enough) to using analogue modems. Signals got converted from digital to analogue, then sent through some of the crappiest and longest wiring known to man, then when received at the other end got converted back into digital, and providing there was no loss of data, the data was exactly the same as what was sent.
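That "providing there was no loss of data" step is what framing and checksums give you: the link detects a corrupted frame and asks for a resend, so the delivered bytes are bit-identical to what was sent. A toy sketch in Python (illustrative only; `transmit` and `receive` are invented names, not a real modem protocol):

```python
import random
import zlib

def transmit(payload: bytes, corrupt: bool = False) -> bytes:
    """Frame the payload with a CRC32 trailer; optionally flip one bit
    in transit to model line noise."""
    frame = payload + zlib.crc32(payload).to_bytes(4, "big")
    if corrupt:
        i = random.randrange(len(frame))
        frame = frame[:i] + bytes([frame[i] ^ 0x01]) + frame[i + 1:]
    return frame

def receive(frame: bytes):
    """Accept the payload only if the CRC matches; None means 'resend'."""
    payload, crc = frame[:-4], int.from_bytes(frame[-4:], "big")
    return payload if zlib.crc32(payload) == crc else None

data = b"the quick brown fox"
print(receive(transmit(data)) == data)        # clean line: identical bytes
print(receive(transmit(data, corrupt=True)))  # corrupted frame rejected: None
```

Noise either gets detected and the frame resent, or it never mattered in the first place: there is no middle ground where the bytes arrive "slightly fuzzy".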

Glacialpath said:
Look at it this way: once the signal, whatever it is, has passed through the cable, any other element the poorly shielded cable has absorbed will follow the signal through the rest of the system, thus being your storm.

Again, see my example above. It's still possible to extract an identical digital signal even with interference...

Glacialpath said:
I know we've all argued about this many a time. I'm just saying I've removed excess cables from people's setups, like old scart leads and such that weren't actually connected any more, and the picture has become more refined and the blacks have become darker.

This is why, in one such debate about cables, I suggested people pile a load of cheap cables on the back of their setup and see if anything happens to the picture or sound.

Much to my wife's and my disappointment, I removed our Wii console from our HT as the cables are poor quality, and our TV/BD picture became clearer.

What you are describing there isn't a cable issue; it's an issue with the equipment not being shielded properly or not having proper separation.
 

Glacialpath

cheeseboy said:
Glacialpath said:
So though the core signal doesn't change from any digital source, other elements can get in on the back of the signal, masking its true quality.

Nope, digital signals don't work like that. For example, remember back (if you are old enough) to using analogue modems. Signals got converted from digital to analogue, then sent through some of the crappiest and longest wiring known to man, then when received at the other end got converted back into digital, and providing there was no loss of data, the data was exactly the same as what was sent.

You misunderstand my comment. What I mean is that, regardless of what conversions the signal goes through, the original data is unchanged, but poorly shielded cables have added to the signal, meaning more info is heard or seen, colouring the original signal rather than changing it.

Also, why would a signal be changed back to digital having been converted to analogue? Genuine question, as the final portion of any setup is analogue to allow us to see or hear it in a format we understand.

cheeseboy said:
What you are describing there isn't a cable issue; it's an issue with the equipment not being shielded properly or not having proper separation.

So you are saying any interference into any system is through the system itself and not the cables?
 

AlbaBrown

New member
Jun 29, 2012
14
0
0
Visit site
Oh, here we go again!

Cue the old "digital is digital" brigade. All CD players sound the same, all Blu-rays look and sound the same, all digital cables are the same. If that were the case, jitter, error correction etc. wouldn't exist (let alone the effects these factors have on neighbouring components).

Unless you have a TV/projector good enough to show the difference, and you have done a comparison yourself WITH DIFFERENT CABLES (NOT just rebadged/re-engineered Chinese tat - i.e. Monster, Chord Company, Belkin etc., which are just that) and with a reference-quality source, you cannot make a definitive statement.

The two HDMI "tests" I have seen were on the Gadget Show, where they ran two differently priced cables (both based on the same cheap conductors/geometry from a Chinese supplier), where the price differential was due to "marketing positioning" and the bling factor, feeding two screens from two cheap BD players.

The second was a BBC piece (possibly Watchdog) where, again, two differently priced cables of equally dubious origin were used on two cheap BD players. They stated that the test was run at 1080p and that there was no perceivable difference. Funny, as the screens they were using were Fujitsu 58 Series plasmas, which are (exceptionally good) 720p panels! No mention of whether the sets were calibrated/set up.

Being an owner of the 50" set, I know that its AVM2 processor is amazingly powerful at tidying up signals when the NR etc. is turned on (I have ALL of these modes turned off).

Which then begs the question: how many owners of TVs out there are using sets where the panel quality is so poor that these "enhancement" features need to be turned on to get the best results!? If they are set to on, whatever signal comes down the HDMI cable will be buggered beyond recognition anyway!

I DO TOTALLY AGREE WITH CURRYS BEING CHASTISED THOUGH!

Ask any (honest) staff: for years their HDMI cables have been classed as CODE 0 on their tills. This means that if a (suspected faulty) HDMI cable is returned for refund/replacement, it is DISPOSED OF. Even the "laughably" reference leads. The trade price difference between their entry level and these leads is barely more than 70%!

Bear that in mind when you "review" one of their cables at £9.99 and £149.99 SRP!!!
 

cheeseboy

Glacialpath said:
You misunderstand my comment. What I mean is that, regardless of what conversions the signal goes through, the original data is unchanged, but poorly shielded cables have added to the signal, meaning more info is heard or seen, colouring the original signal rather than changing it.

That depends on what you are doing with the signal. As I said, it's possible to ignore interference on a digital signal, and there are things in place in order to do this.

Glacialpath said:
Also, why would a signal be changed back to digital having been converted to analogue? Genuine question, as the final portion of any setup is analogue to allow us to see or hear it in a format we understand.

My example was an old analogue dial-up modem. That *had* to convert the signal into analogue in order to send it through the phone system, then back again so the computer could understand it. The same way we used to load games on the old Spectrum and Commodore machines.

Glacialpath said:
So you are saying any interference into any system is through the system itself and not the cables?

No, I'm saying that the system is not good enough if it allows interference to happen like that. Any half-decent system should have things in place to not allow cross-contamination, if you will. There's no reason why a decently designed system should allow a plugged-in scart lead to interfere with, say, the HDMI signal. If that's happening, it's a badly designed system regardless of the cables.
 

Jota180

Well-known member
May 14, 2010
27
3
18,545
Visit site
PROFESSOR WILLIAM WEBB - INSTITUTE OF ENGINEERING AND TECHNOLOGY: The HDMI cable carries the video and the audio signals that make up the picture that you see on your television, and it carries those in a digital format. That means it converts the signals to ones and zeros, sends those down the wire and then reconstructs the picture at the other end. Because the data is digital, it's either correct or incorrect. If it's correct we get a perfect picture; if it's incorrect we get no picture at all. So in a normal domestic environment a cheap cable is just as good as an expensive cable. Both will provide perfect picture quality; you'll see no difference at all between them.
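That "either correct or incorrect" behaviour is sometimes called the digital cliff, and a toy simulation shows it (a simple thresholding model, not real TMDS signalling; the voltages and noise figures are made up):

```python
import random

def bit_error_rate(noise: float, n: int = 50_000) -> float:
    """Send 0V/1V bits with uniform noise of +/- `noise` volts added,
    threshold at 0.5V at the receiver, and count the bits that flip."""
    errors = 0
    for _ in range(n):
        bit = random.randint(0, 1)
        received = 1 if bit + random.uniform(-noise, noise) > 0.5 else 0
        errors += received != bit
    return errors / n

for noise in (0.2, 0.4, 0.6, 0.8):
    # The rate stays at exactly 0 until the noise exceeds the 0.5V
    # decision margin, then errors appear abruptly -- no gradual fading.
    print(f"noise +/-{noise}V -> error rate {bit_error_rate(noise):.3f}")
```

Below the margin the picture is bit-perfect; above it you get sparkles and dropouts, not lighter blacks or a fuzzier image.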
 

Jota180

You'll find a transcript here of the show...

http://www.bbc.co.uk/programmes/articles/h9RmpnFKD38wsv6DX1Fx4b/hdmi-cables

They chose Curry's because a customer got in touch with the show about his experience there.
 

Glacialpath

Ok yeah, I get you. I used to hate waiting for games to load up on our ZX Spectrum 48K, and I'm so glad we don't use dial-up any more.

That is a fair point, but could it not be that the conversion back to digital disregards any backpackers, as they weren't part of the digital signal in the first place, so the signal is restored to its original quality, then converted back to analogue in the computer monitor, essentially only being one conversion despite the signal originating many, many miles away? Also, aren't those stupidly long crap cables you talk of fibre optic?

You can see what my setup is in my signature. I can't believe for one minute any of it, apart from maybe the Sky box, is built to a cheap, poorly shielded standard. But aside from removing the Wii from the rig, I used to have a VHS player connected, and taking that away, along with the 5m Cambridge Audio scart lead, improved the picture.

I'm not saying I don't believe you about system component shielding. Just that if removing the cables improves the picture by allowing the signal to present itself at its best (not physically improving the signal, as some people think that's what we mean), then sure, the cables are being the sponge for RFI and EMI and anything else.

The cables don't generate anything unless they are connected or poorly shielded, thus acting as a conductor, surely?
 

Jota180

Where are people getting this poorly shielded cables nonsense from? Do you think anyone can set up a factory and produce HDMI cables (or any other HDMI labelled product) freely and with no regard to the standards set by the HDMI forum?

All products carrying the patented HDMI logo must be submitted for testing at an official Authorized Testing Centre and the product, cables included, must pass a series of tests and since we're on about cables, shielding is a part of that test.

Cables either surpass the set standards and are allowed to carry the HDMI logo or they fail and are not allowed to carry the logo.

Anything you buy with an HDMI logo on it has passed the required tests and you can use it with confidence.
 

Glacialpath

Jota180 said:
Where are people getting this poorly shielded cables nonsense from? Do you think anyone can set up a factory and produce HDMI cables (or any other HDMI labelled product) freely and with no regard to the standards set by the HDMI forum?

All products carrying the patented HDMI logo must be submitted for testing at an official Authorized Testing Centre and the product, cables included, must pass a series of tests and since we're on about cables, shielding is a part of that test.

Cables either surpass the set standards and are allowed to carry the HDMI logo or they fail and are not allowed to carry the logo.

Anything you buy with an HDMI logo on it has passed the required tests and you can use it with confidence.

Fair enough. I'm just trying to understand why these things happen. How high are these standards set? Even if the standards are met, that doesn't mean a more expensive cable can't have better construction and be of a higher standard than the minimum requirements set by this organisation.

After all, as long as the cable meets the minimum requirements it will pass and be granted the HDMI logo. Companies will try to make these things as cheap as possible but still reach the required standards. There will be companies that construct them to a higher quality, thus having to charge more. Most likely charging way too much, though.

After all, it could be that me having the scart lead connected is causing interference inside the TV and affecting the picture from the HDMI cable, as by this time of course the signal has left the HDMI cable. Of course the HDMI standards of the TV will have met the requirements, but once the signal leaves the HDMI port it has to go through the rest of the system. So maybe it is nothing to do with the HDMI part of the loop.
 

wolf7howl

Well-known member
Feb 16, 2010
11
0
18,520
Visit site
A professional electronic engineer mate of mine has developed a healthy scepticism with regard to specifications given for consumer electronics. Differences that can be measured on testing equipment may not be detectable by real-world users in live environments. WHFSV do suggest that, despite the reviews, users try out equipment for themselves. I have been building my AV setup over the last 6 months from Richer Sounds. Their advice was to never mind what other people say: if you cannot detect an improvement for the money, take the cheaper option.

Some of the advice for "tidying up" the rat's nest of cables will at least not cost anything. Even if the science behind it cannot be explained satisfactorily, if it works, it works.
 
AlbaBrown said:
Oh, here we go again!

Cue the old "digital is digital" brigade. All CD players sound the same, all Blu-rays look and sound the same, all digital cables are the same. If that were the case, jitter, error correction etc. wouldn't exist (let alone the effects these factors have on neighbouring components).

Unless you have a TV/projector good enough to show the difference, and you have done a comparison yourself WITH DIFFERENT CABLES (NOT just rebadged/re-engineered Chinese tat - i.e. Monster, Chord Company, Belkin etc., which are just that) and with a reference-quality source, you cannot make a definitive statement.

The two HDMI "tests" I have seen were on the Gadget Show, where they ran two differently priced cables (both based on the same cheap conductors/geometry from a Chinese supplier), where the price differential was due to "marketing positioning" and the bling factor, feeding two screens from two cheap BD players.

The second was a BBC piece (possibly Watchdog) where, again, two differently priced cables of equally dubious origin were used on two cheap BD players. They stated that the test was run at 1080p and that there was no perceivable difference. Funny, as the screens they were using were Fujitsu 58 Series plasmas, which are (exceptionally good) 720p panels! No mention of whether the sets were calibrated/set up.

Being an owner of the 50" set, I know that its AVM2 processor is amazingly powerful at tidying up signals when the NR etc. is turned on (I have ALL of these modes turned off).

Which then begs the question: how many owners of TVs out there are using sets where the panel quality is so poor that these "enhancement" features need to be turned on to get the best results!? If they are set to on, whatever signal comes down the HDMI cable will be buggered beyond recognition anyway!

I DO TOTALLY AGREE WITH CURRYS BEING CHASTISED THOUGH!

Ask any (honest) staff: for years their HDMI cables have been classed as CODE 0 on their tills. This means that if a (suspected faulty) HDMI cable is returned for refund/replacement, it is DISPOSED OF. Even the "laughably" reference leads. The trade price difference between their entry level and these leads is barely more than 70%!

Bear that in mind when you "review" one of their cables at £9.99 and £149.99 SRP!!!

1) I have tested different cables on my Kuro, the best TV of its time, and found no difference.

2) Please explain Expert Reviews' test of different HDMI cables, where they examined every pixel objectively and found no difference.
 

Jota180

Glacialpath said:
Jota180 said:
Where are people getting this poorly shielded cables nonsense from? Do you think anyone can set up a factory and produce HDMI cables (or any other HDMI labelled product) freely and with no regard to the standards set by the HDMI forum?

All products carrying the patented HDMI logo must be submitted for testing at an official Authorized Testing Centre and the product, cables included, must pass a series of tests and since we're on about cables, shielding is a part of that test.

Cables either surpass the set standards and are allowed to carry the HDMI logo or they fail and are not allowed to carry the logo.

Anything you buy with an HDMI logo on it has passed the required tests and you can use it with confidence.

Fair enough. I'm just trying to understand why these things happen. How high are these standards set? Even if the standards are met, that doesn't mean a more expensive cable can't have better construction and be of a higher standard than the minimum requirements set by this organisation.

After all, as long as the cable meets the minimum requirements it will pass and be granted the HDMI logo. Companies will try to make these things as cheap as possible but still reach the required standards. There will be companies that construct them to a higher quality, thus having to charge more. Most likely charging way too much, though.

After all, it could be that me having the scart lead connected is causing interference inside the TV and affecting the picture from the HDMI cable, as by this time of course the signal has left the HDMI cable. Of course the HDMI standards of the TV will have met the requirements, but once the signal leaves the HDMI port it has to go through the rest of the system. So maybe it is nothing to do with the HDMI part of the loop.

HDMI goes to further lengths to protect the signal by utilising a number of different technologies...

http://www.ramelectronics.net/tmds.aspx

At the end of the day, after all that any interference is going to be so minute you're not going to notice it. Save £100 on cables and buy some movies!
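One of the techniques behind that link (TMDS runs over differential pairs) can be sketched: the two wires of each pair carry mirrored voltages, so interference that couples onto both wires equally cancels when the receiver subtracts them. A toy model in Python (it assumes perfectly common-mode pickup, which real cables only approximate; `differential_link` is an invented name):

```python
import random

def differential_link(bits, interference: float = 2.0):
    """Send each bit as +1V/-1V on a twisted pair; the same noise couples
    onto both wires, and the receiver decodes the voltage difference."""
    received = []
    for b in bits:
        v = 1.0 if b else -1.0
        noise = random.uniform(-interference, interference)  # common-mode pickup
        wire_p, wire_n = v + noise, -v + noise
        received.append(1 if wire_p - wire_n > 0 else 0)
    return received

bits = [random.randint(0, 1) for _ in range(10_000)]
print(differential_link(bits) == bits)  # True: the common-mode pickup cancels exactly
```

In this idealised model even pickup twice the size of the signal itself has zero effect, because the receiver only ever sees the difference between the two wires.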
 

Ro-Tang Clan

New member
Oct 22, 2013
6
0
0
Visit site
Glacialpath said:
Ro-Tang Clan said:
People are easily conned into this as the general public are still in the analogue mindset in terms of cables. For analogue, you do get a difference in quality with the way the cable is twisted and constructed; however, HDMI is digital. The signal is either there or it isn't. Simple. In terms of picture quality, the only thing you should be concerned about is standards. HDMI 2.0 and 1.4 offer greater bandwidth over HDMI 1.3, which means you can fit more data down the cable, thus increasing colour space, and offer other advantages too.

Like someone else in the thread, I had the Ibra gold HDMI cables bought from Amazon, but replaced them with the Fisual Hollywood Ultimate series (link here) simply because the Fisual cables are better built and lighter, which makes them slot nicely into the back of my AV receiver; unlike the Ibra cables, which sagged down, potentially damaging the internal board.

On the topic of expensive cables, it does make me wonder whether anyone has actually genuinely bought cables like the Audioquest Diamond HDMI cable (link here) or the Carbon Fibre HDMI cable provided by Wireworld (link here). I mean, I'm a sucker for carbon fibre (I own CF spectacles and a CF pen), but that's just a little on the extreme side.

Fundamentally you have just said the better cables make a difference. I for one don't believe the digital signal is changed in any way; just that with poorly shielded (i.e. cheaper) cables, the signal gets added to, reducing the quality of the picture because you are looking at more than just the original signal, which makes the blacks lighter and the picture slightly fuzzy.

I have and I haven't, in a way. I do believe purchasing a better-made product can improve the reliability of the cable over cheap cables. But that goes with anything: the better made anything is, the more likely it is to last longer and the less prone it is to fail. This does not necessarily correlate to price, though.

However, the only thing that could increase picture quality between different cables is the HDMI version standard. These ensure all cables within that category comply with the same specification. For example, all HDMI 1.0 cables, whether they are £2 or £100, will only have a maximum data bandwidth of 4.95Gbps and will all perform the same. Similarly, all HDMI 2.0 cables will have a maximum data bandwidth of 18Gbps. The increase in bandwidth means they can support a higher colour bit-depth. Therefore, cables produced to the HDMI 2.0 standard are able to produce better picture quality results in comparison to cables produced to the HDMI 1.0 specification.
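The arithmetic behind those bandwidth figures can be sanity-checked: HDMI carries three TMDS colour channels at 10 line bits per 8 data bits, and deep colour scales the pixel clock up. A back-of-envelope sketch (`tmds_bandwidth_gbps` is an invented helper; the pixel clocks are the standard 1080p60 and 2160p60 timings):

```python
def tmds_bandwidth_gbps(pixel_clock_mhz: float, bits_per_channel: int = 8) -> float:
    """Raw TMDS line rate: 3 colour channels, 10 line bits per 8 data
    bits, with the clock scaled up for deep-colour bit depths."""
    clock = pixel_clock_mhz * bits_per_channel / 8
    return 3 * clock * 10 / 1000

print(tmds_bandwidth_gbps(148.5))      # 1080p60, 8-bit: 4.455 -> fits HDMI 1.0's 4.95Gbps
print(tmds_bandwidth_gbps(148.5, 12))  # 1080p60, 12-bit: 6.6825 -> needs a later version
print(tmds_bandwidth_gbps(594.0))      # 2160p60, 8-bit: 17.82 -> needs HDMI 2.0's 18Gbps
```

So the headline versions differ in what formats they can carry at all; any compliant cable that sustains the required rate delivers an identical picture.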

For a full list on HDMI versions, please see here: http://www.audioholics.com/hdtv-formats/understanding-difference-hdmi-versions
 

professorhat

Well-known member
Dec 28, 2007
992
22
18,895
Visit site
To be fair, your post has the potential of causing more confusion by referring to HDMI 1.0 and 2.0 with regard to cables. These standards exist only for the ports and the devices themselves. Cables are rated either at Standard Speed or High Speed (with various Automotive and Ethernet variants that aren't really that important) - link.

It might not seem important, but there's enough confusion already with the dreadful way the HDMI specs have been drafted, with the potential for people to believe they need to upgrade a 1.3 cable to a 1.4 or 2.0 cable, when in fact any device using any of those three standards would use the same High Speed rated cable.
 

Ro-Tang Clan

professorhat said:
To be fair, your post has the potential of causing more confusion by referring to HDMI 1.0 and 2.0 with regard to cables. These standards exist only for the ports and the devices themselves. Cables are rated either at Standard Speed or High Speed (with various Automotive and Ethernet variants that aren't really that important) - link.

It might not seem important, but there's enough confusion already with the dreadful way the HDMI specs have been drafted, with the potential for people to believe they need to upgrade a 1.3 cable to a 1.4 or 2.0 cable, when in fact any device using any of those three standards would use the same High Speed rated cable.

Ahh I see, apologies, I did not mean to cause further confusion or provide false information. I was under the impression manufacturers produced the cables in correlation to a particular HDMI version standard but by law (for whatever reason) weren't allowed to use that as part of their marketing. That's why I thought cables were branded as 'standard' or 'high speed'.

I guess it was just something I formulated in my head, simply because I have noticed a difference when switching between cables. I used to use the green HDMI cable that came with my Xbox and ran it between my AV receiver and TV (both ARC compatible), although ARC would never work. As soon as I upgraded to a cheap 'high speed' HDMI cable bought from Amazon, ARC worked. I also found it enabled a couple more colour options when using my Xbox, thus providing better picture quality. Based on that, I made the assumption the green HDMI Xbox cable was version 1.3 and the 'high speed' cable I had purchased was version 1.4. That's where my theory is from, anyway.
 

abacus

Well-known member
Ro-Tang Clan said:
professorhat said:
To be fair, your post has the potential of causing more confusion by referring to HDMI 1.0 and 2.0 with regard to cables. These standards exist only for the ports and the devices themselves. Cables are rated either at Standard Speed or High Speed (with various Automotive and Ethernet variants that aren't really that important) - link.

It might not seem important, but there's enough confusion already with the dreadful way the HDMI specs have been drafted, with the potential for people to believe they need to upgrade a 1.3 cable to a 1.4 or 2.0 cable, when in fact any device using any of those three standards would use the same High Speed rated cable.

Ahh I see, apologies, I did not mean to cause further confusion or provide false information. I was under the impression manufacturers produced the cables in correlation to a particular HDMI version standard but by law (for whatever reason) weren't allowed to use that as part of their marketing. That's why I thought cables were branded as 'standard' or 'high speed'.

I guess it was just something I formulated in my head, simply because I have noticed a difference when switching between cables. I used to use the green HDMI cable that came with my Xbox and ran it between my AV receiver and TV (both ARC compatible), although ARC would never work. As soon as I upgraded to a cheap 'high speed' HDMI cable bought from Amazon, ARC worked. I also found it enabled a couple more colour options when using my Xbox, thus providing better picture quality. Based on that, I made the assumption the green HDMI Xbox cable was version 1.3 and the 'high speed' cable I had purchased was version 1.4. That's where my theory is from, anyway.

All you need to know about HDMI can be found here: http://www.hdmi.org/index.aspx

Hope this helps

Bill
 
