Hi Ginder
I think we've been over this a few times now. I can't give you a full and detailed explanation, but the simple explanation as I understand it is: the better the cable, the less signal loss, the less error handling and correction required, and the better the picture. A digital signal is not as simple as just 0s and 1s being thrown down a wire. Yes, the basic building blocks of the signal are in binary format, but this is built into a larger framework which can be corrupted and is susceptible to loss. Corruption of this signal means the error correction built into the TV / amplifier has to "guess" what that data should have been, judging from the other data received, which can in turn affect the quality of the picture.
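To give a rough feel for what I mean by "guessing", here's a toy Python sketch. It has nothing to do with how HDMI actually packages or protects its data (that part is an assumption-free illustration, not the real protocol): each sample is sent with a simple parity bit, noise occasionally flips a bit in transit, and the receiver patches up any sample that fails its check by averaging the neighbours on either side.

```python
import random

def parity(value):
    """Even-parity bit for an 8-bit value."""
    return bin(value).count("1") % 2

def transmit(pixels, bit_error_rate):
    """Toy model of a noisy digital link: each 8-bit sample is sent with a
    parity bit; with some probability one data bit gets flipped in transit."""
    frames = []
    for value in pixels:
        p = parity(value)
        if random.random() < bit_error_rate:
            value ^= 1 << random.randrange(8)  # corrupt one bit of the data
        frames.append((value, p))
    return frames

def conceal(frames):
    """Toy error concealment: where the parity check fails, 'guess' the sample
    from its neighbours (the average of the values either side of it)."""
    values = [v for v, _ in frames]
    repaired = list(values)
    for i, (value, p) in enumerate(frames):
        if parity(value) != p and 0 < i < len(frames) - 1:
            repaired[i] = (values[i - 1] + values[i + 1]) // 2
    return repaired

if __name__ == "__main__":
    original = [100, 102, 104, 106, 108, 110, 112, 114]
    received = transmit(original, bit_error_rate=0.3)
    print("sent:     ", original)
    print("received: ", [v for v, _ in received])
    print("concealed:", conceal(received))
```

The point of the toy is just this: the "concealed" output looks close to what was sent, but it isn't identical, and the more errors the link introduces, the more of the picture ends up being reconstructed guesswork rather than the original data.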
Now there are people who will rubbish what I've just said above, and that's fair enough. All I know is, when I replaced my HDMI cable with a Chord 1.3 Silver Plus, I got a better picture. It wasn't a case of "Oh my goodness, look at the difference", but after swapping back and forth a few times it was definitely "Yup, the Chord has less noise onscreen and less blocking during fast action". For me, that was worth the extra £60-£70 it cost (can't remember the exact price now!). If it hadn't been, though, I'd bought from a supplier with a money-back guarantee, so I would simply have taken it back for a refund. That really is the only sensible approach to such a contentious issue: try it yourself and return it if you can't see the difference.