One thing to understand is that DVI/HDMI was brought in partly to provide copyright protection, which was difficult to achieve with component video. So the implementation of HDMI was not designed purely with the consumer in mind; component video can carry high definition video perfectly well, as HD is not exclusive to a digital signal. It could be argued that the HDMI standard did not pay enough attention to the problems of sending a high frequency signal along twisted pair cables, i.e. copyright protection was more important.
It is correct that a digital signal is a stream of 1s and 0s, and it is also true that if you transmit a digital signal and the same stream of 1s and 0s is decoded at the receiver then you have no loss, in effect a lossless transmission. But a 1 and a 0 are actually represented by voltage levels, and these levels need to change, which takes time; it cannot happen instantaneously. In a perfect world these voltage states of 1 and 0 would be represented by a perfect square wave, but for a variety of reasons the wave is never perfectly square (a true square wave contains an infinite series of harmonics, so it would need infinite bandwidth, and there are noise considerations on top of that).
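Just to illustrate the "infinite bandwidth" point, here is a rough Python sketch (the frequency and harmonic counts are made up purely for illustration, nothing to do with actual HDMI rates) that builds a square wave from its odd harmonics; the fewer harmonics the channel can pass, the more the edges round off.

```python
# Rough sketch: a "perfect" square wave needs infinite bandwidth.
# Build one from its odd harmonics (Fourier series) and see how the
# edges soften when the harmonics are cut off, as any real channel does.
import numpy as np

f0 = 1.0                      # fundamental frequency (arbitrary units)
t = np.linspace(0, 2 / f0, 2000)

def bandlimited_square(t, f0, max_harmonic):
    """Sum of odd harmonics up to max_harmonic: 4/pi * sum sin(2*pi*n*f0*t)/n."""
    wave = np.zeros_like(t)
    for n in range(1, max_harmonic + 1, 2):
        wave += (4 / np.pi) * np.sin(2 * np.pi * n * f0 * t) / n
    return wave

for max_h in (3, 9, 99):
    wave = bandlimited_square(t, f0, max_h)
    # Crude "edge sharpness" measure: the steepest slope of the reconstructed wave.
    print(f"harmonics up to {max_h}: max slope = {np.max(np.abs(np.diff(wave) / np.diff(t))):.1f}")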
This change in state is handled by circuits which have capacitance and resistance, and when you add a cable you also change the capacitance and impedance of the circuit. Adding capacitance and resistance has the effect of delaying the time it takes to change state (the RC time constant, t = RC). Further, impedance mismatches and crosstalk between the twisted pairs add random noise to the signal.
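If you want a feel for the numbers, here is a quick sketch of a first-order RC low-pass applied to an ideal square wave. The R and C values are purely illustrative, not measured HDMI figures, but it shows how the time constant t = RC stretches each edge into an exponential ramp.

```python
# Rough sketch (not an HDMI model): a first-order RC low-pass applied to an
# ideal square wave, showing how added capacitance slows the edges.
import numpy as np

R = 50.0          # ohms (illustrative source/line resistance)
C = 100e-12       # farads (illustrative lumped cable capacitance)
tau = R * C       # time constant t = RC
dt = tau / 100.0

t = np.arange(0, 10 * tau, dt)
v_in = np.where((t // (4 * tau)) % 2 == 0, 1.0, 0.0)   # ideal square wave

# Discrete first-order low-pass: each step the output moves a fraction of the
# way toward the input, so sharp edges become exponential ramps.
alpha = dt / (tau + dt)
v_out = np.zeros_like(v_in)
for i in range(1, len(t)):
    v_out[i] = v_out[i - 1] + alpha * (v_in[i] - v_out[i - 1])

# The 10%-90% rise time of a single-pole RC network is roughly 2.2 * RC.
print(f"tau = {tau * 1e9:.1f} ns, estimated 10-90% rise time = {2.2 * tau * 1e9:.1f} ns")
```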
What this means is that the square wave sent by the transmitter looks a lot less square by the time it is received; if you viewed it on an oscilloscope you would see a definite difference between the shape of the received signal and the original. Now this can be handled at the receiving circuit as long as the signal can still be sampled and reconstructed back to the original. This sampling in HDMI happens at a very high frequency and has to occur at a particular instant on the wave, i.e. if the sample occurs on a part of the wave that has not reached a certain threshold then you will have errors in the signal.
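As a toy illustration of that threshold idea (this is not real TMDS decoding, just random bits through a crude model with made-up settling and noise figures), the sketch below samples each degraded bit against a fixed threshold and counts how many come out wrong as the noise grows.

```python
# Toy illustration: transmit random bits as voltage levels, degrade them with
# incomplete settling and random noise, then sample each bit against a 0.5 V
# threshold. More "cable" degradation -> more decoded errors.
import random

def run_link(n_bits, noise, rng):
    tx = [rng.randint(0, 1) for _ in range(n_bits)]
    errors = 0
    prev = 0.0
    for bit in tx:
        # The edge does not settle fully in one bit period: the received level
        # only moves part-way from the previous level toward the new one.
        settle = 0.7
        level = prev + settle * (bit - prev)
        prev = level
        sample = level + rng.gauss(0.0, noise)    # noise/crosstalk at the sample instant
        if (sample > 0.5) != (bit == 1):
            errors += 1
    return errors / n_bits

rng = random.Random(1)
for noise in (0.05, 0.15, 0.25, 0.35):
    print(f"noise sigma {noise:.2f} V -> bit error rate {run_link(20000, noise, rng):.4f}")
```

The point the numbers make is the "cliff": below a certain level of degradation the errors are effectively zero, then they climb quickly, which is why a marginal cable can seem fine until you push it with a longer run or a higher resolution.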
In simple terms, if you kept making the cable longer you would eventually end up with a signal that was just noise and contained no recoverable data, so the quality of the cable has to play a role in getting the original signal across. Now I don't have experience with HDMI, but I do have 20 years of experience transmitting signals from ROVs (vehicles on the seabed), including some fairly high frequency signals. What I can tell you from that experience is that a lot of factors dictate the noise on a signal, and the quality of the cable plays a huge role.
In this thread it has been mentioned that cheap cables have been tested against more expensive ones. One thing to bear in mind is that a higher price does not necessarily make a cable better; it comes down to the spec. But one thing that is certain is that a cable that is better designed and built will have less noise at the end of it. Also, when testing cables the data being transmitted is vital: a close-up scene with little motion and little change in colour will not stress a cable as much as a scene with lots of colour changes and lots of fast motion. Further, it is not just down to the cable; the size of the display and how good the source and display encoders/decoders are also play a big part, and if those are not great in the first place then a high quality cable cannot wave a magic wand.
Buying an expensive cable does not guarantee you will get a better picture than a cable a quarter of the price; as stated, there are many factors that influence the quality. But would I spend a lot of money on equipment and then buy cheap cables? Not a chance, just as if I owned a Ferrari I would not put cheap tyres on it.