We also have to realise that even if we are dealing with 0s and 1s instead of an analogue modulated signal, there is still a carrier frequency involved which could still be distorted when a low-quality cable is used. I don't want this to become a big debate, but I would like you and/or any other member to give me some deeper input/explanation justifying the claim that a cheaper cable is good enough for digital or HD signals.
I will expand on this topic since it isn't really covered in the links in the first post.
The debate over expensive cables for digital interconnect really boils down to a couple of simple questions, which are actually sort of related.
1. Are the digital bits (1's and 0's) recovered with error or without?
2. Is there clock distortion introduced by the cable?
The answer to #1 is pretty easy. Unless you are talking about very long runs of cable or a damaged cable, the answer is almost always that the signals are received essentially error free. In reality no digital link is ever 100% error free, but the corresponding bit error rate (BER) is low enough that it is, for all intents and purposes, zero.
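To put a rough number on "for all intents and purposes zero", here is a back-of-the-envelope sketch. The BER value below is an assumption for a short, healthy link, not a measurement of any particular cable; the bit rate is just the audio payload of 16-bit stereo CD audio.

```python
# Rough illustration: expected bit errors over an hour of CD audio
# sent across a short S/PDIF link. BER here is an assumed figure,
# not a measured one.

BIT_RATE = 44_100 * 2 * 16   # audio payload bits/s for 16-bit stereo CD
SECONDS = 3600               # one hour of playback
BER = 1e-12                  # assumed bit error rate for a short, healthy link

total_bits = BIT_RATE * SECONDS
expected_errors = total_bits * BER
print(f"bits sent: {total_bits:.3e}, expected errors: {expected_errors:.4f}")
```

With these assumptions you send about five billion bits in an hour and expect roughly 0.005 bit errors, i.e. one flipped bit every couple of hundred hours of listening.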
The answer to #2 is a little more complicated. The way that S/PDIF connections work is that the digital stream carries both the clock and the data over a single connection. They are encoded together at the transmitter in such a way that the receiver can separate them back into their individual components. The recovered clock is often, though not always, also used to drive the DAC in the receiver/pre-amp. If there is distortion in that clock, it can cause audible effects.
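The encoding S/PDIF uses for this is biphase-mark coding (BMC). The idea is simple enough to sketch in a few lines: every bit cell starts with a level transition (that guaranteed transition is what carries the clock), and a 1-bit adds a second transition in the middle of the cell. This is a toy model of the scheme, not the full S/PDIF framing.

```python
# Toy sketch of biphase-mark coding (BMC), the scheme S/PDIF uses to
# combine clock and data on one wire. Every bit cell begins with a
# transition (the embedded clock); a 1-bit transitions again mid-cell.

def bmc_encode(bits, level=0):
    """Return two half-cell line levels per input bit."""
    out = []
    for b in bits:
        level ^= 1            # transition at the start of every cell (clock)
        out.append(level)
        if b:                 # a 1-bit transitions again mid-cell
            level ^= 1
        out.append(level)
    return out

def bmc_decode(halves):
    """Recover bits: a mid-cell transition means 1, none means 0."""
    return [int(halves[i] != halves[i + 1]) for i in range(0, len(halves), 2)]

data = [1, 0, 1, 1, 0]
assert bmc_decode(bmc_encode(data)) == data
```

Because every cell is guaranteed a transition regardless of the data, the receiver can lock its clock onto the edge timing and then read the data off the mid-cell behaviour.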
So, for a digital cable to have an effect on the sound it would have to distort the signal sufficiently so that the recovered clock is actually distorted. Now it's important to remember that every S/PDIF receiver has a PLL for locking on to the received clock frequency. In most scenarios, even a distorted signal will still allow a stable clock to be recovered by the PLL. I am ignoring for now the case where the clock signal is still noisy enough that the data cannot be properly recovered since in #1 we assumed that the data was delivered without error.
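The reason the PLL is so forgiving is that, for phase noise, the loop behaves as a low-pass filter: slow drift is tracked, but fast cycle-to-cycle jitter on the incoming edges is heavily attenuated. Here is a toy first-order model of that behaviour; the loop gain is an illustrative value, not taken from any real receiver chip.

```python
# Toy model of a PLL tracking a jittery clock: a first-order loop acts
# as a low-pass filter on phase error, so fast jitter on the incoming
# edges is strongly attenuated in the recovered clock. The loop gain
# is illustrative, not from any real receiver.

import random

random.seed(0)
GAIN = 0.05                   # small loop gain -> strong jitter rejection
recovered = 0.0               # recovered clock phase
input_jitter, output_jitter = [], []

for _ in range(10_000):
    incoming = random.gauss(0.0, 1.0)   # incoming edge phase error (jitter)
    error = incoming - recovered
    recovered += GAIN * error           # loop integrator nudges the phase
    input_jitter.append(incoming)
    output_jitter.append(recovered)

def rms(xs):
    return (sum(x * x for x in xs) / len(xs)) ** 0.5

print(f"input jitter RMS:  {rms(input_jitter):.3f}")
print(f"output jitter RMS: {rms(output_jitter):.3f}")
```

Running this, the recovered clock's jitter comes out a small fraction of the input jitter, which is exactly why a somewhat distorted signal can still yield a stable recovered clock.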
Given this scenario, you realise that it isn't really the cable that matters but the quality of the transmitter in generating a low-distortion signal and the ability of the receiver to recover a high-fidelity clock from the received signal.
Also worth noting is that any clock distortion introduced by the cable is very easily handled by a simple buffer in the receiver and the use of a local reference clock.
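That buffer-plus-local-clock arrangement (often called reclocking) can be sketched as a small FIFO: samples go in on the possibly jittery recovered clock and come out on a clean local oscillator, so arrival-time jitter never reaches the DAC. The class name and buffer depth below are illustrative, not from any real product.

```python
# Sketch of reclocking: buffer incoming samples in a small FIFO and
# clock them out with a clean local oscillator, so jitter on their
# arrival times never reaches the DAC. Names and sizes are illustrative.

from collections import deque

class Reclocker:
    def __init__(self, depth=64):
        self.fifo = deque(maxlen=depth)

    def on_receive(self, sample):
        # Samples arrive on the (possibly jittery) recovered clock.
        self.fifo.append(sample)

    def on_local_tick(self):
        # The DAC is driven from a stable local reference clock instead.
        return self.fifo.popleft() if self.fifo else None

rc = Reclocker()
for s in range(8):
    rc.on_receive(s)
out = [rc.on_local_tick() for _ in range(8)]
print(out)
```

The samples emerge in order but timed purely by the local clock; the only remaining design concern is keeping the FIFO from running empty or overflowing, which is a buffering problem, not a cable problem.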
So, if your cables are of reasonable length and you are hearing/seeing a difference, don't blame the cable, blame the electronics.