Originally Posted by saturnotaku
contains three links to other articles detailing how even a digital signal can degrade and under what circumstances.
Interesting read, but the tests mentioned in those articles were controlled by no one other than Monster. The diagrams on the testing equipment could have represented just about anything.
Also, the reviewer doesn't say what kind of "cheap HDMI cable" the Monster one was measured against. Was it even certified? All we know is that Monster handed it to him. And why didn't he take still shots of the supposedly worse vs. better picture quality of Chicken Little viewed through the different cables?
If those articles were meant to convince me, they failed miserably. That doesn't mean I'm biased against Monster; I just haven't been convinced yet. If that ever changes, I'll be among the first to grab a "high-end" cable instead of a cheap one.
I liked this comment by the way:
"I am an ASIC designer and a verification engineer (chip design/verification, if you don't know what ASIC stands for), and I can tell you that digital signals don't usually degrade over the distances most people use HDMI for (under 25 ft).
I did HDMI receiver verification for a major telecom vendor last year, and I can tell you the cable need not be of any higher quality than a USB cable! In fact, the packet structure is much simpler for HDMI than for USB, with good reason: HDMI carries image data and not much control data (except for very short HDCP handshakes, too minuscule to make a difference). Pixel errors amount to one or two per frame at the most, and that happens rarely if at all, and only at very long distances. When it does happen, I can guarantee you will not be able to tell, because it's just one or two pixels that correct themselves in the next frame. At 30 frames per second, if you can spot that pixel in that one frame, you deserve a gold medal for having the eyes of a Superman."
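For what it's worth, the commenter's numbers are easy to sanity-check. A quick back-of-the-envelope sketch (my assumptions: a 1080p frame and the 30 fps and "one or two errored pixels per frame" figures from the quote) shows just how tiny that worst case is:

```python
# Back-of-the-envelope check of the quoted claim.
# Assumptions (mine, not the commenter's): 1920x1080 frame; his figures
# of 30 fps and at most 2 errored pixels per frame in the worst case.
width, height = 1920, 1080
pixels_per_frame = width * height       # 2,073,600 pixels in a 1080p frame
worst_case_errors = 2                   # "one or two per frame at the most"
fps = 30

error_fraction = worst_case_errors / pixels_per_frame
frame_duration_ms = 1000 / fps          # how long the errored pixel is visible

print(f"Errored pixels per frame: {error_fraction:.7%} of the frame")
print(f"Each frame lasts only {frame_duration_ms:.1f} ms")
```

That works out to roughly a millionth of the frame, visible for about 33 ms before the next frame replaces it, which is the gist of the "gold medal" remark.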