Whilst you're technically right, I don't quite agree that an HDMI cable
always "either works or it doesn't". You can get an intermediate state, just as there was before with any other cable / interface scheme. In this case, it's individual pixels that drop out, rather than the full screen, giving a snow / sparkle effect (but the audio always drops completely if bits are missing, possibly because it's so offensive to introduce digital "noise" at full volume...)
This quality degradation is rather more visually apparent with HDMI than with earlier consumer video standards, but if you're getting a full picture and full sound (no dropouts), chances are your cable is fine. Supposedly you're far more likely to get a complete failure (no signal at all) than this partial degradation; and you should probably regard such degradation as a failure anyway, even though it "sort of works".
Bear in mind that these dropouts are controlled by the receiving hardware and its firmware / software, not by the cable itself - i.e. if the signal degrades badly enough in the cable, the receiver chip either turns its outputs off or substitutes a default value.
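A rough sketch of that receiver-side logic, with an invented symbol set and fallback policy (the real TMDS code table and each chip's actual behaviour will differ):

```python
# Toy receiver-side behaviour: VALID_SYMBOLS stands in for the set of legal
# line codes (real HDMI/TMDS uses 10-bit symbols chosen for DC balance and
# transition minimisation; this set and the fallback are purely illustrative).
VALID_SYMBOLS = {0b0000, 0b0011, 0b0101, 0b0110, 0b1001, 0b1010, 0b1100, 0b1111}

def receive_symbol(symbol, last_good=0b0000):
    """Pass a legal symbol through; otherwise substitute a fallback value.
    The substitution happens in the receiver chip - the cable only delivers
    (possibly corrupted) bits."""
    if symbol in VALID_SYMBOLS:
        return symbol
    return last_good  # or black / mute: the policy lives in the receiver

print(receive_symbol(0b0101))  # arrives intact -> decoded as-is
print(receive_symbol(0b0111))  # corrupted on the wire -> receiver's default
```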
That said, as long as the signal gets through in its entirety (which depends on the quality and compatibility of the entire transmission chain), it's digital, so in theory the result will be identical no matter which cable you use - which I'm guessing is what you meant.