It doesn't matter what you say should happen, it's what I see that matters. I am not delusional; what I see is there.
Assuming you are not some cable expert, most of your knowledge will be from what you have read on the internet.
I have a minor background in electrical engineering and over two decades working with computers. Suffice it to say I have been generally up to par on how the tech works from back when it was much simpler, and I have watched it grow over the years. That kind of experience gives you a level of insight. I have also familiarized myself with the white papers and tech specs of many kinds of transmission protocols, and I am comfortable that I do actually understand (to a decent level) how HDMI works.
From what you have read, what I describe may not sound possible, but there are probably many other factors that you have yet to discover or learn beyond the basic fundamentals.
I told you, I don't doubt you see what you see. It's either because you THINK it's true (like those guys I mentioned who swear one brand of CD-R sounds better than another) or because you really do see it, but if you do, it's not for the reason you think it is - Monster's superior cable quality.
Digital information can degrade, although by different means to analogue.
JPG images degrade over time from constant copying, and they are a digital format.
Now I am going to venture that you don't really know much about this subject, because if you did and understood the basics of HDMI, you would realize the two really have nothing to do with each other.
JPG is a lossy compression format. That means data is inherently lost every time the image is recompressed.
However, if you take a JPG file and copy the file (i.e. you use Windows Explorer and copy it) you will not see degradation. It won't get blurry, it won't get fuzzier; it's only re-saving it (and thus recompressing it) that degrades it.
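If you want to prove that to yourself, here's a quick sketch of the idea (Python; photo.jpg is just a hypothetical file name, and the re-save step assumes the Pillow library is installed): a straight copy is byte-for-byte identical, a re-save is not.

[code]
import hashlib
import shutil

def sha256_of(path):
    """Return the SHA-256 hash of a file's raw bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Plain file copy: every byte comes out identical, so the hashes match.
shutil.copyfile("photo.jpg", "photo_copy.jpg")
print(sha256_of("photo.jpg") == sha256_of("photo_copy.jpg"))  # True

# Re-encoding (opening and re-saving with lossy JPEG compression) changes the bytes.
# Requires the Pillow library (pip install pillow).
from PIL import Image
Image.open("photo.jpg").save("photo_resaved.jpg", quality=75)
print(sha256_of("photo.jpg") == sha256_of("photo_resaved.jpg"))  # almost certainly False
[/code]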
Now here is an interesting fact... if you were to save the file a lot and, during one of those saves, some bit got corrupted, then when you opened it again it would have degraded...
It wouldn't be blurry, it wouldn't be washed out... it wouldn't look like a different person/flower/whatever... it would either be all torn and corrupted, fail to load, or have a dot or chunk just missing or all crazy wrong colors. Below is a basic example (I think it's a mockup) of a corrupt digital image. This is what happens when a bit gets damaged or goes missing.
[image: mockup of a corrupted JPEG, with a red arrow pointing at the damaged block]
BTW that's not my picture and I am not responsible for the big red arrow or anything like that.
That's digital distortion for you. But that's as far as I am going to go with this, as JPEG compression (BTW there are lots of lossless compression schemes) is a poor analogy for data transfer over an HDMI cable. They are such fundamentally different things that I would have to spend more time explaining the differences than drawing any similarities, and even those similarities would have to carry a lot of disclaimers.
What I see is obviously what is. What you tell me should happen is irrelevant, because it does not match what I see.
Like I said, I don't doubt you see it. Why you see it is what's in question.
Let's take the magic trick again: I SAW a woman get stabbed through the gut with a bunch of swords. I don't care if you tell me she can't live through it, I saw it happen!
Certainly no one would really say that, but it's about on par with what you are saying.
Now let's say a magician walked up and said, "No, that's not what happened, it can't be, she would have died. There is a trapdoor in there, the arms were stunt doubles, and the curtain was carefully placed to cover the action."
Would you continue to say, "I don't care what you say is possible, and I don't care that you explained to me how it's really done in there, I SAW IT WITH MY OWN EYES!"?
Like I said, I'm sure there is more to this than you have learnt. I mean, it affects cables over longer distances, so why not short ones?
Here again you show that you are making assumptions about things you don't understand.
It affects longer cables because of resistance, crosstalk and interference.
The longer the wire, the more resistance the electricity has to overcome to traverse it, and the more distance over which the signal can be degraded by interference. That's why thicker gauge, better shielded cables can make a longer run than thinner, less shielded cables.
It's not the technology that is distance dependent, it's the medium the data is going over.
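To put a rough number on the resistance part, here's a back-of-the-envelope sketch (the wire sizes are just illustrative assumptions, nothing specific to any particular HDMI cable):

[code]
import math

# Resistance of a round copper conductor: R = rho * L / A
RHO_COPPER = 1.68e-8  # ohm-metres, resistivity of copper

def wire_resistance_ohms(length_m, diameter_mm):
    """DC resistance of a solid copper wire of the given length and diameter."""
    area_m2 = math.pi * (diameter_mm / 2 / 1000) ** 2
    return RHO_COPPER * length_m / area_m2

# A long, thin conductor has far more resistance than a short or thick one.
print(round(wire_resistance_ohms(2, 0.5), 3))   # ~0.171 ohm: 2 m of 0.5 mm wire
print(round(wire_resistance_ohms(15, 0.5), 3))  # ~1.283 ohm: 15 m of the same wire
print(round(wire_resistance_ohms(15, 1.0), 3))  # ~0.321 ohm: 15 m of thicker 1.0 mm wire
[/code]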
Yes, in a shorter cable most people probably won't notice it, but I do in comparison to my other cables. Digital can produce noise, and noise is what creates the fuzzy pixels. The image is ever so slightly clearer with a Monster, though not really noticeable unless you look at the pixels up close to the screen.
That statement is too ambiguous and likely flawed for me to really comment on... but I can tell you that the Monster cable does not make the pixels any better on your screen due to its high quality.
Let's look again: 10000001 contains all the info about this pixel - its color, its brightness and its location down to the exact pixel on screen.
If that data makes it over to the display, you get exactly that. If it doesn't, you get a sparkle where its info was lost.
There is no way it can come out better over one cable than another.
Again, you are either imagining the difference, or something is different in the spec of the cable you are comparing it to. Get a decent generic cable that matches the spec of your Monster cable and do a double-blind test; you will not be able to tell the difference, because there is no difference.
Chances are my older cables were 1.1 and 1.2 respectively. So obviously, in comparison to them, the Monster will look better. You all keep trying to say I see no difference.
Again, I didn't say you saw no difference; I said the reason you saw a difference isn't the reason you think.
Remember that gold plated optical cable I mentioned? Remember the reviews that said it sounded much better?
Well, we know gold plating can't make an optical cable better... optical is light based, and the only reasons for gold are corrosion resistance and better electrical conductivity, so there is no way the gold optical cable was better than a regular optical cable.
Yet the reviews said it was better... better than their old RCA cables carrying 2-channel Dolby surround.
Well of course it's better! But does that mean gold plating an optical cable makes it better? No.
So if you switch from crappy 1.1 HDMI cables to Monster 1.3 cables with 1.3-capable devices, and it looks better, does that mean Monster cables are better? No... it means 1.3 looks better.
That's what I am saying: sure, you might see a difference, but it's not for the reason you are saying it is. It has nothing to do with Monster, it has nothing to do with their superior quality parts, it has to do with the spec.
When the Monster cable was first introduced, it could probably meet 1.3 before 1.3 was even a standard. My Monster might even meet the requirements for a future 1.4, in which case the cable was worth it.
Sure, and those Monoprice cables might as well... so no, it's not worth it in that case. It's only worth it if it actually has something that gives you a real-life increase in performance. And as of yet, it doesn't.
And even if that did come up, how many generic cables could you buy as upgrades before spending as much as you did on the Monster? Let's say that a generic cable costs $5 and you spent $30 on the Monster.
Well, if you needed a new cable for every 1.x revision, you could get to HDMI 1.9 before you broke even on the Monster cable.
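Quick back-of-the-envelope version of that, using the $5 and $30 figures from above:

[code]
# Break-even point: how many $5 generic cables equal one $30 Monster cable?
generic_price = 5
monster_price = 30
cables_needed = monster_price // generic_price  # 6 replacement cables

# Starting from HDMI 1.3 and buying a new generic cable for each 1.x revision:
start_revision = 1.3
break_even_revision = round(start_revision + cables_needed * 0.1, 1)
print(cables_needed, break_even_revision)  # 6 cables, HDMI 1.9
[/code]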
Bear in mind I don't think anything even uses deep color yet, and every time a new standard comes out it takes years to become standard and really get used... so HDMI 1.9... you are looking way out there. Most likely by then we will not even be using HDMI anymore... and if that happens, your Monster cable wasn't even a fair buy. If you have to toss it and upgrade around 1.7, you will have lost money.
The weird thing, though, is that the current-gen PS3 is only HDMI 1.2 capable, so if my second mid-range cable was 1.2, by your theory I shouldn't see a difference, but I did. I know my first crap cable from Game was HDMI 1.1, so for the second cable to appear better, which it did, it would have to be 1.2, which is the most the PS3 can support. So therefore the Monster and the second mid-range cable I bought should look the same on the PS3, but the thing is, this isn't the case.
You have a lot of ifs and ands in there... you're better off nailing some details down before you start making assumptions and drawing conclusions like that.
Best to run a double-blind test too... it's been shown that just spending more money on something can make you perceive the result to be better.
Look at this, he doesn't mention colour degradation or any of the improvements I saw, but he does explain why there is more to this than meets the eye.
Doesn't seem to be showing, here's the link:
http://www.youtube.com/watch?v=1-zbIBERGk4&feature=related
Well, here you are turning right back to the source for more marketing. Two things: 1) notice he mentions sparkles or dropouts... those are the two kinds of degradation you can get with HDMI. 2) When he says "quality", he is using a purposefully vague word, because you will then assume anything could be quality related. Hue and color temp are not the kind of quality he is talking about.
I see above you mention that different parts of the signal measure different things... that's true, and those individual parts might get messed with... but they will get messed with in a unique way for each piece of data. That's why hue changes and image tint can't be produced. One pixel might get its hue changed, the next its brightness, the next totally dropped, etc. With digital you don't get the case where the 6th bit in every byte is pushed up by 1 and thus every color gets a little darker... you get random changes, and as I said before, the difference between 11001010 and 10001010 is probably drastic. That's not two colors next to each other, that's not two shades... that's like bright white vs dark purple, or maybe it's not even a valid color...
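A tiny sketch of that point, treating those two binary values as plain 8-bit numbers (pure illustration; the actual HDMI encoding does more on top of this):

[code]
# Random bit errors on an 8-bit value produce big, unpredictable jumps,
# not a gentle uniform shift like analog noise would.
sent = 0b11001010      # 202
received = 0b10001010  # 138 - a single flipped bit, worth 64

print(sent, received, abs(sent - received))  # 202 138 64

# Which bit flips decides how wrong you are: flipping the top bit moves the
# value by 128, flipping the bottom bit moves it by 1.
for bit in range(8):
    corrupted = sent ^ (1 << bit)
    print(f"bit {bit} flipped: {sent} -> {corrupted} (off by {abs(sent - corrupted)})")
[/code]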
This guy is actually telling the truth. And again he is carefully telling the truth.
What he says about the analog medium carrying digital signals is correct. And his explanation of why they degrade more over distance is also true (it's at least part of the reason).
What he fails to mention is how quality plays into this. To be precise: how does quality prevent degradation of the signal, how much degradation is acceptable, and what happens if it degrades too far?
Here are the answers:
1: Better quality (fewer impurities, better shielding, lower resistance) allows a signal to travel farther with less degradation. A copper wire with a lot of oxygen bubbles, impurities and other defects will provide more resistance, and the signal will be damaged more.
2: This is the key part. How much degradation is acceptable? That's where the eye test comes into play and where the digital world separates itself from the analog.
In an analog signal, every bit of degradation results in degradation at the receiving end. Whatever fuzz, noise and loss gets into the signal comes out the other end.
With digital, degradation does not work the same way. As long as the signal stays above a certain level of accuracy, the result is reproduced perfectly. However, once it goes over that digital cliff, the information is completely lost. There is no middle ground.
So as long as a digital signal stays within a certain range, it will come out just fine; the degradation is effectively removed at the receiving end.
Here is a really horrible example:
[diagram: an analog waveform on top and a digital waveform below, both degrading from left to right, with red threshold lines drawn across the digital one]
Note that on the analog one at the top, the signal degrades toward the right, and by the time you get to the end it's not very similar to what it was at the beginning. If this were a sound wave, it would be much quieter and would lose a lot of fidelity.
Look at the digital one on the bottom. It has lost a lot by the end also. But look at the red lines: those are the cutoffs for the voltage to represent a 1 or a 0. Above the line it's a 1, below it's a 0.
Notice something? Despite being degraded, I can still pick out every 1 and every 0 properly. So despite the signal being degraded, once I have read the 1s and 0s back, I have a perfect copy of the data sent over.
The diagram and explanation are pretty poor, but maybe they get the point across?
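If my drawing doesn't do it, here's a minimal simulation of the same idea (made-up voltage levels and noise; a real HDMI receiver is more sophisticated, this just shows the thresholding):

[code]
import random

def transmit(bits, noise_amplitude):
    """Send bits as 0 V / 1 V levels and add random noise, like a lossy cable would."""
    return [b + random.uniform(-noise_amplitude, noise_amplitude) for b in bits]

def receive(voltages, threshold=0.5):
    """Recover bits by thresholding: above the cutoff is a 1, below is a 0."""
    return [1 if v > threshold else 0 for v in voltages]

random.seed(0)
data = [1, 0, 0, 0, 0, 0, 0, 1]  # the pixel byte from earlier, 10000001

degraded = transmit(data, noise_amplitude=0.3)  # visibly messy waveform
recovered = receive(degraded)

print(degraded)           # noisy voltages, nothing like a clean square wave
print(recovered == data)  # True - every 1 and 0 still reads back correctly
[/code]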
Which leads to 3...
How much is acceptable? And here is why Monster is not worth the extra $$$...
Because with digital... as long as you stay on the right side of the lines, the result is the same! You only have to be that good; if you get way better, the result is no different.
Here is another crappy example:
[diagram: two received digital waveforms, "Monster" on top and generic below; the bottom one is rougher but every bit is still on the right side of the cutoff lines]
The Monster on top clearly held up better... maybe that's because it's oxygen-free pure copper... maybe it's hand polished... maybe it's any of the other marketing crap Monster throws around.
Notice something? Every 1 and 0 is clearly readable at the receiving end.
Now look at the generic cable below. The signal definitely suffered... it's not as pretty as the Monster cable... but guess what... I can still read every 1 and every 0! (Ignore that the two waves are not the same data; it was hand drawn in Paint and I can't be bothered to duplicate the waves perfectly.)
So result? If I feed 10000001 into the monster cable, I get out 10000001.
If I feed 10000001 into the cheaper cable, I get out 10000001.
The exact same result! The 1s are not taller, the 0s are not rounder... it's the same result!
This is the point... Is Monster better? Sure... is it much better? Who knows... is it better enough to make a difference? No! As long as a cable passes certification for a standard and reliably passes data at the levels needed, getting better does not help at all! You get the same 1s and the same 0s!
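Same kind of simulation as above, just run with two made-up noise levels standing in for the Monster and the generic cable (the assumption being that both meet spec), plus one cable that doesn't:

[code]
import random

def transmit(bits, noise_amplitude):
    """0 V / 1 V signalling plus random cable noise."""
    return [b + random.uniform(-noise_amplitude, noise_amplitude) for b in bits]

def receive(voltages, threshold=0.5):
    """Above the cutoff reads as 1, below reads as 0."""
    return [1 if v > threshold else 0 for v in voltages]

random.seed(1)
data = [1, 0, 0, 0, 0, 0, 0, 1] * 8  # 64 bits of pixel data

# Both cables stay inside the decision margin; the cheap one is just noisier.
via_monster = receive(transmit(data, noise_amplitude=0.05))
via_generic = receive(transmit(data, noise_amplitude=0.35))
print(via_monster == data, via_generic == data)  # True True - identical output either way

# Push the noise past the margin (a cable that fails spec, or far too long a run)
# and bits start flipping - that is where sparkles and dropouts come from.
via_bad_cable = receive(transmit(data, noise_amplitude=0.9))
print(via_bad_cable == data)  # almost certainly False
[/code]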
That's why Monster is a big rip-off... you pay for better, and you probably do get better... but it's not better you can use!
If you pay for a car that can go 300 mph and it does go 300 mph, you got what you paid for. But if you only drive freeways and max out at 65... you didn't need to spend the money.
Monster is the car dealer telling you that the 300 mph car is what you need because it's better than the 60 mph car... it's true it's better... but it's not what you need, and that's why you get ripped off.
Did you not hear him say how sound can be of "reduced quality"? The same applies to image data. It's no different when it's in the cable as far as physics are concerned.
Sound can be degraded by the same "sparkles" as video. You know that no matter how many times you copy a CD on your computer, it sounds exactly the same, right? It never degrades...
But what about a badly scratched CD... sometimes they make really nasty high-pitched blips or squeals... know what those are? Those are the audio equivalent of sparkles. Data was lost and the result was something way off.
You will notice that when a CD gets scratched and data gets lost, the guy's voice doesn't get quieter, he doesn't sound muffled, it's not like you suddenly can't hear just the horn... it's a crazy weird blip. That's analogous to digital images too... the blip shows up as a sparkle, and for the same reason a scratched CD does not result in a voice dropping out or getting muffled like an analog tape would, digital images do not get hue and color shifts.
You are making the same mistake here that you made with your lag assumption in the original post... you saw lag, you know you did, and without all the info you drew your own conclusion as to why. I am still betting that it's the input on your monitor. Get that DVI/HDMI adapter, run your computer at the same resolution as the PS3 game (this is necessary to make sure the same scaler kicks in and works just as hard), and I bet you will see the same lag.