Input lag

It doesn't matter what you say should happen; it's what I see that matters. I am not delusional, what I see is there.

Assuming you are not some cable expert, most of your knowledge will be from what you have read on the internet.

I have a minor background in electrical engineering and over two decades working with computers. Suffice it to say I have been generally up to par on how the tech works since back when it was much simpler, and I have watched it grow over the years. That kind of experience gives you a level of insight. I have also familiarized myself with the white papers and tech specs of many kinds of transmission protocols, and I am comfortable that I do actually understand (to a decent level) how HDMI works.

From what you have read, what I describe may not sound possible, but there are probably many other factors that you have yet to discover or learn beyond the basic fundamentals.

I told you, I don't doubt you see what you see. Either it's because you THINK it's true (as I said, like those guys who swear one brand of CD-R sounds better than another), or you really do see it; but if you do, it's not for the reason you think it is: Monster cables' superior quality.

Digital information can degrade, although by different means to analogue.
JPG images degrade over time from constant copying, and they are a digital format.

Now I am going to venture that you don't really know too much about this subject, because if you did and understood the basics of HDMI, you would realize the two really have nothing to do with each other.

JPG is a lossy compression format. That means data is inherently lost every time the image is recompressed.

However, if you take a JPG file and copy the file (i.e. you use Windows Explorer and copy it), you will not see degradation. It won't get blurry, it won't get fuzzier; it's only saving it (and thus recompressing it) that degrades it.

Now here is an interesting fact... if you were to save the file many times and during one of those saves some bit got corrupted, when you opened it again it would have degraded...

It wouldn't be blurry, it wouldn't be washed out... it wouldn't look like a different person/flower/whatever... it would either be all torn and corrupted, fail to load, or have a dot or chunk missing, or show crazy wrong colors. This is a basic example (I think it's a mockup) of a corrupt digital image. This is what happens when a bit gets damaged or goes missing.

[image: cpbild2jq8.jpg: example of a corrupted digital image]


BTW that's not my picture and I am not responsible for the big red arrow or anything like that.
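To make the copy-vs-resave distinction concrete, here is a small sketch (the file name and contents are made up; any file behaves the same way) showing that a plain file copy is bit-for-bit identical, with no generation loss:

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path):
    """Hash a file's raw bytes so we can compare copies exactly."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Stand-in for any JPEG: the bytes are what matter, not the format.
src = os.path.join(tempfile.mkdtemp(), "photo.jpg")
with open(src, "wb") as f:
    f.write(os.urandom(4096))  # placeholder for compressed image data

# A plain file copy (what Explorer does) duplicates the bytes verbatim...
dst = src + ".copy"
shutil.copy(src, dst)

# ...so the copy is bit-for-bit identical. Only re-encoding loses data.
assert sha256_of(src) == sha256_of(dst)
```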

That's digital distortion for you. But that's as far as I am going to go with this, as JPEG compression (BTW, there are lots of lossless compression schemes) is a poor analogy for data transfer over an HDMI cable. They are such fundamentally different things that I would have to spend more time explaining the differences than drawing any similarities, and even those similarities would have to carry a lot of disclaimers.

What I see is obviously what is. What you tell me should happen is irrelevant, because it does not go with what I see.

Like I said, I don't doubt you see it. Why is what's in question.

Let's take the magic trick again: I SAW a woman get stabbed through the gut with a bunch of swords. I don't care if you tell me she can't live through it, I saw it happen!

Certainly no one would really say that, but it's about on par with what you are saying.

Now let's say a magician walked up and said "no, that's not what happened, it can't be, she would have died. There is a trapdoor in there and the arms were stunt doubles and the curtain was carefully placed to cover the action"

Would you continue to say "I don't care what you say is possible, and I don't care that you explained to me what's really going on in there, I SAW IT WITH MY OWN EYES!"?

Like I said, I'm sure there is more to this than you have learnt. I mean, it affects cables over longer distances, so why not short ones?

Here again you show that you are making assumptions about things you don't understand.

It affects longer cables because of resistance, crosstalk and interference.

The longer the wire, the more resistance the electricity has to overcome to traverse it, and the more time the signal has to be degraded by interference. That's why thicker-gauge and better-shielded cables can make a longer run than thinner, less-shielded cables.
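As a rough sketch of the resistance part (the cross-sections are made-up example values, not real HDMI conductor sizes): resistance grows linearly with length and shrinks with conductor thickness.

```python
# R = rho * L / A: resistance of a uniform conductor.
RHO_COPPER = 1.68e-8  # resistivity of copper, in ohm-metres

def resistance_ohms(length_m, area_mm2):
    """Resistance of a copper wire of the given length and cross-section."""
    return RHO_COPPER * length_m / (area_mm2 * 1e-6)

short_run = resistance_ohms(1.0, 0.2)    # 1 m of thin wire
long_run = resistance_ohms(10.0, 0.2)    # 10 m of the same wire
long_thick = resistance_ohms(10.0, 0.5)  # 10 m of a heavier gauge

# Ten times the length means ten times the resistance...
assert abs(long_run / short_run - 10.0) < 1e-9
# ...which a thicker conductor partly compensates for.
assert long_thick < long_run
```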

It's not the technology that is distance dependent, it's the medium the data is going over.

Yes, in a shorter cable most people probably won't notice it, but I do in comparison to my other cables. Digital can produce noise, and noise is what creates the fuzzy pixels. The image is ever so slightly clearer with a Monster, though not really noticeable unless you look at the pixels up close to the screen.

That statement is too ambiguous and likely flawed for me to really comment on... but I can tell you that the Monster cable does not make the pixels any better on your screen due to its high quality.

Let's look again: 10000001 contains the info for this pixel (its color and brightness), and its location comes from its position in the data stream, down to the exact pixel on screen.

If that data makes it over to the display, you get exactly that. If it doesn't, you get a sparkle where its info was lost.

There is no way it can be better over one cable or another.

Again, you are either imagining the difference, or something is different in the spec of the cable you are comparing it to. Get a decent generic cable that matches the spec of your Monster cable and do a double-blind test; you will not be able to tell the difference, because there is no difference.

Chances are my older cables were 1.1 and 1.2 respectively. So obviously in comparison to them the Monster will look better. You all keep trying to say I see no difference.

Again, I didn't say you saw no difference. I said the reason you saw a difference isn't the one you think.

Remember that gold plated optical cable I mentioned? Remember the reviews that said it sounded much better?

Well, we know gold plating can't make an optical cable better... optical is light based, and the only reasons for gold are corrosion resistance and better electrical conductivity, so there is no way the gold optical cable was better than a regular optical cable.

Yet the reviews said it was better... better than their old RCA cables carrying two-channel Dolby surround.

Well of course it's better! But does that mean gold plating an optical cable makes it better? No.

So if you switch from crappy 1.1 HDMI cables to Monster 1.3 cables with 1.3-capable devices, and it looks better, does that mean Monster cables are better? No... it means 1.3 looks better.

That's what I am saying: sure, you might see a difference, but it's not for the reason you are saying. It has nothing to do with Monster, it has nothing to do with their superior quality parts, it has to do with the spec.

When the Monster cable was first introduced, it could probably meet 1.3 before 1.3 was even a standard. My Monster might even meet the requirements for a future 1.4, in which case the cable was worth it.

Sure, and those Monoprice cables might as well... so no, it's not worth it in that case. It's only worth it if it actually has something that gives you a real-life increase in performance. And as of yet, it doesn't.

And even if that did come up, how many generic cables could you buy as upgrades before spending as much as on the Monster? Let's say that a generic cable costs $5 and you spent $30 on the Monster.

Well, if you need a new cable for every 1.x, then you could get to HDMI 1.9 before you broke even on the Monster cable.
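The arithmetic behind that (using the $5 and $30 example prices above, and assuming you start at HDMI 1.3):

```python
generic_price = 5   # example price of a decent generic cable, in dollars
monster_price = 30  # example price of the Monster cable

# How many generic replacements the Monster's price buys you:
replacements = monster_price // generic_price
assert replacements == 6

# Buying a fresh $5 cable for each of 1.4, 1.5, ... 1.9 takes six
# revisions before you have spent as much as the one Monster cable.
final_revision = 3 + replacements
assert f"HDMI 1.{final_revision}" == "HDMI 1.9"
```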

Bear in mind I don't think anything even uses Deep Color yet, and every time a new standard comes out it takes years to become standard and really get used... so HDMI 1.9... you are looking way out there. Most likely by then we will not even be using HDMI anymore... and if that happens, your Monster cable wasn't even a fair buy. If you have to toss it and upgrade around 1.7, you will have lost money.

The weird thing is, though, that the current-gen PS3 is only HDMI 1.2 capable, so if my second mid-range cable was 1.2, by your theory I shouldn't see a difference, but I did. I know my first crap cable from Game was HDMI 1.1, so for the second cable to appear better, which it did, it would have to be 1.2, which is the maximum the PS3 supports. So therefore the Monster and the second mid-range cable I bought should look the same on the PS3, but the thing is, this isn't the case.

You have a lot of ifs and ands in there... you're better off nailing some details down before you start making assumptions and drawing conclusions like that.
Best to run a double-blind test too... it's been proven that just spending money on something can make you perceive the result to be better.

Look at this. He doesn't mention colour degradation or any of the improvements I saw, but he does explain why there is more to this than meets the eye.

Doesn't seem to be showing; here's the link:

http://www.youtube.com/watch?v=1-zbIBERGk4&feature=related

Well, here you are turning right back to the source for more marketing. Two things: 1) notice he mentions sparkles or dropouts... those are the two kinds of degradation you can get with HDMI. 2) When he says quality, he is using a purposely vague word, because you will then assume anything could be quality related. Hue and color temp are not the kind of quality he is talking about.

I see above you mention that different parts of the signal measure different things... that's true, and those individual parts might get messed with... but they will get messed with in a unique way for each piece of data. That's why uniform hue changes and image tint can't happen. One pixel might get its hue changed, the next its brightness, the next totally dropped, etc. With digital you don't get the case where the 6th bit in every byte is pushed up by 1 and thus every color gets a little darker... you get random changes, and as I said before, the difference between 11001010 and 10001010 is probably drastic. Those are not colors next to each other, those are not adjacent shades... that's like bright white vs dark purple, or maybe it's not even a valid color...
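To put numbers on that (treating the byte as a single 8-bit channel value, which is a simplification of how HDMI actually packs pixel data):

```python
# One corrupted bit, interpreted as an 8-bit brightness/color value.
sent = 0b11001010      # = 202
received = 0b10001010  # = 138, the second-highest bit flipped

# A single bit error in a high position jumps the value by 64 steps
# out of 255: not a neighboring shade, a completely different color.
assert sent == 202 and received == 138
assert sent - received == 64
```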

This guy is actually telling the truth. And again he is carefully telling the truth.

What he says is correct about analog medium for digital signals. And why they degrade more over distance is also true (it's at least part of the reason).

What he fails to mention is, how does quality play into this? To be precise, how does quality prevent the degradation of the signal, how much degradation is acceptable, and what happens if it degrades too far?

Here are the answers:

1: Better quality, with fewer impurities, better shielding and lower resistance, allows a signal to travel farther with less degradation. A copper wire with a lot of oxygen bubbles, impurities and problems will provide more resistance, and the signal will be damaged more.

2: This is the key part. How much degradation is acceptable? That's where the eye test comes into play, and where the digital world separates itself from the analog.

In an analog signal, every bit of degradation results in degradation on the receiving end. Whatever fuzz, noise and resistance get into the signal come out the other end.

With digital, degradation does not work the same. As long as the signal stays above a certain level of accuracy, the result is reproduced perfectly. However once it hits that digital cliff, the information is completely lost. There is no middle ground.

So as long as a digital signal stays within a certain range it will come out just fine, the degradation is effectively removed at the receiving end.

Here is a really horrible example:

[image: untitled.jpg: hand-drawn analog vs digital signal degradation]


Note on the analog at the top, the signal degrades to the right, and by the time you get to the end, it's not very similar to what it was at the beginning. If this were a sound wave, it would be much quieter and lose a lot of fidelity.

Look at the digital one on the bottom. It has lost a lot by the end also. But look at the red lines, those are the cutoffs for the voltage to represent a 1 or a 0. Above it's a 1, below it's a 0.

Notice something? Despite being degraded, I can still pick out every 1 and every 0 properly. So despite the signal being degraded, once I have read the 1s and 0s back, I have a perfect copy of the data sent over.

The diagram and explanation are pretty poor, but maybe it gets the point across?
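The same idea in a few lines of code (the voltage levels and threshold are made-up illustration values, not HDMI's actual TMDS levels):

```python
import random

random.seed(0)
bits = [1, 0, 0, 0, 0, 0, 0, 1]  # the byte 10000001

def transmit(bits, noise):
    """Send each bit as a voltage (1 -> 1.0, 0 -> 0.0) plus bounded noise."""
    return [b + random.uniform(-noise, noise) for b in bits]

def receive(voltages, threshold=0.5):
    """Slice each voltage against the threshold to recover the bit."""
    return [1 if v > threshold else 0 for v in voltages]

# Degraded, but every sample is still on the right side of the threshold...
degraded = transmit(bits, noise=0.4)
assert receive(degraded) == bits  # ...so the recovered data is perfect.

# Past the cliff: push one sample over the threshold and that bit is gone.
over_cliff = degraded[:]
over_cliff[0] = 0.2
assert receive(over_cliff) != bits
```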

Which leads to 3...

How much is acceptable? And here is why monster is not worth the extra $$$...

Because with digital... as long as you stay on the right side of the lines, the result is the same! You only have to be that good; if you get way better, the result is no different.

Here is another crappy example:

[image: untitled2.jpg: hand-drawn comparison of a Monster and a generic cable's signals]


Monster on top shows it clearly held up better... maybe this is because it's oxygen-free pure copper... maybe it's hand polished... maybe it's any of the other marketing crap Monster throws around.

Notice something? Every 1 and 0 is clearly readable at the receiving end.

Now look at the generic cable below. The signal definitely suffered... it's not as pretty as the Monster cable... but guess what... I can still read every 1 and every 0! (Ignore that the two waves are not the same data; they were hand drawn in Paint and I can't be bothered to duplicate the waves perfectly.)

So result? If I feed 10000001 into the monster cable, I get out 10000001.

If I feed 10000001 into the cheaper cable, I get out 10000001.

The exact same result! The 1s are not taller, the 0s are not rounder... it's the same result!
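Extending the earlier threshold sketch (same made-up voltage levels): a clean waveform and a much rougher one decode to the identical byte as long as both stay inside the margin.

```python
import random

random.seed(1)
BYTE = [1, 0, 0, 0, 0, 0, 0, 1]  # the 10000001 from the example above

def send(bits, noise):
    """1 -> 1.0 V, 0 -> 0.0 V, plus bounded cable degradation."""
    return [b + random.uniform(-noise, noise) for b in bits]

def slice_bits(voltages, threshold=0.5):
    return [1 if v > threshold else 0 for v in voltages]

premium = slice_bits(send(BYTE, noise=0.05))  # pristine-looking waveform
budget = slice_bits(send(BYTE, noise=0.45))   # visibly rougher waveform

# Both stay on the right side of the threshold, so the display
# receives the identical byte from either cable.
assert premium == budget == BYTE
```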

This is the point... Is Monster better? Sure... is it much better? Who knows... is it better enough to make a difference? No! As long as a cable passes certification for a standard and reliably passes data at the levels needed, getting better does not help at all! You get the same 1s and the same 0s!

That's why Monster is a big rip-off... you pay for better, and you probably do get better... but it's not better you can use!

If you pay for a car that can go 300mph and it does go 300mph, you got what you paid for. But if you only drive on freeways and max out at 65... you didn't need to spend the money.

Monster is the car dealer telling you that the 300mph car is what you need because it's better than the 60mph car... it's true, it's better... but it's not what you need, and that's why you get ripped off :)


Did you not hear him say how sound can be of "reduced quality"? The same applies to image data. It's no different when it's in the cable as far as physics is concerned.

Sound can be degraded by the same "sparkles" as video. You know that no matter how many times you copy a CD in your computer, it sounds the exact same right? Never degrades...

But what about a badly scratched CD... sometimes they make really nasty high pitched blips or squeals... know what those are? Those are the audio equivalent of sparkles. Data was lost and the result was something way off.

You will notice when a CD gets scratched and data gets lost, the guy's voice doesn't get quieter, he doesn't sing muffled, it's not like you can't hear just the horn now... it's a crazy weird blip. That's analogous to digital images also... the blip shows up as a sparkle, and for the same reason that a scratched CD does not result in a voice dropping or getting muffled like an analog tape would, digital images do not get hue and color shift.

You are making the same mistake here you made with your lag assumption in the original post... you saw lag, you know you did and without all the info you drew your conclusion as to why. I am still betting that it's the input on your monitor. Get that DVI/HDMI adapter, run your computer at the same res as the PS3 game (this is necessary to make sure the same scaler kicks in and works just as hard) and I bet you will see the same lag.
 
Dude, did you listen to the guy? The main cause for this degradation is not necessarily that the signal may be unreadable at the other end, but that the various sections of data can get out of sync. I understand completely how signals work; I did A-level physics, and electronics at GCSE.

Say it takes a number of 1s and 0s to control the hue of one pixel. If these components of data are out of sync at the other end, then they will not be in the correct proportion at any one instant at the other end. Hence the hue will be slightly different to what was initially transmitted.

Anyway, if you don't believe it's the cable quality that is making the difference (and I'm pretty sure my mid-range cable, and I know my Monster, are both at least 1.2, the maximum the PS3 is capable of), then what do you suggest is the cause? The control here is that only the cable is being swapped, nothing else; therefore it can only be the cable that is the cause.
 
Dude, did you listen to the guy? The main cause for this degradation is not necessarily that the signal may be unreadable at the other end, but that the various sections of data can get out of sync. I understand completely how signals work; I did A-level physics, and electronics at GCSE.

Say it takes a number of 1s and 0s to control the hue of one pixel. If these components of data are out of sync at the other end, then they will not be in the correct proportion at any one instant at the other end. Hence the hue will be slightly different to what was initially transmitted.

OK, before I go on, after doing all the work in the last post, I am going to ask that you really read it over until you understand what I am saying. At this point you have just read it enough to pick up some difference and point it out.

Suffice it to say, you need to understand digital data transmission better than "it takes some number of 1s and 0s to transmit the hue of a pixel". That's drastically simplified, and by the way you talk about it, it shows you do not understand how binary bytes are used to control the value of something.

If you do understand it, please explain in brief. I would explain it to you, but considering how much I have done, I think it's only fair you carry your half of the conversation here. Quickly explain binary expression of decimal values, how they are used to control pixel values, and touch on stop bits and how they would affect errors in transmission in this case.

Anyway, if you don't believe it's the cable quality that is making the difference (and I'm pretty sure my mid-range cable, and I know my Monster, are both at least 1.2, the maximum the PS3 is capable of), then what do you suggest is the cause? The control here is that only the cable is being swapped, nothing else; therefore it can only be the cable that is the cause.

I can't say... there are a lot of potential variables... but what I suggest is what I have always suggested: a true double-blind test between the Monster and a known decent cable. I would say any cable that does not produce sparkles or dropouts is a decent cable; make sure it's rated to whatever HDMI level the Monster one is (or at least the source and receiver are).

Make sure it's truly double blind; do not underestimate the subconscious abilities of yourself or whoever is helping you do your double-blind test.

Look back a few posts, I detailed exactly how to do a double blind test. It must be repeated multiple times in a controlled environment.

Here's something else to do... go to AVS Forum... ask the same question... despite the general hatred of Monster there, some very smart people will undoubtedly take the time to set you straight... you can trust them, or you can trust the guys selling you the ridiculously overpriced cable and people on the internet who say their HDMI is ghosting, but it turns out to always be something besides the cable.
 
BTW I just rewatched that video, two things:

When he says audio/video dropouts/quality, he is totally fudging it and just looking for words to get the idea across. When he says quality, he either means dropouts and sparkles as quality, or he is just lying or wrong and means colors and hues. I guarantee you he means quality issues such as sparkles or dropouts.

Also, I never see him mention anything getting out of sync. In fact, he doesn't really make sense... he says that due to copper having resistance, not all the 1s and 0s get there at the same time... they aren't supposed to... they get there in a data stream... I really have no idea what he means by all getting there at once... it doesn't even make sense, yet he keeps saying it.

So that part doesn't even make sense; the rest of his explanation is based on the degradation of the signal (which is what an eye test checks), and that is what I addressed above with my crappy Paint drawings.
 
OK, before I go on, after doing all the work in the last post, I am going to ask that you really read it over until you understand what I am saying. At this point you have just read it enough to pick up some difference and point it out.

Suffice it to say, you need to understand digital data transmission better than "it takes some number of 1s and 0s to transmit the hue of a pixel". That's drastically simplified, and by the way you talk about it, it shows you do not understand how binary bytes are used to control the value of something.

If you do understand it, please explain in brief. I would explain it to you, but considering how much I have done, I think it's only fair you carry your half of the conversation here. Quickly explain binary expression of decimal values, how they are used to control pixel values, and touch on stop bits and how they would affect errors in transmission in this case.



I can't say... there are a lot of potential variables... but what I suggest is what I have always suggested: a true double-blind test between the Monster and a known decent cable. I would say any cable that does not produce sparkles or dropouts is a decent cable; make sure it's rated to whatever HDMI level the Monster one is (or at least the source and receiver are).

Make sure it's truly double blind; do not underestimate the subconscious abilities of yourself or whoever is helping you do your double-blind test.

Look back a few posts, I detailed exactly how to do a double blind test. It must be repeated multiple times in a controlled environment.

Here's something else to do... go to AVS Forum... ask the same question... despite the general hatred of Monster there, some very smart people will undoubtedly take the time to set you straight... you can trust them, or you can trust the guys selling you the ridiculously overpriced cable and people on the internet who say their HDMI is ghosting, but it turns out to always be something besides the cable.


You are asking me to delve deeper than I have learnt. All I was trying to point out was that it's not just as small an amount of data for each pixel as you initially suggested. Hence it's not an all-or-nothing game; you suggested before that the pixel will either be there or it won't. Well, if there are a number of contributing factors to the outcome of a pixel, then surely it can be different at the other end.

At the end of the day, the technical jargon is of no interest to me; I don't need to do the blind test. I switched the cable back and forth a number of times to make sure, and the Monster definitely had better black levels, a warmer, more vibrant colour, and less noise and graininess, though the latter was very subtle.
 
BTW I just rewatched that video, two things:

When he says audio/video dropouts/quality, he is totally fudging it and just looking for words to get the idea across. When he says quality, he either means dropouts and sparkles as quality, or he is just lying or wrong and means colors and hues. I guarantee you he means quality issues such as sparkles or dropouts.

Also, I never see him mention anything getting out of sync. In fact, he doesn't really make sense... he says that due to copper having resistance, not all the 1s and 0s get there at the same time... they aren't supposed to... they get there in a data stream... I really have no idea what he means by all getting there at once... it doesn't even make sense, yet he keeps saying it.

So that part doesn't even make sense; the rest of his explanation is based on the degradation of the signal (which is what an eye test checks), and that is what I addressed above with my crappy Paint drawings.

There is more than one transmitting wire in the cable, that's why. Notice there is more than one pin on the connector, obviously.
 
You are asking me to delve deeper than I have learnt. All I was trying to point out was that it's not just as small an amount of data for each pixel as you initially suggested. Hence it's not an all-or-nothing game; you suggested before that the pixel will either be there or it won't. Well, if there are a number of contributing factors to the outcome of a pixel, then surely it can be different at the other end.

Well, here's the problem: if you won't be bothered to understand the fundamentals, I can't really explain it to you without teaching them all to you anyway. And what will end up happening (and is happening) is that you are not actually bothering to learn anything; you are just going on what you know (which isn't enough), and that makes it difficult or impossible to explain this to you, because you can't explain it to someone who doesn't understand.

I will quickly say, you are correct, it's not all or nothing in terms of how many 1s and 0s arrive. But in the world of computer language, if you don't get all your bits, you can't construct your byte. If you get 7 bits and need 8 to make a byte, and you just got your stop bit... well, you are screwed; without error correction and a way to request a resend, you are just going to have to toss that piece of info.

That's why people say it's all or nothing.
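Here is a quick sketch of why (this is generic serial framing, not HDMI's actual TMDS encoding): drop a single bit and everything downstream re-frames wrong.

```python
# Three bytes sent back to back as a raw bit stream.
stream = "10000001" + "11110000" + "10101010"

def frame(bits):
    """Chop a bit stream into complete 8-bit bytes."""
    usable = len(bits) - len(bits) % 8
    return [bits[i:i + 8] for i in range(0, usable, 8)]

# One bit lost in transit shifts every later bit one position earlier.
received = stream[:4] + stream[5:]

assert frame(stream) == ["10000001", "11110000", "10101010"]
# Without error correction the receiver can't know a bit went missing,
# so every byte from the loss onward comes out wrong.
assert frame(received)[0] != "10000001"
assert frame(received)[1] != "11110000"
```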

As for the getting out of sync... I can't explain that away for you, because it makes no sense... the 1s and 0s arrive in order in a stream... they don't get all bunched up, they don't arrive in clumps...

When he is talking about resistance and slowing down, he is talking about the time it takes a voltage to change while moving over copper. When you remove voltage from a wire, it doesn't immediately go to 0; over a very short period of time it drops. An analog medium cannot jump from one charge state to another instantly; it must make the move in a continuous line.

At the end of the day, the technical jargon is of no interest to me; I don't need to do the blind test. I switched the cable back and forth a number of times to make sure, and the Monster definitely had better black levels, a warmer, more vibrant colour, and less noise and graininess, though the latter was very subtle.

If you won't be bothered to research your cables better (just assuming some things about them isn't good enough) and you won't be bothered to do a double blind test, then there really isn't much more to do about it... it just shows you aren't interested in being objective, you are interested in proving to yourself what you already know to be true... even if it's not :)

You seem like a good guy, someone who enjoys figuring stuff out and learning about it... you would do well to educate yourself more on the things you get into as I think having a strong foundation would lead you to a much more satisfying set of results!
 
There is more than one transmitting wire in the cable, that's why. Notice there is more than one pin on the connector, obviously.

Again, you are making assumptions on what you see and not what you know... this will continue to get you in trouble.

HDMI cables use three shielded copper pairs (the TMDS data channels) to carry the video, with the audio packed into the blanking intervals of the video stream, plus a fourth pair for the clock. The rest of the pins are for additional features outside video and audio (such as grounds, detecting that it's plugged in, and control signals).

Each pair carries the signal and its inverse for the purpose of overcoming signal degradation: the receiver compares the two wires, so interference that hits both equally cancels out. The end result is one data stream per pair, but each is being sent two ways.
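A toy model of that differential trick (made-up voltage levels; real TMDS uses current-mode signalling, but the cancellation idea is the same):

```python
# Differential signalling: send the signal and its inverse on a pair,
# then subtract at the receiver so shared interference cancels.
signal = [1.0, 0.0, 0.0, 1.0]   # voltages for the bits 1,0,0,1
inverse = [-v for v in signal]
noise = [0.3, -0.2, 0.4, 0.1]   # interference hitting both wires alike

wire_a = [s + n for s, n in zip(signal, noise)]
wire_b = [s + n for s, n in zip(inverse, noise)]

# (s + n) - (-s + n) = 2s: the common-mode noise drops out entirely.
recovered = [(a - b) / 2 for a, b in zip(wire_a, wire_b)]
assert all(abs(r - s) < 1e-9 for r, s in zip(recovered, signal))
```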
 
I only wish there were some way of me proving this to you via an image comparison. However, I don't think even my digital SLR would produce a noticeable result when put on the net and transmitted through your screen.

BTW, the cables I used, in the following order, improved in quality each time:

http://www.game.co.uk/PS3/Hardware/Accessory/~r330441/GAMEware-PlayStation-3-HDMI-Cable/

http://www.ross-style.com/performance-interconnects/hdmi-cable-2m

http://www.monstercable.com/productdisplay.asp?pin=3847
 
I only wish there were some way of me proving this to you via an image comparison. However, I don't think even my digital SLR would produce a noticeable result when put on the net and transmitted through your screen.

There is no way a picture would do it... I would have to see it in person, and I am not making the flight to sort this one out (although if I got paid by the word to post, I might have paid for it already! :D)

Look, let's lay a few things out here...

You seem to know a fair bit about electronics... are you computer savvy? At least a little? Do you know about RAM and how it works? I hope so...

Have you ever come across someone who says something like "I want to put my vacation photos on my computer but I only have 2Ghz of RAM"?

You say, "You mean GB and RAM has nothing to do with how many pictures you can store on your computer".

To which they say, "Well, my friend has one of those Apple computers with 4Ghz of RAM and he is always putting more pictures on his. I am pretty sure that's it, because I use his Apple all the time... maybe that's it... maybe I need to buy an Apple so I can fit my pictures on it!"

This person has seen just enough and knows just enough to get himself in trouble. He is relying on empirical data to draw assumptions, and those assumptions seem correct to him, because he doesn't have the knowledge to realize what the alternative reasons might be.

This is where we are right now... I don't mean this in an insulting way, but I know a lot more about the subject at hand than you do. That's just saying it how it is.

I am telling you what is possible and what's not. You are telling me what is happening.

I don't think you are lying.

But I think you are wrong about something, and that something is either whether it really is or why it is.

But until you do the equivalent of learning that RAM comes in GB and not Ghz, and where images are actually stored on a computer, and how a hard drive plays a role in your data storage... and until you stop citing pamphlets you saw at the Apple store and what the guy in the coffee shop told you about Apple computers to prove me wrong... well, I can't really explain it to you... and by then, I won't have to explain it to you :)

So rather than keep arguing with me here... why not do some research on the net, maybe between games of GT5P when you are getting mad at that lag :)

Read up a bit, learn it, and I bet in under a week you will have it all figured out and see the error of your ways.

I just hope that what you come up with is that your image really is better, just not for the reason you thought, because it would suck to educate yourself so much you realize you were just imagining it's better :D
 
That's a really nice discussion. :lol:
I mean, i knew all this stuff already, but reading through it in such a detail is cool.
Thanks Devedander. 👍
 
That's a really nice discussion. :lol:
I mean, i knew all this stuff already, but reading through it in such a detail is cool.
Thanks Devedander. 👍

Glad to see someone liked it... nice to think all these hours I have spent becoming pale and more nerdy in front of the computer have been a plus for someone :D
 
There is no way a picture would do it... I would have to see it in person and I am not making the the flight to sort this one out (although if I got paid by the word to post I might have paid for it already! :D)

Look, let's lay a few things out here...

You seem to know a fair bit about electronics... are you computer savy? At least a little? Do you know about RAM and how it works? I hope so...

Have you ever come across someone who says something like "I want to put my vacation photos on my computer but I only have 2Ghz of RAM".

You say, "You mean GB, and RAM has nothing to do with how many pictures you can store on your computer".

To which they say "Well my friend has one of those Apple computers with 4Ghz of RAM and he is always putting more pictures on his, I am pretty sure that's it because I use his Apple all the time... maybe that's it... maybe I need to buy an Apple so I can fit my pictures on it!"

This person has seen just enough and knows just enough to get himself in trouble. He is relying on empirical data to draw assumptions, and those assumptions seem correct to him, because he doesn't have the knowledge to realize what the alternative reasons might be.

This is where we are right now... I don't mean this in an insulting way, but I know a lot more about the subject at hand than you do. That's just saying it how it is.


The cable is the only thing that I changed, so what else could it be, assuming it's not my imagination as you seem to think? I'm grateful you are trying to explain all this technical stuff to me, but it really is of no importance. What I see with my own two eyes is what I'm going to believe at the end of the day. Science is always correcting itself, or finding new 'bits' to the puzzle, no pun intended lol.

I know how it works in theory, maybe not to the same extent as you, but I understood everything you said. I am in no way as clueless as the person in your analogy. I know a fair bit more than the average Joe in this area, that's for sure, especially considering my main course of study is design and engineering.
 

Sorry, that example was pretty extreme, I didn't mean to make you out to be that bad...

The thing is, if you understand everything I say, that means you can comprehend it (some people can't), and that's all the more reason you should take it on yourself to learn it rather than waste a brain that can understand it debating it before you have even learned it :)

Anyhow, take that example with a grain of salt... it was meant in a more analogous way :D RAM and GHz is a pretty silly mix-up, and learning where to store pictures is pretty simple too. The HDMI transmission protocol is more complicated, but it's the same process to learn, so I am not implying you are that stupid... just that you would do well to learn it before making statements about it.

Again, as for what could be causing it... who knows... the first step is always to double-blind it and make sure it's really there; from then on out, it's learning the details and eliminating what's not possible until you end up with (no matter how improbable) the solution! I ripped that off from a famous English guy :P

The only thing I can think of (and this is a HUGE stretch, I pretty much doubt it) is that you are getting a LOT of really TINY sparkles... maybe your cable is just the right level of crappiness to cause so many sparkles that it gives a grainy appearance to things, and the blacks have tons of little sparkles which (like noise on a high-ISO camera picture) grey them out a bit.

I dunno... that seems unlikely as I think the picture would just drop, but who knows...
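That sparkle theory is at least easy to sanity-check numerically. Here's a toy Python sketch of the idea: scatter random bright pixels ("sparkles") over a pure-black 1080p frame and see how much they lift its average level. The error rate and sparkle brightness are made-up illustration numbers, not measured HDMI error figures.

```python
# Toy model of the "tons of tiny sparkles" theory: random bright pixels
# scattered over an all-black frame raise its average brightness, which
# could read as greyed-out blacks / grain. Error rate and sparkle
# brightness are made-up illustration numbers, not real HDMI figures.
import random

def mean_black_level(pixels=1920 * 1080, error_rate=0.001, sparkle=255, seed=42):
    """Average 8-bit brightness of an all-black frame after random sparkles."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(pixels) if rng.random() < error_rate)
    return hits * sparkle / pixels

clean = mean_black_level(error_rate=0.0)    # perfect link: black stays 0.0
noisy = mean_black_level(error_rate=0.001)  # ~0.1% of pixels sparkle
print(clean, round(noisy, 3))
```

Even at a 0.1% error rate the black level creeps above zero, which is roughly the "greyed-out blacks" effect described, though whether a real marginal cable fails this gracefully rather than just dropping the picture is another question.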

This kind of reminds me of a funny story I heard... some satellite scientists were listening for radio waves, and every night at 6pm they would get exactly 6 minutes of perfect sine waves... it was regular and on time every day. The building was shielded so it wasn't outside interference, and the sine wave was unmistakable... the strength, the timeliness of it... it had to be extraterrestrial life contacting us!

Turns out the security guard had snuck a microwave in and was heating up his frozen dinners every night at 6pm for 6 minutes.
 

That last bit actually seems like it. Like I said, it seemed to have a more grainy appearance and the edges of things didn't seem as crisp. There are definitely speckles, but when I look at them, they are so small I would have assumed they would be negligible.

When I first tested the two cables, I was looking at the F40 in the garage. The red of the F40 seemed deeper, not pink, and the blacks more black. There was less sparkling around the wheels, and their sharpness seemed to improve quite a bit. And overall less noise, so maybe that is the reason; it does seem logical.
 
One was 1.0 or 1.1 and the other is 1.3?
 

I really wanna say that just feels more right, although to be honest, my experience with using 1.1 cables on a 1.3 device is nil... I had 1.3 cables well before I had 1.3 devices, so I am not a good person to judge whether the description of the degradation matches what really happens...
 
HDMI 1.0

Released December 2002.

* Single-cable digital audio/video connection with a maximum bitrate of 4.9Gbps. Supports up to 165Mpixels/sec video (1080p60Hz or UXGA) and 8-channel/192kHz/24-bit audio.

HDMI 1.1

Released May 2004.

* Added support for DVD Audio.

HDMI 1.2

Released August 2005.

* Added support for One Bit Audio, used on Super Audio CDs, up to 8 channels.
* Availability of HDMI Type A connector for PC sources.
* Ability for PC sources to use native RGB color-space while retaining the option to support the YCbCr CE color space.
* Requirement for HDMI 1.2 and later displays to support low-voltage sources.

HDMI 1.2a

Released December 2005.

* Fully specifies Consumer Electronic Control (CEC) features, command sets, and CEC compliance tests.

HDMI 1.3

Released 22 June 2006.[7] [8]

* Increases single-link bandwidth to 340 MHz (10.2 Gbps)
* Optionally supports 30-bit, 36-bit, and 48-bit xvYCC with Deep Color or over one billion colors, up from 24-bit sRGB or YCbCr in previous versions.
* Incorporates automatic audio syncing (lip sync) capability.
* Supports output of Dolby TrueHD and DTS-HD Master Audio streams for external decoding by AV receivers.[9] TrueHD and DTS-HD are lossless audio codec formats used on HD DVDs and Blu-ray Discs. If the disc player can decode these streams into uncompressed audio, then HDMI 1.3 is not necessary, as all versions of HDMI can transport uncompressed audio.
* Availability of a new mini connector for devices such as camcorders.[10]
* The Sony PlayStation 3 is the first product available on consumer market with HDMI 1.3.[11]
* Epson has released the EMP-TW1000 as the first display supporting 30-bit deep color.[12]

HDMI 1.3a

Released 10 November 2006.[13]

* Cable and Sink modifications for Type C
* Source termination recommendation
* Removed undershoot and maximum rise/fall time limits.
* CEC capacitance limits changed
* RGB video quantization range clarification
* CEC commands for timer control brought back in an altered form, audio control commands added.
* Concurrently released compliance test specification included.
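As an aside, the headline bitrates in the list above follow directly from the link math: HDMI carries three TMDS data channels, and each tick of the pixel clock moves 10 line bits per channel (TMDS encodes every 8 data bits into 10 line bits). A quick sanity check:

```python
# Sanity-check the quoted HDMI bitrates from the TMDS link parameters:
# 3 data channels x 10 line bits per channel per pixel clock tick
# (TMDS encodes each 8 data bits into 10 line bits).
CHANNELS = 3
LINE_BITS_PER_CLOCK = 10

def tmds_bitrate_gbps(pixel_clock_mhz):
    """Raw link bitrate in Gbps for a given TMDS pixel clock in MHz."""
    return pixel_clock_mhz * CHANNELS * LINE_BITS_PER_CLOCK / 1000

print(tmds_bitrate_gbps(165))  # HDMI 1.0: 4.95 Gbps (quoted above as 4.9)
print(tmds_bitrate_gbps(340))  # HDMI 1.3: 10.2 Gbps
```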

By the time I got my PS3, chances are 1.3 was the norm.
 
Didn't bother to read the whole thread, but here are my own impressions on the matter: GT5 lag has nothing to do with the game itself, but with the TV/monitor being used!

I have a 40" Sony D3500, and through HDMI every game is unplayable. We are talking about 50-100ms of output lag or something like that. My TV doesn't have a "GAME MODE" or similar, so it does all its processing and then displays the picture, very late.
Through a VGA connection from my PC, input lag is reduced significantly, but the TV only supports 1360x768...

So I bought a 200€ BenQ 24" 16:9 LCD display (native resolution of 1920x1080).
It has DVI/VGA/HDMI.
Playing GT5P through the BenQ is completely LAG FREE. Also, when the monitor is put close to the G25 chassis, you get really nice immersion with the 24" screen.

I will never go back to playing on my Sony TV. I know there are newer models from Sony and Samsung, but they have exactly the same issues even with game modes.
 

Depends which model you buy.

These two in the links are 0-10ms in game mode, but some other Sony TVs these days are 20-30ms in game mode and 40-50ms without. Older sets without game mode were 50-80ms.

http://www.hdtvtest.co.uk/Sony-KDL40W4000/Picture-Quality/
http://www.hdtvtest.co.uk/news/sony-kdl40w4500-20081116135.htm

The same tests have shown some Samsungs stay at 30-50ms whether game mode is on or off. But a lot of TVs and some Samsungs are now around 20ms in game mode. A recent Samsung tested at 93ms without game mode, so you have to be careful what you buy.

It really does vary, but you could easily have bought an HDTV that's below 20ms in game mode at any point in the last 2-3 years if you looked. It's also very easy to get lumbered with a new TV that's 50-100ms whatever you turn on and off.
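To put those milliseconds in perspective, it helps to convert them into frames of delay at 60fps (one frame lasts 1000/60 ≈ 16.7ms). A quick conversion, using the figures quoted in this thread:

```python
# Convert display lag in milliseconds into frames of delay at a given
# refresh rate; at 60fps one frame lasts 1000/60 ~= 16.7 ms.
def lag_in_frames(lag_ms, fps=60):
    return lag_ms * fps / 1000

for ms in (10, 20, 50, 100):
    print(f"{ms}ms of lag = {lag_in_frames(ms):.1f} frames at 60fps")
```

So the 93ms set is lagging nearly six frames behind the console, while a sub-20ms set stays within about one frame.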

--------------

dancardesigner, you can't get deeper reds and blacks or improve your picture on a PS3 game or current Blu-ray by buying a Monster cable, a 1.3 cable, or 1.3 sockets.

Deep Color is tech for 10bit/12bit/16bit monitors and media. 10bit and x.v.Color are not supported for PS3 games or Blu-ray.

You need to buy one of the 10bit monitors/HDTVs that have just been released, and a camera that records in 10bit, to see Deep Color (which is 30bit per pixel in total), or use other high-end equipment for playback.

* Optionally supports 30-bit, 36-bit, and 48-bit xvYCC with Deep Color.

30bit per pixel is 10bit per primary colour, and 10bit is what's used in the marketing for the new generation of TVs just coming out. 36bit per pixel is 12bit per primary colour, and so on. Nearly all displays are 8bit.

A cable from the 1.2 era will easily cope with an 8bit PS3/PC/Blu-ray player and also pass 1.3 tests. You'd need some really high-end equipment to stress a 1.2-spec cable.
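The colour counts behind those bit depths are just powers of two; a quick check of the "over one billion colors" claim from the 1.3 spec quoted above:

```python
# Total colours for a given bit depth per primary channel:
# 2 ** (bits_per_channel * 3 channels).
def colour_count(bits_per_channel):
    return 2 ** (bits_per_channel * 3)

print(colour_count(8))   # 24-bit: 16,777,216 (~16.7 million)
print(colour_count(10))  # 30-bit Deep Color: 1,073,741,824 (over a billion)
print(colour_count(12))  # 36-bit Deep Color: ~68.7 billion
```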
 
Just a friendly hint: don't you guys want to drop this issue (since it's not going to get any of you anywhere) and get back to topic? ;)
 