CAMAROBOY69
Not sure if you have viewed this thread at all, but I have already tried four TVs.
https://www.gtplanet.net/forum/showthread.php?t=237042
You said in one of the threads that you have a plasma. I heard that's the best for gaming, but I've also heard a lot of bad things about plasma. What is your feedback on yours? So far, out of the four TVs I have tried, I like the Sony LED the best, but it was only 60 Hz, which created a ton of motion blur and jaggedness with any motion. Other than the 60 Hz, I love the Sony Bravia. The Samsung was 120 Hz and handled motion blur very well in games, but it had strange choppiness in movies.
Problem is that the only 120 Hz Sony they have is 40". So I am bringing my 360 to ABC Warehouse on Friday and plan to check out the 40" Sony LED TV and probably some plasmas while I am there. Not sure if Sony makes a plasma, but I really like the Sony Bravia a lot.
Well, at the time I purchased my plasma, LEDs weren't even around yet. It was LCD, DLP and plasma. Mine is a Pioneer Elite 60". It still has an amazing picture with some of the best contrast and deepest blacks I have seen, even now...and this TV is 4+ years old. I will say, for motion blur, it can't come close to the 120 Hz or 240 Hz TVs that are out there now.
There is definite motion blur going on with fast-paced movies. I don't notice it as much with gaming. Jagged edges are pretty much non-existent when I am gaming on it from my computer; the 360/PS3, however, are awful in comparison. All three go over HDMI into my receiver and then to my TV, so they are all on the same signal path. The only real difference is raw computing power and what my computer's graphics card is capable of compared to the 360/PS3. Quite honestly, I have just about quit playing all my console games because they look low-res to me now, even 1080p titles. Some titles do look better than others, though. I remember Drake's Fortune looking pretty clean with very few jaggies, as well as GOW, but it has been ages since I've fired up either one. My guess is if I were to look at them now I would say, "EEEK! What is up with all the JAGGIES???".
Like I said in the other thread, I am gaming on my 27" computer monitor at 2560x1440. I actually turn off all my FSAA, as it really isn't required at that resolution. In my opinion, the next step for the HD standard would be a higher resolution like that, and TVs will have to follow. Since movies, namely Blu-rays, seem to be immune to these graphical flaws, the only push for those higher resolutions is the gaming community. I just don't see the movie industry getting on board with a hyper super HD platform (other than IMAX, maybe).
Game publishers need to have higher standards, IMO. The simple fact that some titles on the PS3 look great while others look like pure horse poop irritates the living crap out of me. Take The Witcher II, for example. Here is a game that is using DirectX 9 for its graphics, in an industry where EVERYONE else is developing games using features built into DX11. My point is that The Witcher II actually utilizes the full ability of its technology because the developers took the time and effort to do so.
I can only imagine that these other publishers are trying to crank out games just as fast and furious as they can, as time is money.
You mentioned jaggies on the larger televisions vs. your older, smaller TV. There is an element of viewing distance: the further away you are, I reason, the smoother everything will look on those larger TVs.
Long story short, great idea taking your 360 in to test the TVs. Make sure you view the TV in the warehouse from about the same distance you would be viewing it at home, to get an accurate idea of what it will really look like.