I'm already playing in 1080p. What possible advantage could HDMI provide? Is there ANY reason for me to upgrade AT ALL? No, there isn't. So my system isn't quite so obsolete, is it? Fact is, a lot of folks out there with 1080p TVs are currently playing 360 in 1080p.
You'd be singing a different tune if Sony went with their original plan and left HDMI out of the 20GB.
Fact is, HDMI isn't needed near as bad as you seem to think it is. It's turning into one of those "Wow, I've got to have that" things. Most people don't even know what the hell HDMI is, or what it's for, but dammit, they want it, and they'll pay through the nose for it. And they won't even know why.
Well, I DO know what HDMI is, and what it's for. And my Xbox360 does NOT need it.
I'm not so convinced you do fully understand what HDMI has to offer. Picture quality is not just about the resolution of the signal, though even there, with its superior 10.2 Gbps bandwidth, HDMI supports much higher resolution signals than component.
Not only does HDMI maintain an uninterrupted digital path to the display
(thus no D/A artifacts), but it supports up to 48-bit RGB or YCbCr color depth and the xvYCC color standard.
Then there are the obvious advantages of its audio capabilities, including automatic audio/video lip sync and Dolby TrueHD and DTS-HD Master Audio streams for external decoding.
I'm sorry, but HDMI, especially 1.3, is a major step forward in performance capability over component.
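To put some rough numbers on the bandwidth point, here's a quick back-of-the-envelope sketch (Python, my own figures; it ignores blanking intervals and encoding overhead, so treat it as ballpark only):

```python
# Raw active-pixel payload for 1080p60 at a few color depths.
# Ballpark only: ignores blanking intervals; the 10.2 Gbps HDMI 1.3 figure
# is the raw TMDS link rate (roughly 8.16 Gbps of actual video data).

def payload_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

for bpp in (24, 36, 48):  # 8, 12, and 16 bits per color channel
    print(f"1080p60 @ {bpp}-bit color: ~{payload_gbps(1920, 1080, 60, bpp):.1f} Gbps")

# ~3.0, ~4.5 and ~6.0 Gbps respectively -- the deeper color depths only fit
# because HDMI 1.3 has the extra headroom.
```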
HDMI is only there for that copy-protected crap, and maybe a few extra oohs and aahs from techies.
There's a little more to it than that.
...I'm not too worried. I'll be happy when I buy my $500 HDTV, run her at 720p, and just enjoy the damn game or the damn movie instead of worrying how pretty it is, or isn't. Because as we all know, unless you are really, really paying attention, most average people aren't going to see the difference between 720p, 1080i, 1080p, etc.
If someone can't tell the difference between watching native 1080p video on a 1080p display vs a 720p display they should immediately make an appointment with an ophthalmologist! Native 1080p video has more than twice the resolution of 720p, and yet even at 1080p, it doesn't match the amount of detail the human eye can see.
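The arithmetic is simple enough; a quick sketch just to spell it out:

```python
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_720p  = 1280 * 720    #   921,600 pixels
print(pixels_1080p / pixels_720p)   # 2.25 -- a bit more than twice the pixels
```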
...And added to that, all of this is irrelevant anyway. Now that there will be 2160p LCD screens on the market (
LINKY), 1080p is simply worthless... NOW I NEEDZA PSFOOOORRRR!!!
Well, besides the several inaccuracies in that article: as much as I am a huge supporter of 4K film & video
(3840x2160 and up), it is going to be many years before any studio releases any of its films in 4K, other than for commercial theatrical venues. It's going to be even longer, perhaps decades, before we EVER see any 4K video broadcast by the networks.
So for consumers, there won't be much of an advantage in owning displays with resolutions greater than 1920x1080. That said, I do have a friend in Dallas who bought Sony's SRX-R105, which is a 4K
(4096x2160) SXRD
(LCoS) projector, but he is a special case. Not only does he have a huge home theater and a very nice 35mm film projector, but he also has access to some 4K film transfers, something your average consumer won't likely ever be able to get their hands on.
1920x1080 displays will become the norm and will remain so for many years to come, seeing as the vast majority of HD video content will not exceed that resolution for a very long time.
I think the addition of HDMI -- if true -- is mostly marketing. There is the possibility that it would help with HD-DVD playback by allowing the HD flag crap, but the image quality improvements are debatable at best. It's a game console: the video processing is supplied by whoever could supply large volumes at the cheapest price. A good Toshiba unit (a company not exactly known for hi-fi video) should outperform it.
I'd give that half a look if it wasn't from Westinghouse.
This is 2K.
Good points. While no one can effectively argue against the benefits of HDMI, if the original video content isn't good, HDMI isn't likely to make it look all that much better compared to VGA or component. However, just the advantage of avoiding the XB360's D/A converter and the need for a secondary A/D conversion for digital displays should improve the picture quality.
I also share your concerns regarding Westinghouse... They even make RCAs look good and JVCs appear reliable by comparison.
Duċk wrote:
The X360 is all analog. How can a digital output (HDMI) be added to it when it won't be compatible? I would think they'd have to mess with the GPU...
Thankfully the XB360 is not all analog, only the outputs are. The graphics are by their nature all digital. In fact, the XB360 should look a little better with HDMI simply because it would avoid having to convert the digital data to analog for the output signal, and if the display is also digital, it avoids a second A/D conversion!
In addition, if MS supported higher bit rates and color depth, then an XB360 with HDMI would look significantly better!
What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution. So if you have a 1080p TV, it should support a VGA resolution of 1920x1080. If it doesn't, you got ripped. If you have a 1080p TV without VGA, you also got ripped.
While I mostly agree, one could also say that in some cases those who bought a Westinghouse TV also "got ripped".
The only time HDMI/DVI is actually required for 1080p is for playback of movies from HD-DVD or Blu-ray. Games don't have the copy-protection on their video outputs the way hi-def movies do. Don't let that fact confuse you. One 1080p is not the same as another.
Don't count on it. HDCP can be applied to games as well as downloaded and broadcast video content. However, the improved performance of HDMI should be enough of a reason to want it regardless of current and future use of HDCP content.
In regards to these "2K" monitors... it's utterly pointless at this time. 1080p is the standard, and it's here to stay for a very, VERY long time.
2K is 1920x1080. I believe you mean 4K, and in that case I mostly agree with you, but I wouldn't go so far as to say that it's utterly pointless. There are some benefits depending on the use. And for those that can access 4K content, then obviously it wouldn't be pointless.
Long before flat-panel monitors even existed, 1920x1080 was the standard resolution for a digital film master. When a film was digitally scanned for processing, visual effects, etc, it was done at 1080.
While 1920x1080 is a standard for some digital film masters, it certainly isn't the most common, nor is it the current reference standard. The most common is still 720x480 for DVD, of which hundreds of thousands of masters have been made. As for the reference quality standard, that has been, and continues to be 4K masters, not 2K.
All-digital films, like Pixar's CGI movies, are rendered at 1920x1080 (or thereabouts, due to differences in aspect ratios... the master render composition for Cars was full-screen, 1920x1440, and later cropped). The recent Star Wars movies? 1920x1080. The HD masters of the Original Trilogy? 1920x1080.
No, not all digital films are rendered at 1920x1080. There are now several 4K digital motion cameras in use, and for CGI, 4K renders have been used. Yes, the live-action sequences in SW: ROTS were shot in 1920x1080
(1080p/24 4:4:4 RGB) with Sony's HDC-F950 cameras, which is also why they have a slightly "soft" look due to the lower detail resolution compared to images captured on 35mm film. This is why the film industry, directors, and cinematographers are eagerly awaiting cost-effective 4K digital motion cameras, as 2K digital cameras simply cannot match the detail resolution of decent 35mm cameras.
Also, don't let the "K" fool you, either... the pixel size of a film master is based on it's horizontal resolution, not it's vertical. 1920x1080 actually IS what Hollywood refers to as "2K".
Correct! Although I and others have long argued that they should really drop the "K" and use "M", in terms of total pixel resolution, such that 1920x1080 would be 2M (~2 million/mega pixels) and not 2K, and 3840x2160 would be 8M (~8 million/mega pixels) and not 4K. Referring to total resolution instead of just horizontal resolution gives a more accurate representation of a display's resolution. It would also prevent any confusion over what 2K and 4K refer to, and keep some from thinking that 4K is only twice the resolution of 2K, which of course it is not.
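Here's the arithmetic behind that naming argument, spelled out as a quick sketch (my own illustration, not any official scheme):

```python
pixels_2k = 1920 * 1080   # 2,073,600 -> ~2 megapixels, so "2M"
pixels_4k = 3840 * 2160   # 8,294,400 -> ~8 megapixels, so "8M"
print(pixels_4k / pixels_2k)   # 4.0 -- "4K" is four times the pixels of "2K", not two
```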
On VERY rare occasions, a film will be scanned and mastered at a higher resolution.
4K is actually much more popular than you might realize. There are now at least four different production companies, as well as studio production houses that are doing 4K film scanning.
For example, when they recently did the remaster/restoration of Wizard of Oz and King Kong, the negatives were scanned at 4K resolution (4096x3072). However, the final "masters" for the DVDs were done at 1920x1080.
The DVD master is 720x480
(720x576 for the PAL release). The HD DVD master is 1920x1080.
The high-res scan was done only to make sure they got every last scrap of information off the film negative, largely because this was the last opportunity they were going to have. Also, when shrinking the image down to 1080, it helps reduce the level of visible film grain.
Even 4K scans cannot capture every scrap of detail that a quality 35mm motion picture camera can capture. Fortunately, this is NOT the last opportunity to scan film negatives. Unlike old cellulose nitrate film stock, modern polyester film stock is easier to preserve, which is good because it is estimated that many films will need to be scanned at over 15 megapixels
(twice that of 4K scanners) in order to at least capture all the original detail. Of course, resolution is but one of the factors needed to preserve film. The other problem with today's film scanners is that they are still limited in their ability to capture all the color depth and radiometric resolution of film. They are close, but not there yet.
And don't badmouth Westinghouse, either... I have one, and I love it.
I'm sorry, but Westinghouse's bad reputation is one they have earned all on their own. I'm glad you are happy with yours, but that would be the exception and not the rule, I'm afraid. Thomson, Magnavox, Westinghouse, RCA, JVC, and LG: these name brands have been significantly hurt by a history of products of poor quality and/or poor reliability. That isn't to say everything they have made is bad. I particularly feel bad for JVC, as they have always excelled at innovation... it's just that they have not been good at implementing their designs, especially in manufacturing them.
HDMI is just DVI with audio. It's not, at least yet, the be-all-end-all connection. In fact, right now it has quite a significant drawback: signal strength and integrity over cables longer than 5m.
Wrong on all counts. While HDMI and DVI are compatible, they are different, and in fact, HDMI is capable of longer runs than DVI. I've personally run a 15m HDMI cable with no drop in signal strength, and I'm not even sure I know what loss of "integrity" you are referring to.
Bottom line: HDMI has a broader bandwidth than DVI, larger color depth than DVI, and can have longer cable runs than DVI.
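If you want the actual link numbers, here's a rough sketch using the published pixel clocks (TMDS sends 10 bits per 8 bits of video data, so usable bandwidth is about 80% of the raw rate; ballpark figures only):

```python
def tmds_gbps(pixel_clock_mhz, channels=3, bits_per_symbol=10):
    return pixel_clock_mhz * 1e6 * channels * bits_per_symbol / 1e9

dvi_single_link = tmds_gbps(165)   # ~4.95 Gbps raw, ~3.96 Gbps of video data
hdmi_1_3        = tmds_gbps(340)   # ~10.2 Gbps raw, ~8.16 Gbps of video data
print(dvi_single_link, hdmi_1_3)
```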
That's where researching your purchase ahead of time comes into play. I made damn sure that my TV supported everything I wanted it to. It supports full 1080p over component, VGA, HDMI, and DVI.
Amen. However, your TV
(Westinghouse LVM-47w1) has processing errors for 1080p via its HDMI input:
Audioholics Review
We tested 1080p source input to the various connections on the LVM-47w1. Like the 42-inch model, the HDMI input was apparently throttled down to only (correctly) accept 720p/1080i inputs. When we fed all three digital inputs 1080p source material the DVI-D inputs didn’t produce the sparkle artifacts, but the HDMI input did. We’ve blown up the sparkle effects below and though they were not present in every scene, they certainly indicate that this display is not fully 1080p compliant on the HDMI inputs.
It also pays to make sure your display does real deinterlacing and 3:2 pulldown conversion for interlaced signals and film content.
For instance, the Westinghouse LVM-47w1 only deinterlaces through the HDMI input. 1080i signals through the DVI, VGA, and component inputs are scaled from 1920x540 to 1920x1080. This is why many owners of this and the 42" model complain about the "fuzziness" on vertical resolution when comparing 1080i through those inputs to that of the HDMI input. On AVS, someone posted pixel mapping test data that showed the Westinghouses lost 50% of the vertical detail from 1080i sources through those inputs.
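To illustrate why that scaling throws away detail, here's a toy sketch in Python/NumPy (my own illustration, nothing from the AVS test data): a 1080i frame arrives as two 540-line fields, and only weaving them back together recovers all 1080 distinct lines.

```python
import numpy as np

frame = np.arange(1080 * 4).reshape(1080, 4)        # stand-in for a 1920x1080 image
field_odd, field_even = frame[0::2], frame[1::2]    # the two 540-line fields of a 1080i frame

# Proper weave deinterlacing: interleave the fields -> the original 1080 lines come back.
weave = np.empty_like(frame)
weave[0::2], weave[1::2] = field_odd, field_even

# Scaling a single 540-line field up to 1080 lines (simple line doubling here):
# half of the original lines are never recovered, only guessed.
scaled = np.repeat(field_odd, 2, axis=0)

print(np.array_equal(weave, frame))    # True  -- full vertical detail
print(np.array_equal(scaled, frame))   # False -- 50% of the vertical detail is gone
```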
While I'm glad the LVM-47w1 worked out for you, I personally found its poor contrast
(~450:1 calibrated ANSI), black level crush, noticeable pixel structure, poor grayscale calibration tools, component overscan, and lack of any advanced scaling modes for SD to seriously detract from the quality of this display. Most decent in-depth reviews appear to share a similar opinion. Audioholics' highly respected HQV Bench Tests gave the LVM-47w1 a score of only 68 out of a possible 130. 👎
It failed several of the test requirements like:
- Processor Jaggies #2
- Motion Adaptive Noise Reduction
- Cadence 2:3:3:2 DV Cam
- Cadence 3:2:3:2:2 Vari-speed
- Cadence 5:5 Animation
- Cadence 6:4 Animation
- Cadence 8:7 Animation
And only marginally passed the Jaggies #1, Scrolling Horizontal, and Flagging tests.
However, it is less expensive than the Sony Bravias and Sharp Aquos, and if you could negotiate a significantly lower price, it could be a good value choice, that's for sure... assuming you get a trouble-free unit.
What you're referring to is the ICT (Image Constraint Tag), that will reduce the resolution of HD signals down to 480p over any analog connection.
Actually it would be scaled down to 540p
(1/4 HD).
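A quick sketch of the arithmetic behind that "1/4 HD" figure, since constraining to 540p halves both dimensions:

```python
full_hd     = 1920 * 1080               # 2,073,600 pixels
constrained = (1920 // 2) * (1080 // 2) # 960 x 540 = 518,400 pixels
print(constrained / full_hd)            # 0.25 -- a quarter of full HD
```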
Iceburns has it right... HDMI is just a connection. It's a wire, nothing more. It's not some sort of super technology that's going to revolutionize the video industry. The only reason it's so popular right now is because companies like Sony are touting that it's the only "true high definition" connection out there, and they're even restricting their own hardware to make it seem like the truth, by limiting 1080p video to HDMI-only.
WRONG. Please read up on HDMI before making such misleading comments. As mentioned earlier, HDMI is very different from component and VGA, and offers significant improvements.
(Whew... My fingers are tired.)