Yet *another* Model of 360

I had the Game Gear for a while before we got rid of it.

 

Please. :sly:
It would be better, but I haven't seen my Neo-Geo Pocket Color in 3 months (probably in my closet), and lord only knows where my Game Boy Color and black Game Boy Advance are.
 
I had a Virtual Boy. How many of you are gonna fess up to that little mistake?
 
I will. It's sitting in my closet. I got that Wario game, and some tennis game. Were there actually any other games for that headache-inducing POS?
 
We had a pinball game that was pretty addictive (though I guess that's 'cause pinball is addictive to begin with). I had the tennis game too, which was pretty good.
 
Well, actually my TV can't display 1080p through component or VGA, just via HDMI. So yeah, I'll have a huge advantage from this new console, and that's why I'll buy it in a heartbeat, whatever it costs...

But for 95% of people, HDMI is kind of useless indeed, at least for the moment...
 
I'm talking about the system itself, not its peripherals. A Core 360 can be upgraded to a Premium. A 20GB PS3 cannot. Things are built into the 60GB that cannot be built into the 20GB.

Example: I can unhook the HDD from my 360, plug in a wired controller, and bam.. instant Core. You can't do that with a PS3. Hence, different hardware.

So tell me this, then. Why is my Xbox360 going to be "obsolete"? I'm already playing in 1080p. What possible advantage could HDMI provide? Is there ANY reason for me to upgrade AT ALL? No, there isn't. So my system isn't quite so obsolete, is it? Fact is, a lot of folks out there with 1080p TVs are currently playing 360 in 1080p.

You'd be singing a different tune if Sony went with their original plan and left HDMI out of the 20GB.

Fact is, HDMI isn't needed near as bad as you seem to think it is. It's turning into one of those "Wow, I've got to have that" things. Most people don't even know what the hell HDMI is, or what it's for, but dammit, they want it, and they'll pay out the nose for it. And they won't even know why.

Well, I DO know what HDMI is, and what it's for. And my Xbox360 does NOT need it.

That's not what you're saying, though.

You're implying both "packages" are the same, that they have the exact same hardware. They don't.
 
When I think about this more, this really doesn't seem all that possible.

The X360 is all analog. How can a digital output (HDMI) be added to it when it won't be compatible? I would think they'd have to mess with the GPU...

Another thing is that MS probably has tons of one-off prototype X360 models that have some extra feature the standard production units don't. This could very easily be one of them.

I know that a revised X360 is on its way (65nm processor), but I doubt it'll include HDMI and a 120GB HDD.
 
Well, actually my TV can't display 1080p through component or VGA, just via HDMI. So yeah, I'll have a huge advantage from this new console, and that's why I'll buy it in a heartbeat, whatever it costs...

What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution. So if you have a 1080p TV, it should support a VGA resolution of 1920x1080. If it doesn't, you got ripped. If you have a 1080p TV without VGA, you also got ripped.

The only time HDMI/DVI is actually required for 1080p is for playback of movies from HD-DVD or Blu-ray. Games don't have the copy-protection on their video outputs the way hi-def movies do. Don't let that fact confuse you. One 1080p is not the same as another.

In regards to these "2K" monitors... it's utterly pointless at this time. 1080p is the standard, and it's here to stay for a very, VERY long time. Long before flat-panel monitors even existed, 1920x1080 was the standard resolution for a digital film master. When a film was digitally scanned for processing, visual effects, etc, it was done at 1080. All-digital films, like Pixar's CGI movies, are rendered at 1920x1080 (or thereabouts, due to the differences in aspect ratios.. the master render composition for Cars was full-screen, 1920x1440, and later cropped). The recent Star Wars movies? 1920x1080. The HD masters of the Original Trilogy? 1920x1080.

Also, don't let the "K" fool you, either... the pixel size of a film master is based on its horizontal resolution, not its vertical. 1920x1080 actually IS what Hollywood refers to as "2K".

On VERY rare occasions, a film will be scanned and mastered at a higher resolution. For example, when they recently did the remaster/restoration of Wizard of Oz and King Kong, the negatives were scanned at 4K resolution (4096x3072). However, the final "masters" for the DVDs were done at 1920x1080. The high-res scan was done only to make sure they got every last scrap of information off the film negative, largely because this was the last opportunity they were going to have. Also, when shrinking the image down to 1080, it helps reduce the level of visible film grain.
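If you want to see the grain point in rough numbers, here's a quick toy sketch (my own simplification, nothing to do with the actual restoration pipeline): shrinking pure random "grain" by 2x in each direction averages four pixels into one, which cuts its standard deviation roughly in half.

import numpy as np

grain = np.random.normal(0.0, 0.1, size=(3072, 4096))           # pretend this is pure film grain on a 4K scan
downscaled = grain.reshape(1536, 2, 2048, 2).mean(axis=(1, 3))   # 2x2 box-filter downscale to half size

print(grain.std())       # ~0.100
print(downscaled.std())  # ~0.050 -- grain amplitude roughly halved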

Basically, a TV with a higher resolution than 1080 is pointless. There won't be anything to display on such a TV for at least ten or fifteen years, if not longer. And by then, the TVs they're releasing now will be hopelessly obsolete.

And don't badmouth Westinghouse, either... I have one, and I love it. :)
 
I get 1080p through DVI on my receiver. I have the X360 VGA adapter and a VGA-DVI adapter.

HDMI is just DVI with audio. It's not, at least yet, the be-all-end-all connection. In fact, right now it has quite a significant drawback: signal strength and integrity over cables longer than 5m.

I also have HDMI output from my HD-DVD player anyways ;)
 
What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution. So if you have a 1080p TV, it should support a VGA resolution of 1920x1080. If it doesn't, you got ripped. If you have a 1080p TV without VGA, you also got ripped.
But that is exactly the problem. There are far too many HDTVs that will not accept 1080p via component or VGA even though their native resolution is 1080. I can think of several major companies, Sony included, whose sets do not support 1080p over analog inputs. To simplify things: HDMI will give you 1080p on a native-1080 TV that cannot accept the signal any other way. We can't all own the same TV or the "right" TV. But then, this wouldn't have been an issue if the 360 had an HDMI port in the first place.

Then there's DRM and HDCP on Blu-ray and HD DVD movies. Currently there is none. But who's to say the film companies won't demand it in 2 or 3 years' time, thereby making the HD DVD add-on useless? Well, there'd probably be a new Xbox by then anyway. Or it'll be all downloaded/streamed HD movies.
I can see this HDMI 360 dividing people's opinions, and Microsoft won't get an easy ride.
 
As far as system revisions go, Sony's network adapter was built into PS2s four years after the system's release, and that was a full system revision. Few people will care to upgrade to a 60GB PS3 because they don't need to. The 20GB gives you everything you need. Core 360s don't. It's a different kind of upgrade, so it's not a true comparison.

There hasn't been an official announcement yet, has there? But it may not be out until Halo 3 is near release.
 
What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution. So if you have a 1080p TV, it should support a VGA resolution of 1920x1080. If it doesn't, you got ripped. If you have a 1080p TV without VGA, you also got ripped.

The only time HDMI/DVI is actually required for 1080p is for playback of movies from HD-DVD or Blu-ray. Games don't have the copy-protection on their video outputs the way hi-def movies do. Don't let that fact confuse you. One 1080p is not the same as another.

I have the Sony KDS-55A2000, which is one of the best 1080p TVs out there for under $5,000. It even beat $10,000 Pioneer and $15,000 Panasonic plasmas in German hi-fi/TV magazine tests.
But the VGA/component inputs do not support 1080p, just 1080i. HDMI does, which is the important thing (DVD player, cable receiver, PS3 > HDMI), and oh well, the Xbox 360 looks excellent in 1080i as well. 2.5 sec response time, 10,000:1 contrast, 55 inches: I couldn't be happier. And if this new version gets released this year, everything is perfect ;)
 
But that is exactly the problem. There are far too many HDTVs that will not accept 1080p via component or VGA even though their native resolution is 1080. I can think of several major companies, Sony included, whose sets do not support 1080p over analog inputs. We can't all own the same TV or the "right" TV.

That's where researching your purchase ahead of time comes into play. I made damn sure that my TV supported everything I wanted it to. It supports full 1080p over component, VGA, HDMI, and DVI. The HDMI and both DVI ports also offer full HDCP compliance.

slackbladder
Then there's DRM and HDCP on Blu-ray and HD DVD movies. Currently there is none.

Oh yes there is. It's why the 360 can't do 1080p HD-DVD over component. It's a legal thing, not a hardware restriction. They got lucky and found a loophole that allows them to do it over VGA, though.

What you're referring to is the ICT (Image Constraint Tag), that will reduce the resolution of HD signals down to 480p over any analog connection. But again, that's only for movies. I don't plan on ever using my 360 to watch movies. I'll get a standalone player for that, and use either the HDMI or DVI ports on my TV.

Iceburns has it right.. HDMI is just a connection. It's a wire, nothing more. It's not some sort of super technology that's going to revolutionize the video industry. The only reason it's so popular right now is because companies like Sony are touting that it's the only "true high definition" connection out there, and they're even restricting their own hardware to make it seem like the truth, by limiting 1080p video to HDMI-only. Which is just them being sneaky and cheap, since there's no such actual restriction on any of the other inputs. It's only restricted because they made it that way.
 
Saw that on another forum. Surprisingly, it doesn't really prove much.

There are several things to consider:

1) The input information on the TV is never shown. Something pops up in the top left corner of the screen, but you can't read it. It would have been better for him to bring up the TV's system/input information screen and show the actual input signal that's being used, i.e. "HDMI 1920x1080p".

2) While he apparently goes to great "lengths" to show that the DVI connector is the only one being used, he most certainly does not show the entire back of the unit, nor what the other cable is that's briefly seen off to the right side of the monitor. Only someone with that exact make and model of monitor would know if all of the connectors are side by side, or if there could be another connector elsewhere.

Maybe I'm simply jaded because of my knowledge of filmmaking. It's surprisingly easy to make the audience see what you want them to see, especially when the audience wants to see it that way.

Maybe when I get a PS3, I'll make a video "proving" that you can play a PS3 game on an Xbox360, or vice versa.

Also, consider that if such a device exists, it exists only in the R&D labs at Microsoft. And very few people have access to it. If it were real, and you did post a video on the internet, your ass would be fired the instant you showed up to work on Monday.

Moreover, there's no proof that this is actually meant for production. It could be nothing more than a concept, an experiment. It could be made solely for comparison purposes, to see if an HDMI-equipped system really offers any benefit over the standard one.

I'll believe it when I hear it from Microsoft's own mouth. Not one minute before.
 
Sounds like a great idea, but isn't 120GB too much? :lol: What can we do when we already have the old model of the Xbox 360?

Well... you could do 1080p on some TVs like in Max_DC's case, and you can store many more demos and videos (this'd be great for the movie service). But other than that, there really aren't many new things you could do with it.
 
The CES keynote just ended, and guess what? Not one mention of any sort of hardware change for Xbox360.

The moral? Don't believe everything you see on the 'net.
 
What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution.

Not necessarily. There are lots of cheap 1080p displays to be found at Costco and Best Buy... Although I find it shocking that the Sony SXRD doesn't allow 1080p over even VGA. Must have something to do with cheaping out on A/D converters.

Jedi2016
The only time HDMI/DVI is actually required for 1080p is for playback of movies from HD-DVD or Blu-ray. Games don't have the copy-protection on their video outputs the way hi-def movies do.

With Microsoft involved, never say "never" to any bad idea.

Jedi2016
In regards to these "2K" monitors... it's utterly pointless at this time. 1080p is the standard, and it's here to stay for a very, VERY long time. Long before flat-panel monitors even existed, 1920x1080 was the standard resolution for a digital film master. When a film was digitally scanned for processing, visual effects, etc, it was done at 1080. All-digital films, like Pixar's CGI movies, are rendered at 1920x1080 (or thereabouts, due to the differences in aspect ratios.. the master render composition for Cars was full-screen, 1920x1440, and later cropped). The recent Star Wars movies? 1920x1080. The HD masters of the Original Trilogy? 1920x1080.

Long before HD became a standard for US broadcast, 2K meant 2048x1536. (You even alluded to this when you mentioned that 4K is 4096.) You'll also notice this on older D-ILA projectors that pre-date common HD broadcasts. Many films are mastered & edited at 1080p because it is easier to translate directly to HD broadcasts. I can only imagine the headaches involved when trying to cross-convert 2048 to 1920... As for whether Pixar films like "Cars" were rendered in 1080i/p: they weren't. I doubt it was even "mastered for DVD" in 1920, since they can simply render straight to DVD resolution. Faster, easier, fewer errors. I think you need this.

Jedi2016
Iceburns has it right.. HDMI is just a connection. It's a wire, nothing more. It's not some sort of super technology that's going to revolutionize the video industry.

No, the real reason is to vastly simplify the installation process. One wire per device: one from the DVD player to the receiver, one from your cable box, one from the XBOX/PS3, one to the TV, and in the darkness bind them. :odd:

Sorry, got lost for a minute there....


Jedi2016
And don't badmouth Westinghouse, either... I have one, and I love it. :)

Good luck, then. They haven't had quality products in over 40 years. They do much better at running CBS.


Duċk;2533155
The X360 is all analog. How can a digital output (HDMI) be added to it when it won't be compatible? I would think they'd have to mess with the GPU...

I defy you to make less sense. :odd: Licensing aside, it's actually cheaper & easier to convert to HDMI. No D/A converters, no re-clocking, no loss of image quality (theoretically). You just convert one digital video format (raw from the video buffer) to another (HDMI digital YCrCb/etc.).
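To illustrate the digital-to-digital point, here's a rough sketch (my own illustration, using the standard BT.709 coefficients; the 360's actual output stage isn't documented here): turning a frame-buffer RGB pixel into the YCbCr values HDMI carries is plain arithmetic, with no D/A step anywhere.

def rgb_to_ycbcr_bt709(r, g, b):
    """8-bit full-range RGB -> 8-bit studio-range YCbCr (BT.709 coefficients)."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    # Scale to studio range: Y in 16-235, Cb/Cr in 16-240 centered on 128
    return 16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr

print(rgb_to_ycbcr_bt709(255, 255, 255))  # white -> (235.0, 128.0, 128.0)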

Another thing is that MS probably has tons of one-off prototype X360 models that have some extra feature the standard production units don't.

That's certainly true. I hope it's not, but it's probably a safer bet. Definitely safer than saying the prototype had a built-in HD-DVD player. :lol:
 
One reason that MS didn't announce anything could be that they didn't want to piss off all those people who just bought their new 360 over the holidays... it might still be released in autumn, and there are enough conventions this year to announce it...

@Sony TVs and 1080p only via HDMI: well, personally I think they just wanted to push HDMI, a technology they co-invented, and make it clear that HDMI is the future. Tbh, my particular TV totally rocks and even 1080i looks awesome, so I can perfectly live with it, especially when there is a solution on the horizon.
This TV can do 1080p24, is 55 inches, and has a 2.5 sec response time and a 10,000:1 contrast ratio, so I won't complain...
 
As for whether Pixar films like "Cars" were rendered in 1080i/p: they weren't. I doubt it was even "mastered for DVD" in 1920, since they can simply render straight to DVD resolution. Faster, easier, fewer errors.

I didn't base that statement off of articles, I based it off of raw renders that I've seen. The composition entitled "render_comp_master" is at 1920x1440 resolution, full-screen ratio. The comp entitled "film_comp" is 1920x803, the film's final 2.39:1 aspect ratio. And there are indications in the image that it was rendered at that resolution, or very close to that resolution. Publicity images were rendered at 4K, sans motion blur, but parts of the image don't hold up at that resolution because they were designed for 2K.

And there's no way in hell they re-rendered the entire film in 720x480 for the DVD. For one, it would require them to completely re-composite the film, since settings for compositing and effects don't translate well when the resolution is changed. That's a HELL of a lot of work, which is utterly pointless when they already have a film master at HD resolution that they can simply resize.

Secondly, rendering it in that fashion would cause a large loss of quality. Ever watch Final Fantasy VII: Advent Children on an HDTV? It's fugly. Precisely because they rendered it at DVD resolution. Flickering in the hair filters and procedural textures, aliasing on just about everything.. it just doesn't hold up when viewed on a non-SD television. Yet I can pop in any other CGI film that saw a high-res theatrical release, and none of those problems are there. Believe me, rendering at SD is not worth it. I've done it, and it sucks. Even when my final output is SD, I still render and composite in a higher resolution and then shrink it down.
 
Maybe when I get a PS3, I'll make a video "proving" that you can play a PS3 game on an Xbox360, or vice versa.
You cannot play PS3 games on an Xbox 360, because the hardware and games are different between the two formats; also, the Xbox 360 uses games on DVD, while the PS3 uses Blu-ray.
 
I didn't base that statement off of articles, I based it off of raw renders that I've seen. The composition entitled "render_comp_master" is at 1920x1440 resolution, full-screen ratio. The comp entitled "film_comp" is 1920x803, the film's final 2.39:1 aspect ratio. And there are indications in the image that it was rendered at that resolution, or very close to that resolution. Publicity images were rendered at 4K, sans motion blur, but parts of the image don't hold up at that resolution because they were designed for 2K.


And you saw this where? So now you work for Pixar or some other Disney partner?
 
You cannot play PS3 games on an Xbox 360, because the hardware and games are different between the two formats; also, the Xbox 360 uses games on DVD, while the PS3 uses Blu-ray.
I'm well aware of that. I was referring to how it would be possible to create a video that "showed" it could be done. Various tricks of displaying things on the TV, maybe some slight VFX here and there.

And you saw this where? So now you work for Pixar or some other Disney partner?
It's called a "press packet", or something to that effect. Shortly after the film was released, I had temporary access to insider press-release information from Pixar, and I grabbed about half a dozen images. Each had a small bit at the bottom saying where the image was taken from.

For example, at the bottom of an image of Lightning McQueen driving away from Radiator Springs after his "escape" from Bessie:

Cars/CAPS Image/Pixar Creative Services
generated from element: film_comp
c43_28cr.sel8.85.tif - 2005:10:07 14:39:39 - (1920x803)


And so on and so forth. I'm not certain what most of the numbers are across the bottom.. probably scene files and so on, divvied up by scene, shot, and frame number.

Because it was a private press release, I don't think it would be proper for me to post the images themselves, if that's what you're asking, although you may be able to find them out there somewhere.
 
I'm already playing in 1080p. What possible advantage could HDMI provide? Is there ANY reason for me to upgrade AT ALL? No, there isn't. So my system isn't quite so obsolete, is it? Fact is, a lot of folks out there with 1080p TVs are currently playing 360 in 1080p.

You'd be singing a different tune if Sony went with their original plan and left HDMI out of the 20GB.

Fact is, HDMI isn't needed near as bad as you seem to think it is. It's turning into one of those "Wow, I've got to have that" things. Most people don't even know what the hell HDMI is, or what it's for, but dammit, they want it, and they'll pay out the nose for it. And they won't even know why.

Well, I DO know what HDMI is, and what it's for. And my Xbox360 does NOT need it.
I'm not so convinced you do fully understand what HDMI has to offer. Picture quality is not just about the resolution of the signal, though even there, with its superior 10.2 Gbps bandwidth, HDMI supports much higher resolution signals than component.

Not only does HDMI maintain an uninterrupted digital path to the display (thus no D/A artifacts), but it supports up to 48-bit RGB or YCbCr color depths and the xvYCC color standard.

Then there are the obvious advantages of its audio capabilities including automatic audio lip syncing and Dolby TrueHD and DTS-HD Master Audio streams for external decoding.

I'm sorry, but HDMI, especially 1.3, is a major step forward in performance capability over component.
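For anyone who wants to sanity-check the bandwidth angle, here's a quick back-of-the-envelope calculation (my own rough numbers; it ignores blanking intervals and TMDS encoding overhead, so real link rates run higher): the raw pixel payload of 1080p60 at a few color depths, next to HDMI 1.3's quoted 10.2 Gbps.

def pixel_payload_gbps(width, height, fps, bits_per_pixel):
    """Raw active-pixel data rate, ignoring blanking and encoding overhead."""
    return width * height * fps * bits_per_pixel / 1e9

for bpp in (24, 30, 36, 48):  # 8, 10, 12, 16 bits per color channel
    rate = pixel_payload_gbps(1920, 1080, 60, bpp)
    print(f"1080p60 at {bpp} bpp: {rate:.2f} Gbps (HDMI 1.3 quoted capacity ~10.2 Gbps)")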







HDMI is only there for that copy-protected crap, and maybe a few extra oohs and aahs from techies.
There's a little more to it than that.

...I'm not too worried. I'll be happy when I buy my $500 HDTV, run her at 720p, and just enjoy the damn game or the damn movie instead of worrying how pretty it is, or isn't. Because as we all know, unless you are really, really paying attention, most average people aren't going to see the difference between 720p, 1080i, 1080p, etc.
:lol: If someone can't tell the difference between watching native 1080p video on a 1080p display vs. a 720p display, they should immediately make an appointment with an ophthalmologist! Native 1080p video has more than twice the resolution of 720p, and yet even at 1080p, it doesn't match the amount of detail the human eye can see.
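The arithmetic behind "more than twice the resolution", for anyone who wants to check it:

print(1920 * 1080)                   # 2,073,600 pixels per 1080p frame
print(1280 * 720)                    # 921,600 pixels per 720p frame
print((1920 * 1080) / (1280 * 720))  # 2.25 -- a bit more than twice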

...And added to that, all of this is irrelevant anyway. Now that there will be 2160p LCD screens on the market (LINKY), 1080p is simply worthless... NOW I NEEDZA PSFOOOORRRR!!!
Well, besides several inaccuracies in that article: as much as I am a huge supporter of 4K film & video (3840x2160 and up), it is going to be many years before any studio releases any of their films in 4K, other than for commercial theatrical venues. It's going to be even longer, perhaps decades, before we EVER see any 4K video broadcast by the networks.

So for consumers, there won't be much of an advantage in owning displays with resolutions greater than 1920x1080. That said, I do have a friend in Dallas who bought Sony's SRX-R105, which is a 4K (4096x2160) SXRD (LCoS) projector, but he is a special case. Not only does he have a huge home theater and own a very nice 35mm film projector, but he also has access to some 4K film transfers, something your average consumer won't likely ever be able to get their hands on.

1920x1080 displays will become the norm and will remain that way for many years to come seeing as the vast majority of HD video content will not exceed 1920x1080 for a very long time.







I think the addition of HDMI -- if true -- is mostly marketing. There is the possibility that it would help with HD-DVD playback by allowing the HD flag crap, but the image quality improvements are debatable at best. It's a game console: the video processing is supplied by whoever could supply large volumes for the cheapest price. A good Toshiba unit (a company not exactly known for hi-fi video) should outperform it.

I'd give that half a look if it wasn't from Westinghouse. :yuck: This is 2K.
Good points. While no one can effectively argue against the benefits of HDMI, if the original video content isn't good, HDMI isn't likely going to make it look all that much better compared to VGA or component. However, just the advantage of avoiding the XB360's D/A converter and the need for a secondary A/D conversion for digital displays should improve the picture quality.

I also share your same concerns regarding Westinghouse... They even make RCAs look good and JVC's appear reliable by comparison. :)







Duċk;2533155
The X360 is all analog. How can a digital output (HDMI) be added to it when it won't be compatible? I would think they'd have to mess with the GPU...
Thankfully the XB360 is not all analog, only the outputs are. The graphics are by their nature all digital. In fact, the XB360 should look a little better with HDMI simply because it would avoid having to convert the digital data to analog for the output signal, and if the display is also digital, it avoids a second A/D conversion!

In addition, if MS supported higher bit rates and color depth, then an XB360 with HDMI would look significantly better!







What kind of TV do you have? Even without component support (which is indeed rare), any TV with a VGA connection should absolutely include support for its native resolution. So if you have a 1080p TV, it should support a VGA resolution of 1920x1080. If it doesn't, you got ripped. If you have a 1080p TV without VGA, you also got ripped.
While I mostly agree, one could also say that in some cases those who bought a Westinghouse TV also "got ripped". ;)

The only time HDMI/DVI is actually required for 1080p is for playback of movies from HD-DVD or Blu-ray. Games don't have the copy-protection on their video outputs the way hi-def movies do. Don't let that fact confuse you. One 1080p is not the same as another.
Don't count on it. HDCP can be applied to games as well as downloaded and broadcast video content. However, the improved performance of HDMI should be enough of a reason to want it regardless of current and future use of HDCP content.

In regards to these "2K" monitors... it's utterly pointless at this time. 1080p is the standard, and it's here to stay for a very, VERY long time.
2K is 1920x1080. I believe you mean 4K, and in that case I mostly agree with you, but I wouldn't go so far as to say that it's utterly pointless. There are some benefits depending on the use. And for those that can access 4K content, then obviously it wouldn't be pointless.

Long before flat-panel monitors even existed, 1920x1080 was the standard resolution for a digital film master. When a film was digitally scanned for processing, visual effects, etc, it was done at 1080.
While 1920x1080 is a standard for some digital film masters, it certainly isn't the most common, nor is it the current reference standard. The most common is still 720x480 for DVD, of which hundreds of thousands of masters have been made. As for the reference quality standard, that has been, and continues to be, 4K masters, not 2K.

All-digital films, like Pixar's CGI movies, are rendered at 1920x1080 (or thereabouts, due to the differences in aspect ratios.. the master render composition for Cars was full-screen, 1920x1440, and later cropped). The recent Star Wars movies? 1920x1080. The HD masters of the Original Trilogy? 1920x1080.
No, not all digital films are rendered at 1920x1080. There are now several 4K digital motion cameras being used, and for CGI, 4K renders have been used. Yes, the live action sequences in SW: ROTS were shot in 1920x1080 (1080p/24 4:4:4 RGB) with Sony's HDC-F950 cameras, which is also why they have a slightly "soft" look due to the lower detail resolution compared to images captured on 35mm film. This is why the film industry, directors, and cinematographers are eagerly waiting for cost effective 4K digital motion cameras, as 2K digital cameras simply cannot match the detail resolution of decent 35mm cameras.

Also, don't let the "K" fool you, either... the pixel size of a film master is based on its horizontal resolution, not its vertical. 1920x1080 actually IS what Hollywood refers to as "2K".
Correct! Although I and others have long argued that they should really drop the "K" and use "M" in terms of total pixel resolution, such that 1920x1080 would be 2M (~2 megapixels) and not 2K, and 3840x2160 would be 8M (~8 megapixels) and not 4K. Referring to total resolution instead of just horizontal resolution gives a more accurate representation of a display's resolution. This would also prevent any confusion over what 2K and 4K refer to, and keep some from thinking that 4K is only twice the resolution of 2K, which of course it is not.
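To put that last point in plain numbers:

print(1920 * 1080)                   # ~2.07 million pixels -> "2M"
print(3840 * 2160)                   # ~8.29 million pixels -> "8M"
print((3840 * 2160) / (1920 * 1080)) # 4.0 -- four times the pixels, not twice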

On VERY rare occasions, a film will be scanned and mastered at a higher resolution.
4K is actually much more popular than you might realize. There are now at least four different production companies, as well as studio production houses that are doing 4K film scanning.

For example, when they recently did the remaster/restoration of Wizard of Oz and King Kong, the negatives were scanned at 4K resolution (4096x3072). However, the final "masters" for the DVDs were done at 1920x1080.
The DVD master is 720x480 (720x576 for the PAL release). The HD DVD master is 1920x1080.

The high-res scan was done only to make sure they got every last scrap of information off the film negative, largely because this was the last opportunity they were going to have. Also, when shrinking the image down to 1080, it helps reduce the level of visible film grain.
Even 4K scans cannot capture every scrap of detail that a quality 35mm motion picture camera can capture. Fortunately this is NOT the last opportunity to scan film negatives. Unlike old cellulose nitrate film stock, modern polyester film stock is easier to preserve, which is good because it is estimated that for many films, they will need to be scanned at over 15 megapixels (twice that of 4K scanners) in order to at least capture all the original detail. Of course resolution is but one of the factors needed to preserve film. The other problem with today's film scanners is that they are still limited in their ability to capture all the color depth and radiometric resolution of film. They are close, but not there yet.

And don't badmouth Westinghouse, either... I have one, and I love it. :)
I'm sorry, but Westinghouse's bad reputation is one they have earned all on their own. I'm glad you are happy with yours, but that would be the exception and not the rule, I'm afraid. Thomson, Magnavox, Westinghouse, RCA, JVC, and LG: these name brands have been significantly hurt by a history of products with poor quality and/or poor reliability. That isn't to say everything they have made is bad. I particularly feel bad for JVC, as they have always excelled at innovation... just that they have not been good at implementing their designs, especially in manufacturing them.







HDMI is just DVI with audio. It's not, at least yet, the be-all-end-all connection. In fact, right now it has quite a significant drawback: signal strength and integrity over cables longer than 5m.
Wrong on all counts. While HDMI and DVI are compatible, they are different, and in fact, HDMI is capable of longer runs than DVI. I've personally run a 15m HDMI cable with no drop in signal strength, and I'm not even sure I know what loss of "integrity" you are referring to.

Bottom line: HDMI has a broader bandwidth than DVI, larger color depth than DVI, and can have longer cable runs than DVI.







That's where researching your purchase ahead of time comes into play. I made damn sure that my TV supported everything I wanted it to. It supports full 1080p over component, VGA, HDMI, and DVI.
Amen. However, your TV (Westinghouse LVM-47w1) has processing errors for 1080p via its HDMI input:
Audioholics Review
We tested 1080p source input to the various connections on the LVM-47w1. Like the 42-inch model, the HDMI input was apparently throttled down to only (correctly) accept 720p/1080i inputs. When we fed all three digital inputs 1080p source material the DVI-D inputs didn’t produce the sparkle artifacts, but the HDMI input did. We’ve blown up the sparkle effects below and though they were not present in every scene, they certainly indicate that this display is not fully 1080p compliant on the HDMI inputs.
It also pays to make sure your display does real deinterlacing and 3:2 pulldown conversion for interlaced signals and film content.

For instance, the Westinghouse LVM-47w1 only deinterlaces through the HDMI input. 1080i signals through the DVI, VGA, and component inputs are scaled from 1920x540 to 1920x1080. This is why many owners of this and the 42" model complain about the "fuzziness" in vertical resolution when comparing 1080i through those inputs to that of the HDMI input. On AVS, someone posted pixel-mapping test data that showed the Westinghouses lost 50% of the vertical detail from 1080i sources through those inputs.
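Here's a toy sketch of what that difference means (a simplification of mine, not the Westinghouse's actual processing): a static 1080i frame arrives as two 1920x540 fields; weaving them back together recovers all 1080 lines, while line-doubling a single field throws half the vertical detail away.

import numpy as np

frame  = np.random.rand(1080, 1920)   # pretend progressive source image
top    = frame[0::2, :]               # even lines -> one 1920x540 field
bottom = frame[1::2, :]               # odd lines  -> the other field

# Weave: interleave the two fields -> the full 1080-line image comes back
woven = np.empty_like(frame)
woven[0::2, :] = top
woven[1::2, :] = bottom

# Cheap path: scale a single 540-line field up to 1080 lines
doubled = np.repeat(top, 2, axis=0)

print(np.array_equal(woven, frame))           # True: nothing lost
print(float(np.abs(doubled - frame).mean()))  # > 0: vertical detail discarded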

While I'm glad the LVM-47w1 worked out for you, personally I found its poor contrast (~450:1 calibrated ANSI), black-level crush, noticeable pixel structure, poor grayscale calibration tools, component overscan, and lack of any advanced scaling modes for SD to seriously detract from the quality of this display. Most decent in-depth reviews appear to share a similar opinion. Audioholics' highly respected HQV Bench Tests gave the LVM-47w1 a score of only 68 out of a possible 130. 👎

It failed several of the test requirements like:
  • Processor Jaggies #2
  • Motion adaptive Noise Reduction
  • Cadence 2:3:3:2 DV Cam
  • Cadence 3:2:3:2:2 Vari-speed
  • Cadence 5:5 Animation
  • Cadence 6:4 Animation
  • Cadence 8:7 animation
And only marginally passed the Jaggies #1, Scrolling Horizontal, and Flagging tests.

However, it is less expensive than the Sony Bravias and Sharp Aquos, and if you could negotiate a significantly lower price then it could be a good value choice, that's for sure... assuming you get a trouble-free unit.

What you're referring to is the ICT (Image Constraint Tag), that will reduce the resolution of HD signals down to 480p over any analog connection.
Actually it would be scaled down to 540p (1/4 HD).
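The quick arithmetic on that, for reference:

print(960 * 540)                   # 518,400 pixels at 540p
print(1920 * 1080)                 # 2,073,600 pixels at 1080p
print((960 * 540) / (1920 * 1080)) # 0.25 -- exactly a quarter of full HD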

Iceburns has it right.. HDMI is just a connection. It's a wire, nothing more. It's not some sort of super technology that's going to revolutionize the video industry. The only reason it's so popular right now is because companies like Sony are touting that it's the only "true high definition" connection out there, and they're even restricting their own hardware to make it seem like the truth, by limiting 1080p video to HDMI-only.
WRONG. Please read up on HDMI before making such misleading comments. As mentioned earlier, HDMI is very different from component and VGA, and offers significant improvements.


(Whew... My fingers are tired.) ;)
 