720p/1080p compatibility.....


How do you know if a 720p set accepts a 1080p signal?...the manual says 1080i but makes no mention of a 1080p format...the set has HDMI connections...Sony KDS-50E2000
128x128phasecorrect
Nope... Unless something's changed recently, 1080i is as good as it gets on satellite. The Sony will most likely accept 1080p...it will just downconvert it to 720p. So basically you'll get a good picture...it just won't be the full resolution. You're missing more than a million pixels that are needed to display Blu-ray at its finest.
Dish and DirecTV don't even have real HD in the numbers they love to brag about. See for yourself: look at HGTV, TNT, Food Network, and too many more to list, and notice that where the black bars would be, the image is stretched to fill the screen while looking normal in the central 4:3 area. It's almost fraud to tout so much HD content when it's just digitally stretched and force-justified to the screen, maybe even upsampled, but it's enhanced SD (standard definition), not true HD by a long shot.
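For what it's worth, the "missing more than a million pixels" figure from the previous reply checks out. A quick back-of-the-envelope calculation (Python, purely for illustration):

```python
# Compare the pixel counts of a full-HD source and a typical 720p panel.
full_hd = 1920 * 1080    # native Blu-ray resolution
hd_720 = 1280 * 720      # typical "720p" panel

print(full_hd)           # 2,073,600 pixels
print(hd_720)            # 921,600 pixels
print(full_hd - hd_720)  # 1,152,000 pixels discarded in the downconversion
```

So a 720p set throws away a bit over half the pixels in a 1080p source.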
Sorry. My bad…

1080 is 1080, regardless, in terms of displayed content. The interlaced version scans the image twice, laying in lines of resolution: first the odd lines, then the even lines. Progressive goes sequentially: 1, 2, 3 ... up to 1,080, making for a smoother displayed image. Both formats deliver a full frame 30 times a second.

My old 4:3 TV must have been scaling the picture to its 4:3 layout, so it looked no different to me. On the 720p set, though, it was more obviously a widescreen 16:9 image.

From:
http://reviews.cnet.com/720p-vs-1080p-hdtv/?tag=rb_content;rb_mtx

12-5-07

1. What's so great about 1080p?
1080p resolution--which equates to 1,920x1,080 pixels--is the current Holy Grail of HDTV resolution. That's because most 1080p HDTVs are capable of displaying every pixel of the highest-resolution HD broadcasts. They offer more than twice the resolution of step-down models, which are typically 1,366x768, 1,280x720, or 1,024x768. These days, HDTVs with any of those three lower resolutions are typically called "720p." Nobody wants to remember all those numbers, and "768p" doesn't really roll off the tongue.

…

3. Why is 1080p theoretically better than 1080i?
1080i, the former king of the HDTV hill, actually boasts an identical 1,920x1,080 resolution but conveys the images in an interlaced format (the i in 1080i). In a tube-based television, otherwise known as a CRT, 1080i sources get "painted" on the screen sequentially: the odd-numbered lines of resolution appear on your screen first, followed by the even-numbered lines--all within 1/30 of a second. Progressive-scan formats such as 480p, 720p, and 1080p convey all of the lines of resolution sequentially in a single pass, which makes for a smoother, cleaner image, especially with sports and other motion-intensive content.
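The odd-field/even-field split described above can be sketched in a few lines of Python (a toy illustration, not how any real TV is implemented):

```python
# Toy model of interlaced vs. progressive delivery.
# A "frame" here is just a list of numbered scan lines, 1..1080.
frame = list(range(1, 1081))

# 1080i: the odd-numbered lines arrive first, then the even-numbered ones.
odd_field = frame[0::2]   # lines 1, 3, 5, ...
even_field = frame[1::2]  # lines 2, 4, 6, ...

# 1080p: every line arrives in order, in a single pass.
progressive = frame

print(odd_field[:3], even_field[:3])    # [1, 3, 5] [2, 4, 6]
print(len(odd_field), len(even_field))  # 540 540
```

Each interlaced field carries only half the lines (540), which is why two passes are needed for one complete 1,080-line picture.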

4. What content is available in 1080p?
Today's high-def broadcasts are done in either 1080i or 720p, and there's little or no chance they'll jump to 1080p any time soon because of bandwidth issues. Even the much-vaunted high-def games on the Xbox 360 and the PlayStation 3 are usually 720p native (if not less), though they can be upscaled to 1080i or 1080p in the user settings of those consoles. Really, the only commercially available way to get true 1080p output--aside from hooking your PC to your HDTV--is to get a Blu-ray or HD DVD player. All Blu-ray players and some high-end HD DVD models support 1080p output, and--more importantly--the vast majority of discs are natively encoded at 1080p.

5. What kinds of TV technologies offer 1080p resolution?
These days, everything but CRT (tube) TVs comes in 1080p versions. That means you can find 1080p-capable versions utilizing all fixed-pixel technologies, including microdisplays (DLP, LCoS, and LCD rear-projection/front-projection) and flat-panels (plasma and LCD). Of course, as specified above, more affordable entry-level models are still limited to 720p resolution. But whatever the resolution, all fixed-pixel (non-CRT) TVs are essentially progressive-scan technologies, so when the incoming source is interlaced (1080i, or even good old-fashioned 480i standard-definition), they convert it to progressive-scan for display.

At this point, I could just expand on that last point and specify that all fixed-pixel display TVs--all microdisplay rear-projection and all flat-panels--always display everything at their native resolution, which is all they can display. On a 720p TV, that means that all incoming video is displayed at 720p (or 768p, as the case may be); on a 1080p TV, all incoming video is displayed at 1080p. The process of converting resolution is called scaling--sometimes called upconverting or downconverting. A related factor is deinterlacing (see point no. 8, below). How well a TV handles both of these processes is a big factor in how desirable it is--and something that casual shoppers often overlook, since, compared to the screen size or resolution, it's not as easy to show as a spec-sheet bullet point.
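The "everything gets mapped to the native grid" idea is easy to sketch. This is a deliberately naive nearest-neighbor version (real scalers filter and interpolate, so treat it only as a sketch of the resampling concept):

```python
# Minimal sketch of what a fixed-pixel TV's scaler does: map every line
# of the panel's native grid back to the nearest source line.
def scale_lines(src_lines, native_lines):
    """For each output line, pick the nearest source line (naive scaler)."""
    return [round(i * src_lines / native_lines) for i in range(native_lines)]

# 1080-line source shown on a 720-line panel: downconversion.
down = scale_lines(1080, 720)
# 480-line DVD source shown on the same panel: upconversion.
up = scale_lines(480, 720)

print(len(down), len(up))  # 720 720 -- the panel always fills its native grid
```

Whether the source is 480i, 1080i, or 1080p, the output always has exactly the panel's native line count; only the quality of the mapping differs between TVs.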

I should probably put that whole previous paragraph in bold, though, because the message never seems to get through. So, at the risk of overkill, let's restate it with specific resolutions:

6. What happens when you feed a 1080i signal to a 720p TV?
The 1080i signal is scaled, or downconverted, to 720p. Nearly all recent HDTVs are able to do this.

7. What happens when you feed a 1080p signal to a 720p TV?
Assuming the TV can accept a 1080p signal, it will be scaled to 720p. But that caveat is important: many older 720p HDTVs--and yes, even some older 1080p models--cannot even accept 1080p signals at all, in which case you'll get a blank screen. Thankfully, most newer HDTVs can accept 1080p signals.

8. What happens when you feed a 1080i signal to a 1080p TV?
It's displayed at 1080p, but no resolution scaling is needed, since both formats contain 1,920x1,080 pixels. Instead, the 1080i signal is "de-interlaced" for display in 1080p. Some HDTVs do a better job of this de-interlacing process than others, but the artifacts caused by improper de-interlacing are usually difficult for most viewers to spot.
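The simplest de-interlacing strategy, often called "weave," just interleaves the two 540-line fields back into one 1,080-line frame. A sketch, assuming a static scene (real TVs also use motion-adaptive methods, which is where the quality differences come from):

```python
# "Weave" de-interlacing: interleave the odd and even fields of a
# 1080i signal back into a single 1080p frame. Correct only when
# nothing moved between the two fields.
def weave(odd_field, even_field):
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

odd = list(range(1, 1081, 2))   # lines 1, 3, 5, ... 1079
even = list(range(2, 1081, 2))  # lines 2, 4, 6, ... 1080
full = weave(odd, even)

print(full[:4])   # [1, 2, 3, 4]
print(len(full))  # 1080 -- a complete progressive frame, no scaling needed
```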

9. Side by side, how do 720p and 1080p TVs match up in head-to-head tests?
We spend a lot of time looking at a variety of source material on a variety of TVs in our video lab here at CNET's offices in New York. When I wrote my original article two years ago, many 1080p TVs weren't as sharp as they claimed to be on paper. By that, I mean a lot of older 1080p sets couldn't necessarily display all 2 million-plus pixels in the real world--technically speaking, they couldn't "resolve" every line of a 1080i or 1080p test pattern.

That's changed in the last couple of years. Most 1080p sets are now capable of fully resolving 1080i and 1080p material. But that hasn't altered our views about 1080p TVs. We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. On a regular basis in our HDTV reviews, we put 720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray and HD DVD players. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV.

I said as much in a 2006 column called The case against 1080p, and some readers knocked us for not looking at high-end TVs in our tests. But the fact is, resolution is resolution: whether you're looking at a Sony or a Westinghouse, 1080p resolution--which relates to picture sharpness--is the same, and it's a separate issue from black levels and color accuracy.

Our resident video guru, Senior Editor David Katzmaier, stands by what he said two years ago: The extra sharpness afforded by the 1080p televisions he's seen is noticeable only when watching 1080i or 1080p sources on larger screens, say 55 inches and bigger, or with projectors that display a wall-size picture. Katzmaier also says that the main real-world advantage of 1080p is not the extra sharpness you'll be seeing, but instead the smaller, more densely packed pixels. In other words, you can sit closer to a 1080p television and not notice any pixel structure, such as stair-stepping along diagonal lines, or the screen-door effect (where you can actually see the space between the pixels). This advantage applies regardless of the quality of the source.

10. OK, so what's the bottom line: Should I go 1080p or 720p?
First and foremost, some people just want what's considered the best spec on a TV. If you're one of those people, spend the extra dough; you'll feel better in the long run. Secondly, if you're thinking of going big, really big (a 55-inch or larger screen), or you like to sit really close (closer than 1.5 times the diagonal measurement), the extra resolution may make it worth the difference--as long as you have a pristine 1080i or 1080p HD source to feed into the set. And finally, it's a good idea to go with 1080p if you plan to use your TV a lot as a big computer monitor. That said, if you set your computer to output at 1,920x1,080, you may find that the icons and text on the screen are too small to view from far away (as a result, you may end up zooming the desktop or even changing to a lower resolution). But a 1080p set does give you some added flexibility (and sharpness) when it comes to computer connectivity.
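The "1.5 times the diagonal" rule of thumb above is easy to turn into numbers for a few common screen sizes (illustrative only):

```python
# Rule of thumb from the column: 1080p starts to matter if you sit
# closer than about 1.5x the screen's diagonal measurement.
def threshold_distance_inches(diagonal_inches, factor=1.5):
    """Distance inside which the extra 1080p resolution may be visible."""
    return diagonal_inches * factor

for size in (42, 50, 55, 65):
    d = threshold_distance_inches(size)
    print(f'{size}" set: 1080p may pay off if you sit closer than '
          f'{d:.0f} in ({d / 12:.1f} ft)')
```

By this rule, a 50-inch set would need a seating distance under about 75 inches (just over 6 feet) before 1080p's extra detail becomes relevant, which matches the "farther than 8 feet away on a 50-inch TV" observation earlier.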

If none of those factors jump out at you as true priorities--and you are working on a tight budget and want to save some dough--a 720p set is going to do you just fine. HD will still look great on your set, I swear. In fact, our current highest-scoring HDTV, the Pioneer Kuro PDP-5080HD, is a 720p, er--768p, model.

11. Wait! What about 120Hz LCDs and how they compare to 720p/1080p plasmas?
This column's just about 720p vs. 1080p. If you're interested in 120Hz, try Six things you need to know about 120Hz LCD TVs.

Hope that helps.