1080i vs 720p


If I have a 720p projector that can accept 1080i, is it better, technically speaking, to watch regular DVD or HDTV sources in 1080i, or to use my Faroudja processor and watch in 720p?
jcbower
Elizabeth:

The picture quality generally is better in 720p.

Prpixel:

I have to disagree with Elizabeth; 1080i looks better than 720p.

It also depends on what kind of material you're watching. The real advantage of progressive scan (the "p" in 720p) is that the integrity of the image is maintained even during periods of fast action. When frames are displayed interlaced (the "i" in 1080i), you'll get a combing effect during action scenes. This is because one refresh of the screen draws lines 1, 3, 5, 7, 9, and so on, and the next refresh draws lines 2, 4, 6, 8. With progressive scan the lines are painted in sequential order from the top of the screen to the bottom, so at the very worst you would see a single visual rift on the screen (for example, where a new set of lines 1 to 325 meets the old lines 326 to 720, you would see a rift between lines 325 and 326).
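If you like to tinker, here's a minimal toy sketch in Python (NumPy only; every name in it is made up for illustration) of why weaving two fields captured at different moments produces that combing:

import numpy as np

HEIGHT, WIDTH = 8, 16

def frame_at(offset):
    # A toy frame: a vertical bar, shifted right by `offset` pixels.
    frame = np.zeros((HEIGHT, WIDTH), dtype=int)
    frame[:, 4 + offset] = 1
    return frame

# The odd field is captured first; the even field 1/60 s later,
# after the bar has moved two pixels to the right.
odd_field = frame_at(0)[0::2]    # lines 1, 3, 5, ...
even_field = frame_at(2)[1::2]   # lines 2, 4, 6, ...

# Weave the two fields back together into one frame.
woven = np.zeros((HEIGHT, WIDTH), dtype=int)
woven[0::2] = odd_field
woven[1::2] = even_field

for row in woven:
    print("".join("#" if px else "." for px in row))
# Alternating rows show the bar in two different places -- the comb effect.

The two fields describe two different instants in time, so stitching them together tears every moving edge.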

I personally use 720p even though my TV maxes out at 1080i. If your TV is under 50", you won't even be able to tell a difference between 720p and 1080p from a normal viewing distance. There is a great article about this on CNET. If you're interested, I'll pass along the link.
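The CNET claim also lines up with a quick back-of-envelope check, using the common rule of thumb that the eye resolves about one arcminute of detail (the function below is just my own throwaway helper):

import math

def max_resolvable_distance_ft(diagonal_in, lines):
    # Distance beyond which one pixel subtends less than one arcminute.
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # 16:9 screen height
    pixel_in = height_in / lines                     # height of one pixel
    return pixel_in / math.tan(math.radians(1 / 60)) / 12  # in feet

for lines in (720, 1080):
    print(f'{lines}p on a 50" set: pixels blend beyond '
          f'{max_resolvable_distance_ft(50, lines):.1f} ft')
# 720p on a 50" set: pixels blend beyond 9.8 ft
# 1080p on a 50" set: pixels blend beyond 6.5 ft

So past roughly 10 feet, a 50" 720p panel is already at the limit of what the eye can resolve, and the extra pixels of 1080p only matter if you sit closer.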

-Dusty

Only CRTs can display interlaced signals. All digital displays (plasma, LCD, LCD projectors, etc.) are progressive. A 720p projector always displays 720p, just as a 1080p plasma always displays 1080p. There is no such thing as a 480i or 1080i digital display. Of course these devices can accept interlaced signals -- standard-def TV (480i) and high-def TV (1080i).

However, in these cases the display has to de-interlace the signal to display it progressively, and the quality of de-interlacing varies. 1080i on a display that does a relatively poor job of de-interlacing may look noticeably worse than a 720p source on the same display.
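To make that concrete, here's a rough sketch of two classic de-interlacing strategies (hypothetical helper names, nothing from any real product): "weave" keeps full vertical detail but combs on motion, while "bob" uses only one field, so it never combs but halves the vertical resolution.

import numpy as np

def weave(odd_field, even_field):
    # Interleave both fields: perfect on static scenes, combs on motion.
    frame = np.empty((odd_field.shape[0] * 2, odd_field.shape[1]),
                     dtype=odd_field.dtype)
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

def bob(field):
    # Repeat each field line: never combs, but halves vertical detail.
    return np.repeat(field, 2, axis=0)

Better de-interlacers decide per pixel ("motion-adaptive"): weave where nothing moved, bob or interpolate where something did. How well a display pulls that off is exactly why 1080i quality varies from set to set.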

With most newer displays, the consensus seems to be that 720p and 1080i sources look very similar.

So, to the OP: first of all, your processor should be set to output 720p; otherwise it will send a signal that your PJ has to process further.

720p sources (some HD channels) need no processing whatsoever to display on your PJ. They should be sent as native 720p from the cable box, pass through your processor unchanged, and arrive at your PJ, which displays them natively.

480p from a standard DVD player doesn't have to be de-interlaced, but it does need to be scaled up to 720p for your PJ. You need to figure out which of your player, processor, or PJ does this best.

As for 1080i from your cable box, this needs to be both de-interlaced and scaled down to 720p for display on your PJ. Most likely either your processor or your PJ will do this better than the cable box, and you need to figure out which one actually produces the better picture.
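Here's a toy summary of what each common source needs before a fixed 720p panel can show it (illustrative only, not any real product's logic):

def steps_to_720p(source):
    lines, scan = int(source[:-1]), source[-1]
    steps = []
    if scan == "i":
        steps.append("de-interlace")
    if lines < 720:
        steps.append(f"upscale {lines} -> 720 ({720 / lines:g}x)")
    elif lines > 720:
        steps.append(f"downscale {lines} -> 720 ({720 / lines:.2f}x)")
    return steps or ["pass through untouched"]

for src in ("720p", "480p", "480i", "1080i"):
    print(src, "->", ", ".join(steps_to_720p(src)))
# 720p -> pass through untouched
# 480p -> upscale 480 -> 720 (1.5x)
# 480i -> de-interlace, upscale 480 -> 720 (1.5x)
# 1080i -> de-interlace, downscale 1080 -> 720 (0.67x)

Whichever box in your chain does the de-interlace and scale steps best is the one that should be doing them.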

If your PJ is significantly newer than your processor, it is possible that it does all of these things better, and you might be able to ditch your processor.

Hope this helps

dave
Dg1968

So what does it mean that my LCD is 1080i if it doesn't actually display 1080i? This is an 8-year-old rear-projection Sony LCD. It doesn't display 1080p, only 1080i. If "all digital displays (plasma, LCD, LCD projectors, etc.) are progressive," then it would be able to display 1080i and 1080p, now wouldn't it?

Dave, very good explanation. It all comes down to which device in the chain has the best scaler.

Linkster,

Yes, 1080i broadcasts generally look better to me than 720p. They seem to have more defined edge detail, i.e., they look sharper and therefore have more depth of field.

Dusty,

I have to disagree with the statement that there's no difference between 720p and 1080p on 50" and smaller screens at normal viewing distance. In side-by-side comparisons, even my wife could detect a difference on screens as small as 32". The difference is most noticeable in edge detail and with text. I'm curious whether I'll be able to see a difference on 23" panels. I'm dying to replace the 23" 720p LCD in the kitchen with a 1080p IPS panel; I sit less than 3' from that set while eating.
Recently, I was in Best Buy (not the best lighting conditions), and they had a 42" 720p Panny plasma sitting right next to a 42" 1080p Panny plasma. I calibrated them as closely as I possibly could and then started asking strangers if they could see a difference between the two sets. Six out of seven people I asked could see a difference, and then my wife dragged me out of the store by my ear. The source was a 1080p demo loop with scenes from Avatar.

My video system consists of a TivoHD, set to output the native broadcast format, running into an Anthem Statement D2v processor. My projector is a Mitsu HC6800, and my screen is a 120" Da-Lite High Contrast Cinema Vision. The projector has been ISF calibrated. The Sigma Designs VXP broadcast-quality video processor in the Anthem takes care of the scaling duties. I also have a Pioneer Blu-ray player. In the bedroom it's a TivoHD running directly into a 42" Panny 1080p plasma. The 42" plasma goes away next Wednesday and will be replaced with a 50" Panny plasma (TC-P50G25) based on the new Infinite Black panel.