1080i vs 720p


If one has a 720p projector which can output at 1080i, is it better, when using a regular DVD source or HDTV, to watch in 1080i, or to use my Faroudja processor and watch in 720p? Technically speaking, that is.
jcbower
Post removed 
Do you mean that your 720p projector can accept 1080i input? I've never heard of a 1080i projector. All input to your 720p projector will be scaled to 720p no matter what the input resolution is.

I have to agree with Elizabeth; 1080i looks better than 720p.
Elizabeth:

The picture quality generally is better in 720p.

Prpixel:

I have to agree with Elizabeth; 1080i looks better than 720p.

???????
It also depends on what kind of material you're watching. The real advantage of progressive scan (the "p" in 720p) is that the integrity of the image is basically maintained even during fast action. When frames are displayed interlaced (the "i" in 1080i), you'll get a combing effect during action scenes. This is because on one refresh of the screen you get lines 1, 3, 5, 7, 9, etc., and on the next refresh you get lines 2, 4, 6, 8, etc. With progressive you get the lines in sequential order, painted from the top of the screen to the bottom, so at the very worst you would see only one visual rift on the screen (for example, when the new set of lines 1 to 325 meets up with the old lines 326 to 720, you would see a rift between lines 325 and 326).
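If it helps to see the combing concretely, here's a toy sketch (Python with NumPy, purely illustrative, not how any display actually works internally): a bright bar moves between the two field captures, and weaving the fields back together leaves a jagged edge.

```python
import numpy as np

HEIGHT, WIDTH = 8, 16

def frame_with_bar(x):
    """Full frame with a 4-pixel-wide bright bar starting at column x."""
    f = np.zeros((HEIGHT, WIDTH), dtype=int)
    f[:, x:x + 4] = 1
    return f

# The two fields are captured at different moments; the bar has moved.
field_a = frame_with_bar(2)[0::2]   # odd scan lines, time t
field_b = frame_with_bar(6)[1::2]   # even scan lines, time t+1

# Weave the fields back into one frame.
woven = np.zeros((HEIGHT, WIDTH), dtype=int)
woven[0::2] = field_a
woven[1::2] = field_b

print(woven)
```

Alternating rows disagree about where the bar is, which is exactly the comb you see on moving edges.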

I personally use 720p even though my TV maxes out at 1080i. If your TV is under 50", you won't even be able to tell a difference between 720p and 1080p from a normal viewing distance. There is a great article about this on CNET. If you're interested, I'll pass along the link.

-Dusty

Only CRTs can display interlaced signals. All digital displays (plasma, LCD, LCD projectors, etc.) are progressive. As mentioned above, a 720p projector always outputs 720p, just as a 1080p plasma always displays 1080p. There is no such thing as a 480i or 1080i digital display. Of course these devices can accept interlaced signals: standard-def TV (480i) and high-def TV (1080i).

However, in these cases the display has to de-interlace the signal to display it progressively. The quality of de-interlacing varies a lot. So 1080i on a display that does a relatively poor job of de-interlacing might look noticeably worse than a 720p source on the same display.
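For a sense of why de-interlacing quality varies: the simplest method, often called "bob," just line-doubles each field, which avoids combing but throws away half the vertical resolution. A minimal sketch (Python/NumPy again, just an illustration):

```python
import numpy as np

def bob_deinterlace(field):
    """Naive 'bob' de-interlace: repeat every field line so one
    half-height field becomes a full progressive frame. No combing,
    but vertical detail is halved."""
    return np.repeat(field, 2, axis=0)

field = np.random.randint(0, 256, size=(540, 1920), dtype=np.uint8)  # one 1080i field
frame = bob_deinterlace(field)
print(frame.shape)  # (1080, 1920)
```

Good de-interlacers do much better than this (motion-adaptive interpolation, film pulldown detection, and so on), and that is exactly where displays and processors differ.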

With most newer displays, the consensus seems to be that 720p and 1080i sources look very similar.

So to the OP: first of all, your processor should be set to output 720p; otherwise it will be sending a signal that your PJ then has to process again.

720p sources (some HD channels) need no processing whatsoever to display on your PJ. These should be sent as native 720p from the cable box, pass through your processor unchanged, and go to your PJ, which will display them natively.

480p from standard DVD doesn't have to be de-interlaced, but needs to be scaled to 720p for your PJ. You need to figure out which of your player, processor, or PJ does this better.
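To make that scaling step concrete, here's a toy nearest-neighbor upscale (Python/NumPy, purely illustrative; real scalers use far better filtering, which is why it matters which box does it):

```python
import numpy as np

def nearest_neighbor_scale(frame, out_h, out_w):
    """Map each output pixel back to the nearest source pixel."""
    in_h, in_w = frame.shape[:2]
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows][:, cols]

dvd = np.zeros((480, 720, 3), dtype=np.uint8)    # 480p DVD frame
panel = nearest_neighbor_scale(dvd, 720, 1280)   # projector's 720p panel
print(panel.shape)  # (720, 1280, 3)
```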

As for 1080i from your cable box, this needs to be both de-interlaced and scaled down to 720p for display by your PJ. Most likely either your processor or your PJ will do this better than the cable box, and you need to figure out which one actually results in a better picture.
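Putting the two toy steps together, the 1080i path is just the bob sketch followed by the scale sketch (again, only a cartoon of what the real hardware does):

```python
import numpy as np

def bob_deinterlace(field):
    """Line-double one field into a full progressive frame."""
    return np.repeat(field, 2, axis=0)

def nearest_neighbor_scale(frame, out_h, out_w):
    """Map each output pixel back to the nearest source pixel."""
    in_h, in_w = frame.shape[:2]
    return frame[np.arange(out_h) * in_h // out_h][:, np.arange(out_w) * in_w // out_w]

field = np.zeros((540, 1920), dtype=np.uint8)           # one field of a 1080i broadcast
progressive = bob_deinterlace(field)                    # 1920x1080 progressive
panel = nearest_neighbor_scale(progressive, 720, 1280)  # down to the 720p panel
print(panel.shape)  # (720, 1280)
```

Whichever box in your chain runs the best real versions of those two steps is the one that should be doing the work.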

If your PJ is significantly newer than your processor, it is possible that it does all of these things better, and you might be able to ditch your processor.

Hope this helps

dave