1080i vs 720p


If one has a 720p projector that can output at 1080i, is it better, when using a regular DVD source or HDTV, to watch in 1080i, or to use my Faroudja processor and watch in 720p? Technically speaking, that is.
jcbower
I would think that 720p, theoretically, should then equal 1440i, but it doesn't. Inherent limitations of bandwidth and time, if I am not mistaken?
I don't agree that "no flat panel displays 1080i...". Mine does. Precisely. As for my post, I don't agree that a progressively scanned signal is half of the interlaced one. Resolution isn't that simple. To me, 720p is a knock-off of 1080i (due to broadcasters not wanting to pay for the bandwidth and, perhaps, actually thinking a 720-line progressively scanned image is better than a 1080-line interlaced image; the Fox network comes to mind); until 1080p, 1080i was the purest HD resolution. I am not dismissing the advantage of a progressively scanned image, just the size of the bandwidth it travels in.
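To put some rough numbers on the 720p-vs-1080i bandwidth question, here is a back-of-the-envelope pixel-rate comparison. It is only a sketch: it assumes the standard 60 Hz ATSC frame/field rates and ignores blanking intervals, chroma subsampling, and MPEG compression.

```python
# Rough pixel-rate comparison of 720p60 and 1080i60 (standard ATSC rates assumed).
# Ignores blanking intervals, chroma subsampling, and MPEG-2 compression.

def pixel_rate(width, height, updates_per_second, interlaced):
    """Pixels delivered per second; an interlaced field carries half the lines."""
    lines_per_update = height // 2 if interlaced else height
    return width * lines_per_update * updates_per_second

p720 = pixel_rate(1280, 720, 60, interlaced=False)   # 60 full frames per second
i1080 = pixel_rate(1920, 1080, 60, interlaced=True)  # 60 half-height fields per second

print(f"720p60 : {p720 / 1e6:.1f} Mpixels/s")   # ~55.3 Mpixels/s
print(f"1080i60: {i1080 / 1e6:.1f} Mpixels/s")  # ~62.2 Mpixels/s
```

In raw pixel terms the two formats come out surprisingly close: 1080i trades temporal resolution (full frames only 30 times a second) for more pixels per frame, while 720p does the reverse, so neither is simply double or half of the other.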
Cerrot - HDTV programs change format many times while being transferred between stations. A standard OTA HDTV signal is digital and compressed to fit a 6 MHz bandwidth slot. No matter what that signal is, my DLP TV always converts it to its own format, where individual pixels of the same color are addressed at once, then the next color comes, etc., 90 times a second (like color interlacing). Even the 90 Hz update frequency suggests it has nothing to do with 60 Hz interlacing. Other displays, like plasma or LCD, are perhaps simpler and address each individual pixel - the whole screen at once. Screen resolution and update rate are arbitrary and can be rescaled. New Samsung LCD TVs rescale the update rate to 120 Hz (and interpolate pixels), providing more fluid motion.

My TV has 1280x720 pixel resolution. With a higher-resolution panel the amount of detail would be the same, since it is limited by the broadcast bandwidth.
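A quick calculation illustrates how tightly that bandwidth limits detail. The figures below are assumptions for the sketch: 8-bit 4:2:0 video (roughly 12 bits per pixel) and the nominal ~19.39 Mbps payload of a 6 MHz ATSC channel, which real broadcasts also share with audio and other data.

```python
# How hard MPEG-2 must squeeze each format into one 6 MHz ATSC channel.
# Assumes 8-bit 4:2:0 video (~12 bits per pixel) and the nominal ~19.39 Mbps
# ATSC payload; actual video bit rate is lower once audio/data are carved out.

ATSC_PAYLOAD_BPS = 19.39e6   # nominal 8-VSB payload of a 6 MHz channel
BITS_PER_PIXEL = 12          # 8-bit samples with 4:2:0 chroma subsampling

for name, pixels_per_second in [("720p60", 1280 * 720 * 60),
                                ("1080i60", 1920 * 540 * 60)]:
    raw_bps = pixels_per_second * BITS_PER_PIXEL
    ratio = raw_bps / ATSC_PAYLOAD_BPS
    print(f"{name}: ~{raw_bps / 1e6:.0f} Mbps uncompressed -> roughly {ratio:.0f}:1 compression")
```

So regardless of how many pixels the panel has, the channel's bit rate is what caps the detail that actually reaches the screen.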