TV is rooted in the same concept as movies: Capture and display a sequence of still images fast enough, and the eye perceives smooth motion rather than a succession of individual pictures. Historically, however, TV has handled this process somewhat differently from movies. A TV image, or frame, is a grid of individual picture elements (pixels), arranged in rows and columns. But in analog broadcast systems, such as our venerable NTSC, each frame is split into two fields - the first comprising the odd-numbered pixel rows, the second the even-numbered rows. The fields are transmitted sequentially, and in CRT-based TVs they're displayed sequentially as well, so you get the odd rows from the first frame, then the even rows from that frame, then the odd rows from the second frame, and so on. This technique, called interlacing, was designed to reduce transmission bandwidth with little or no loss of picture resolution. And it works pretty well for CRT (picture tube) displays.
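To make the field-splitting concrete, here's a minimal Python/NumPy sketch using a tiny made-up grid in place of a real image; it only illustrates how interlacing separates a frame's rows into two fields:

```python
import numpy as np

# Toy "frame": 6 rows x 8 columns of pixel values (a real frame has hundreds of rows).
frame = np.arange(6 * 8).reshape(6, 8)

# Field 1 holds the odd-numbered rows (1st, 3rd, 5th), field 2 the even-numbered ones.
# With 0-based indexing, the 1st, 3rd, 5th rows are indices 0, 2, 4.
field_1 = frame[0::2, :]
field_2 = frame[1::2, :]

# Interlaced transmission sends field_1, then field_2, then the next frame's fields...
# Each field carries half the rows, so each transmission needs half the bandwidth of a full frame.
print(field_1.shape, field_2.shape)   # (3, 8) (3, 8)
```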
A couple of things have changed in recent years, though. One is the dramatic shift away from CRTs to so-called "fixed-pixel" displays - plasma, LCD, DLP, and LCoS. Unlike CRTs, whose scanning electron beam can just as easily trace individual fields as complete frames, fixed-pixel displays are inherently progressive, meaning they normally need to build the picture frame by frame. The idea of alternating fields, odd pixel rows followed by even followed by odd, is foreign to them. The other is that our old analog TV broadcast system is giving way to digital HDTV, which is transmitted in one of two formats, 720p or 1080i. The former uses frames that are 1,280 pixels wide and 720 pixels high, transmitted progressively at 60 complete frames per second. The latter is an interlaced format employing 1,920- by 1,080-pixel frames, each split into two 1,920 x 540 fields; transmission is at 60 fields (30 frames) per second.
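The two formats end up with comparable raw data rates, which is easy to verify with back-of-the-envelope arithmetic (a simple Python sketch; actual broadcast bandwidth also depends heavily on compression):

```python
# 720p: 1280 x 720 progressive frames, 60 complete frames per second.
pixels_720p = 1280 * 720 * 60           # = 55,296,000 pixels/second

# 1080i: 1920 x 1080 frames split into 1920 x 540 fields, 60 fields (30 frames) per second.
pixels_1080i = 1920 * 540 * 60          # = 62,208,000 pixels/second

print(f"720p : {pixels_720p / 1e6:.1f} Mpixel/s")
print(f"1080i: {pixels_1080i / 1e6:.1f} Mpixel/s")
```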
A 720p signal requires relatively little processing by a fixed-pixel display. It will have to be scaled (converted) to the display's native resolution if it is something other than 1,280 x 720, but that's it. A 1080i signal, on the other hand, will always have to be deinterlaced to 1080p, then scaled if the display's native resolution is not 1,920 x 1,080. All this is routine in today's HDTVs, however, and given that 1080i displays are virtually extinct and 1080p models increasingly commonplace, why the hubbub surrounding 1080i vs. 1080p?
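The order of operations described above - deinterlace first if the signal is interlaced, then scale to the panel's native resolution - might be sketched as follows. This is a rough Python/NumPy illustration; the weave() and scale_nearest() helpers are crude stand-ins for the far more sophisticated processing a real TV performs:

```python
import numpy as np

def weave(field_1, field_2):
    """Naive deinterlace: re-interleave two half-height fields into one full frame."""
    h, w = field_1.shape
    frame = np.empty((2 * h, w), dtype=field_1.dtype)
    frame[0::2, :] = field_1
    frame[1::2, :] = field_2
    return frame

def scale_nearest(frame, out_h, out_w):
    """Crude nearest-neighbor scaling to the display's native resolution."""
    in_h, in_w = frame.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return frame[rows][:, cols]

# 1080i path: two 1920 x 540 fields -> weave to 1920 x 1080 -> scale if native differs.
f1 = np.zeros((540, 1920), dtype=np.uint8)
f2 = np.zeros((540, 1920), dtype=np.uint8)
full = weave(f1, f2)                        # 1080 x 1920
on_screen = scale_nearest(full, 768, 1366)  # e.g. a 1,366 x 768 panel
print(on_screen.shape)                      # (768, 1366)
```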
What started it was the introduction of HD DVDs and Blu-ray Discs, which are mastered in 1080p format. The ideal display for such a signal is one that can accept 1080p and put it on the screen with no further processing. Even some 1080p HDTVs won't take a 1080p signal, though, topping out instead at 1080i input. In fact, if you see an HDTV advertised as 1080i, chances are what's meant is that it can accept only up to 1080i signals, not that it actually is a 1080i display. If you have a 1080p HDTV that chokes on 1080p signals, you'll need to set your HD DVD or Blu-ray Disc player to generate interlaced (1080i) output and rely on the TV to deinterlace it back to 1080p. (If you have a fixed-pixel display with a native resolution other than 1080p, you'll probably want to set the player for 720p output.)
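That rule of thumb can be boiled down to a few lines (a hypothetical helper written for illustration - the function name and capability flag aren't from any real player or TV menu):

```python
def recommended_player_output(native_resolution, accepts_1080p_input):
    """Suggest an HD DVD / Blu-ray player output setting for a fixed-pixel display."""
    if native_resolution == (1920, 1080):
        # A true 1080p panel: feed it 1080p if it accepts it; otherwise send 1080i
        # and let the TV deinterlace back to 1080p.
        return "1080p" if accepts_1080p_input else "1080i"
    # Any other native resolution (1366x768, 1024x768, ...): 720p output is
    # usually the sensible choice.
    return "720p"

print(recommended_player_output((1920, 1080), accepts_1080p_input=False))  # 1080i
print(recommended_player_output((1366, 768), accepts_1080p_input=False))   # 720p
```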
As already noted, deinterlacing is a routine procedure in HDTVs, so that might not seem like a big deal - except that not all deinterlacers are created equal. Some are better than others, especially in handling material originated on 1080i video cameras, because there the two fields making up each frame are captured a sixtieth of a second apart and won't line up wherever anything in the scene is moving. Sophisticated motion-compensated processing is necessary to prevent softening of the image or the introduction of annoying visual artifacts, such as "jaggies."
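A crude way to see the problem: simply weaving two fields captured at different instants mixes two moments in time. The toy Python/NumPy sketch below (nothing like a real TV's processor, and the function name is invented for illustration) falls back to line interpolation wherever the fields disagree, which is the basic idea behind motion-adaptive deinterlacing:

```python
import numpy as np

def motion_adaptive_deinterlace(field_odd, field_even, threshold=10.0):
    """Toy motion-adaptive deinterlace of one 1080i frame's two fields.

    field_odd and field_even are half-height fields captured 1/60 second apart.
    Where the even field matches what the odd field's neighboring lines predict,
    weave it in ("no motion"). Where it doesn't, use the interpolated estimate
    instead so moving edges don't comb.
    """
    h, w = field_odd.shape
    frame = np.empty((2 * h, w), dtype=np.float64)
    frame[0::2, :] = field_odd

    # Estimate the missing (even) rows purely from the odd field's own lines.
    estimate = (field_odd + np.roll(field_odd, -1, axis=0)) / 2.0

    # A big disagreement between the real even field and the estimate is taken as
    # motion (real processors are far smarter about telling motion from fine detail).
    moving = np.abs(field_even.astype(np.float64) - estimate) > threshold

    frame[1::2, :] = np.where(moving, estimate, field_even)
    return frame
```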
The good news is that deinterlacing 1080i signals from an HD DVD or Blu-ray Disc player is usually much simpler. Almost all discs contain movies shot on film (or progressive-scan digital video cameras) at just 24 frames per second. Even when interlaced to 1080i, 24 frames yield only 48 fields, so the signal has to be padded to reach 60 fields per second by having the frames alternately contribute three fields and two - that's what 3:2 pulldown is all about. A deinterlacer that accurately detects and compensates for 3:2 pulldown can weave the two fields derived from each original film frame back together perfectly, yielding a picture identical to what you'd see if the display could accept 1080p directly. (The signature artifact of poor 3:2 pulldown compensation is "combing," which occurs when fields from two different film frames are mistakenly combined.) And though HDTVs vary in their film-mode deinterlacing performance, most current models do an excellent job.
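To make the cadence concrete, here's a schematic Python sketch using frame labels instead of real images; it shows how 3:2 pulldown maps film frames onto fields, and how a deinterlacer that has locked onto the cadence can regroup them:

```python
# Four film frames (1/6 second of 24 fps material) must fill ten fields
# (1/6 second of 60-field video): frames alternately contribute 3 and 2 fields.
film_frames = ["A", "B", "C", "D"]
cadence = [3, 2, 3, 2]   # the "3:2" in 3:2 pulldown

fields = []
for frame, repeats in zip(film_frames, cadence):
    fields.extend([frame] * repeats)
print(fields)     # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']

# A deinterlacer that detects this cadence knows which neighboring fields came
# from the same film frame, so it can weave them back into the original 24
# progressive frames with nothing lost. Pairing fields from different film
# frames (an 'A' field with a 'B' field, say) is what produces "combing".
recovered = []
for frame in fields:
    if not recovered or recovered[-1] != frame:
        recovered.append(frame)
print(recovered)  # ['A', 'B', 'C', 'D'] -- the original film frames
```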
Bottom line: It usually doesn't matter whether an HDTV can accept 1080p video signals. Most sets can take 1080i signals generated from the only available 1080p sources - HD DVDs and Blu-ray Discs - and deinterlace them perfectly back to their original 1080p format. More often than not, it doesn't even matter if they wind up displayed in 1080p. Unless you're sitting close to a large screen (less than 10 feet from a 50-inch display, for example), you won't be able to see the difference between 1080p and 720p (or similar resolutions, such as 1,366 x 768 or 1,024 x 768).
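The 10-foot figure follows from simple visual-acuity arithmetic. The sketch below assumes the common rule of thumb that the eye resolves about one arcminute of detail - an assumption for illustration, and the helper function is invented here, not taken from the article:

```python
import math

def distance_to_resolve(diagonal_in, horiz_pixels, arcmin_per_pixel=1.0):
    """Farthest viewing distance (inches) at which individual pixels can still be
    resolved, assuming ~1 arcminute of visual acuity and a 16:9 screen."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)          # screen width
    pixel_pitch = width_in / horiz_pixels                     # width of one pixel
    return pixel_pitch / math.tan(math.radians(arcmin_per_pixel / 60))

# 50-inch display:
print(distance_to_resolve(50, 1920) / 12)   # ~6.5 ft for 1080p
print(distance_to_resolve(50, 1280) / 12)   # ~9.8 ft for 720p
# Beyond roughly 10 feet, even 720p's pixels are too small to distinguish,
# so 1080p's extra resolution goes unseen.
```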
For a more complete discussion of deinterlacing and scaling, see Video Upconversion: Facts and Fallacies.