In the world of televisions today, there is essentially only one format still being sold in large numbers: the high-definition television. Although the display technology can vary (plasma, LCD, LED), all high-definition TVs come in the same proportions (16 across by 9 high, aka 16×9) and have one of three resolutions: 720p, 1080i, or 1080p. Stores trying to sell you their "bigger and better" models will frequently tell you that only 1080p is "true HD," but in fact any resolution greater than the standard broadcast definition of 20th-century television counts as high definition.
What exactly is meant, though, by "high definition" vs. "standard definition"? The meaning can vary from country to country, because each region has its own standards. Since I'm American, I will focus on the United States. In the U.S., the NTSC standard was used until the digital switchover last year, and that standard is what we call "standard definition." Regular old "tube" TVs, the ones you would have seen everywhere in the 1970s and 1980s, were the "standard def" TVs (SDTVs). They had a sort of "gun" that fired the image at the screen one thin line at a time. The gun started at the top left of the screen and painted (far too quickly for you to see with the naked eye) a line of the image about 1/480th the height of the screen, across the screen from left to right. When it reached the right edge, the gun dropped down one line and painted left to right again. The gun was fast: all 480 lines were painted on the screen in just under 1/30th of a second, and this is what we call one "frame" of video. The gun would then go back up to the top and start again, painting all 480 visible lines about 29.97 times a second. If you get really close to an SDTV, so that your face is almost touching it, you can actually see these lines on the screen.
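For the curious, here's a quick back-of-the-envelope calculation, in Python, of just how fast that gun had to move. It's a simplification: a real NTSC frame actually carries 525 total lines, only about 480 of which are visible, so the true per-line time is a bit shorter than this sketch suggests.

```python
# How fast does the "gun" paint? A simplified sketch: 480 visible lines
# delivered about 29.97 times per second.
FRAME_RATE = 29.97      # NTSC frames per second
VISIBLE_LINES = 480     # visible scan lines per frame

frame_time = 1 / FRAME_RATE              # seconds per full frame
line_time = frame_time / VISIBLE_LINES   # seconds per scan line

print(f"One frame: {frame_time * 1000:.2f} ms")          # ~33.37 ms
print(f"One line:  {line_time * 1e6:.1f} microseconds")  # ~69.5 microseconds
```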
To increase the resolution or "definition" of the image, one would have to increase the number of lines. More, thinner lines mean better resolution, but at first the technology to do that was hard to develop. As one way to fudge it, the idea of "interlacing" was created. With interlacing, the TV paints only half of the image in each pass: every odd line first, and then every even line. When played back fast enough that your eye can't pick up the difference between the two "half frames" (called fields), the effect is a better apparent resolution to the human viewer. But if you were to pause or slow down the video, you would see artifacts and distortions. Artifacts can also appear at real-time speeds when there is a lot of fast motion in the picture, because the two fields are separated enough in time that the image has moved slightly between them, leaving the odd and even lines "off" by a fraction.
This then gives us our two basic formats. Progressive scan paints the full image, say all 480 lines, in every frame: a 480p set shows you lines 1 through 480, top to bottom, about every 1/30th of a second. Interlaced paints half the lines per pass but does it twice as fast: a 480i set shows you lines 1, 3, 5, 7, 9, … 479 during the first 1/60th of a second, and then lines 2, 4, 6, 8, 10, … 480 during the next 1/60th. Either way you are getting all 480 lines in about 1/30th of a second; they are just delivered in slightly different fashions.
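Here's a toy sketch in Python of the two delivery orders (line numbers are 1-based, matching the description above):

```python
# Progressive: all 480 lines in one pass. Interlaced: odd lines in one
# field, even lines in the next, two fields per frame.

def progressive_frame(lines=480):
    """One full pass: lines 1 through 480, in about 1/30th of a second."""
    return list(range(1, lines + 1))

def interlaced_fields(lines=480):
    """Two half-passes ("fields"), each about 1/60th of a second."""
    odd_field = list(range(1, lines + 1, 2))    # 1, 3, 5, ..., 479
    even_field = list(range(2, lines + 1, 2))   # 2, 4, 6, ..., 480
    return odd_field, even_field

odd, even = interlaced_fields()
# Either way, the same 480 lines arrive in about 1/30th of a second.
assert sorted(odd + even) == progressive_frame()
print(f"Interlaced: {len(odd)} odd lines, then {len(even)} even lines")
```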
I'm not going to spend any more time discussing the interlaced formats. What I'm going to say about progressive scan applies to interlaced as well, so I will leave it to the reader to work out that part.
Now back to HDTV and the question of what resolution one really needs. Both 720p and 1080p sets paint far more lines of image on the screen than the old 480-line TV sets of the 20th century; both are considered HDTV. But because 1080p sets render much more image detail than 720p sets (at a given screen size, they pack more pixels into the same area), they are considerably more expensive. Is the added expense worth it?
The answer, as you might expect, is “it depends.” There are a lot of variables, and part of what will determine it is how close you sit to the TV. If you sit right on top of the TV, larger pixels are easier to notice. Sit back a few feet, and they fade into a smooth image. Assuming you are sitting across the room from your TV, however, the main thing that will matter is going to be screen size. So how do you figure out whether it’s worth it? My opinion (and it’s nothing more than an opinion) is that you want 1080p for anything over 37″, but that 1080p is not worth it for 32″ and smaller sets. At 37″ it’s kind of a toss-up, but I’d probably go for 1080p at that size too. Now, am I just pulling numbers out of thin air? Well, no… there’s a reason.
Let's first talk about total resolution. The sets have a number of lines from top to bottom, but each line is also divided into small blocks of image (called pixels). A typical 720p HDTV set, for instance, has 1280 pixels on each of its 720 lines, for a total of about 921,600 pixels on the screen. Compare that with a 480p TV from 20 years ago with 640 pixels per line, which had only about 307,000 pixels: the 720p TV has 3x the resolution! Meanwhile, a typical 1080p TV has 1920 pixels per line, or about 2.07 million pixels on the entire screen, almost 7x the resolution of an SDTV! Wow, that's a lot of resolution, so why not go for the max even with a small TV? My answer is: you don't really need it.
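You can sanity-check that arithmetic in a few lines of Python:

```python
# Total pixel counts for each format (width x height), compared to SD.
FORMATS = {"480p (SD)": (640, 480),
           "720p": (1280, 720),
           "1080p": (1920, 1080)}

sd_total = 640 * 480  # 307,200 pixels
for name, (w, h) in FORMATS.items():
    total = w * h
    print(f"{name:>9}: {total:>9,} pixels ({total / sd_total:.2f}x SD)")
#  480p (SD):   307,200 pixels (1.00x SD)
#       720p:   921,600 pixels (3.00x SD)
#      1080p: 2,073,600 pixels (6.75x SD)
```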
Let's think about the larger TVs first. Take a typical large-screen HDTV, like a plasma set. Say it's a 1080p set that is 48″ in size (for TVs, that means 48″ on the diagonal). A 48″ TV that's 16×9 works out to about 42″ across by 24″ high, for a total screen area of roughly 1,008 sq. inches. Since that area holds about 2.07 million pixels, the 48″ plasma that looks so crystal clear has about 2,057 pixels per square inch. Another way of saying it is that each pixel is about 1/45th of an inch across.
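For those who want to check my geometry, here's a little Python sketch (the function screen_stats is just something I made up for this post, not any standard library). It gets the width and height from the diagonal via the Pythagorean theorem:

```python
import math

def screen_stats(diagonal_in, pixels_wide, pixels_high, aspect=(16, 9)):
    """Screen dimensions, area, and pixel density from the diagonal size."""
    aw, ah = aspect
    scale = diagonal_in / math.hypot(aw, ah)  # Pythagorean theorem
    width, height = aw * scale, ah * scale
    area = width * height
    density = (pixels_wide * pixels_high) / area  # pixels per square inch
    pixel_size = width / pixels_wide              # inches per (square) pixel
    return width, height, area, density, pixel_size

w, h, area, density, px = screen_stats(48, 1920, 1080)
print(f'48" 1080p: {w:.1f}" x {h:.1f}", {density:.0f} px/sq in, '
      f'each pixel about 1/{1 / px:.0f}"')
# roughly: 41.8" x 23.5", ~2106 px/sq in, pixels ~1/46"
```

(The exact math comes out a little different from my whole-inch rounding above, about 2,106 pixels per square inch rather than 2,057, but it's the same ballpark, and the comparison coming up next is unaffected either way.)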
This is your key value. If you think that 48″ plasma TV with 1080p resolution looks good (most people do, and I sure do), and you want an image that clear, then you need a screen whose pixels are about 1/45th of an inch across. The reason is that perceived sharpness depends on the size of the pixels, not the total number of them in the image. In other words, for a given screen size, more pixels is always better (because it makes them smaller); but across different screen sizes, although more is better, sometimes you don't need more.
Let's take a look at my cutoff size, 32″. My 32″ LCD HDTV is 720p, not 1080p. Poor Chessack, right? He has half the resolution! Well, actually… I don't. A 32″ TV that is 16×9 measures about 28″ across by 16″ high, and is therefore about 448 sq. inches in area. Since it's a 720p set, it has 921,600 pixels in those 448 sq. inches, which works out to about 2,057 pixels per square inch. Look back up at the figure for the 48″ TV. Notice anything? That's right: my 32″ 720p TV has the same pixel density as a 48″ 1080p set. Each pixel on my TV is about 1/45th of an inch across, just as on the big plasma. (This is no coincidence: 48/32, 1920/1280, and 1080/720 are all the same ratio, 1.5, so the pixels come out the same size.) Therefore my TV's resolution works out, in practical terms, to the human eye, to be just as good as a 48″ 1080p plasma set's, because the pixels are the same size, and pixel size is what determines image crispness (a.k.a. resolution).
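Plugging both sets into the screen_stats() sketch from above makes the point:

```python
# Same pixel density and same pixel size, despite very different screen sizes.
for label, diag, (pw, ph) in [('48" 1080p', 48, (1920, 1080)),
                              ('32" 720p', 32, (1280, 720))]:
    *_, density, px = screen_stats(diag, pw, ph)
    print(f'{label}: {density:.0f} px/sq in, pixels about 1/{1 / px:.0f}"')
# Both lines print the same numbers, because 48/32 = 1920/1280 = 1080/720 = 1.5.
```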
That is why I say that at 32″ and below, you don't need 1080p. The image on a 32″ set will look exactly as crisp at 720p as the image on a 48″ set does at 1080p, and 720p sets are much cheaper. In my view, 32″ TVs with 1080p resolution are a rip-off. The stores (and manufacturers) only make them because they know people want "the best" and can be fooled by the bigger-sounding number. Note that I'm not saying a 1080p 32″ set wouldn't have higher resolution than a 720p one; it would, by a lot. But your eye probably couldn't see the difference unless you sat with your nose pressed against the screen, and the 720p set's image is just as crisp as a 48″ 1080p set's would be. If 1080p is an acceptable resolution (with 1/45th inch pixels) on a giant set, then why isn't 720p, with the same 1/45th inch pixels, acceptable on a smaller set? In my view, it is.
Before I close, a quick word about 37″ TVs. You can probably get away with 720p there, though at that size my eye starts to see a difference between 720p and 1080p. A 37″ 16×9 set is about 32″ by 18″, or roughly 576 sq. inches, which at 720p gives pixels about 1/40th of an inch across. Intellectually, I suspect my eye shouldn't be able to tell 1/40th inch pixels from 1/45th inch ones, yet I feel like I can start to see the difference here; maybe it's just my mind playing tricks on me. At any rate, by the time you get to 37″ and above, a visible difference in image crispness between 720p and 1080p starts to appear. Below that (32″ and less), most people simply will not see one.
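Running the 37″ case through the same screen_stats() sketch:

```python
# A 37" 720p set lands between the 32" and 42" cases.
*_, density, px = screen_stats(37, 1280, 720)
print(f'37" 720p: {density:.0f} px/sq in, pixels about 1/{1 / px:.0f}"')
# roughly 1575 px/sq in, pixels ~1/40"
```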
And so, my recommendation is summed up like this: At 32″ and below, 1080p is absolutely not necessary, and I think people who buy 1080p sets in this size range are being taken to the cleaners. At 42″ and up, 1080p is probably necessary for a truly crisp, HD-looking image (on a 42″ set, a 720p image would have pixels about 1/35th of an inch across, nearly a third larger than on a 32″ set). At 37″ you are straddling the fence: 720p is probably OK, but you may notice a boost from 1080p. I'd probably go for 1080p at this size… but not below it.
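If you like, here's the whole recommendation boiled down to a few lines of Python. The cutoffs are just my opinion, as argued above, not any industry standard:

```python
def recommended_resolution(diagonal_in):
    """My rule of thumb: 720p up to 32", toss-up in the middle, 1080p at 42"+."""
    if diagonal_in <= 32:
        return "720p (1080p is wasted money here)"
    elif diagonal_in < 42:
        return "toss-up; I'd lean 1080p"
    else:
        return "1080p"

for size in (26, 32, 37, 42, 50):
    print(f'{size}": {recommended_resolution(size)}')
```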