How we test HDTVs

Since we first began testing HDTVs several years ago, the technology of the sets has changed—mostly for the better. LCDs (including the pricier, more eco-friendly LED-backlit sets) are starting to dominate, even in larger sizes where plasma and rear-projection once ruled. Many sets come with sophisticated options for adjusting display formats and automating image settings for different lighting environments and content types.

Our testing philosophy has always been to evaluate products in a real-world setting, using them the way a typical person would, day in and day out. With HDTVs, we also strive to level the playing field by manually adjusting settings to industry standards for optimal picture quality.

We continue to evaluate HDTV image quality using juries of editors, writers, and lab analysts, using a test script the lab developed. Our latest test methodology reflects the changing landscape of HDTV content, technology, and usage.

For example, while we once tested HDTVs with DVI inputs, today we test only sets using industry-standard HDMI inputs (in fact, the sets that come in now don’t even have DVI ports). We test in a room with 5000K daylight-balanced fluorescent bulbs.

We place the sets side by side on long tables, with the midpoint roughly at eye level for someone sitting in a chair. Vendors say they design their sets to be viewed this way.

Calibration routine: Don’t try this at home

In calibrating the sets, we use only controls available to consumers (as opposed to the service controls available to professional calibrators on some sets). But we use equipment and software most consumers won’t have: a Sencore OTC1000-CM (Optical Tristimulus) Colorimeter and Sencore MP500 MediaPro Digital Audio/Video Generator and HDMI Analyzer (both connected to an HP notebook), Sencore ColorPro by CalMAN software, and transparent color films from the HD Digital Video Essentials consumer calibration kit.

We start by checking to see if the sets can properly support the screen resolutions used in our tests: 480p 60Hz, 720p 60Hz, 1080i 30Hz, and 1080p 60Hz. We then disable presets and all automatic image adjustment controls, and set display format controls to “dot-by-dot,” “just scan,” or the equivalent, with the goal of having the set display all pixels sent to it (as opposed to inducing overscan or underscan). We then perform an overscan test to see which sets cut off parts of the image. The goal of the actual calibration routine is to achieve a white balance that matches the ITU-R Recommendation BT.709 (commonly referred to as Rec.709) at D65 (6504K) while maintaining a luminance close to 40 foot-Lamberts. This is the standard by which most films and TV shows are produced, so if you want to see a film or TV show the way the producer meant it to be seen, you want your set to be calibrated to this standard.
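The white-balance target above can be expressed numerically. As a rough sketch (our illustration, not the lab’s actual tooling): the D65 white point sits at CIE chromaticity x = 0.3127, y = 0.3290, and 40 foot-lamberts is about 137 cd/m² (1 fL ≈ 3.4263 cd/m²). Calibration amounts to driving the measured values within tolerance of those targets — the tolerances below are assumptions for illustration.

```python
# Standard targets: D65 chromaticity and the 40 fL luminance goal.
D65_X, D65_Y = 0.3127, 0.3290
TARGET_FL = 40.0
FL_TO_CDM2 = 3.4263  # 1 foot-lambert in cd/m^2

def check_white_point(x, y, luminance_fl, xy_tol=0.005, fl_tol=2.0):
    """Compare a measured white point against the Rec.709/D65 targets.

    Returns (within_tolerance, chromaticity_error, luminance_error).
    The tolerance values are illustrative assumptions.
    """
    dxy = ((x - D65_X) ** 2 + (y - D65_Y) ** 2) ** 0.5
    dfl = abs(luminance_fl - TARGET_FL)
    return (dxy <= xy_tol and dfl <= fl_tol), dxy, dfl

# A hypothetical colorimeter reading that's close to target:
ok, dxy, dfl = check_white_point(0.3130, 0.3295, 39.2)
```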

We start by adjusting the backlight to match 40 foot-Lamberts, then use the set’s color temperature settings to achieve the closest match to the desired 6504K. After that, we fine-tune brightness, contrast, color (saturation), tint (hue), and (if available) the gamma setting (which corrects for the nonlinearity of luminance encoding).

We then adjust individual color controls (most sets have red, green, and blue; some also have cyan, magenta, and yellow) to match as closely as possible their CIE chromaticity coordinates. (CIE stands for Commission Internationale de l’Eclairage—the International Commission on Illumination.)

We also calibrate for grayscale tracking, which adjusts the mixture of red, green, and blue that form grays to be as consistent as possible from dark to light. (The controls for this are called color gain and color offset.) The final refinement involves revisiting the brightness, contrast, color (saturation), and tint (hue) to correct variations from Rec.709, and also adjusting the sharpness control to achieve a sharp image without whitish artifacts.
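To illustrate what the gain and offset controls do (a simplified model of our own devising, not a description of any particular set’s firmware): each channel behaves roughly like a line, where gain scales the bright end and offset shifts the dark end. A channel with too much gain drifts progressively out of balance toward white — exactly the kind of error grayscale tracking corrects.

```python
def apply_gain_offset(signal, gain, offset):
    """Simple linear model of one channel's gain/offset controls,
    clamped to the displayable 0..1 range."""
    return max(0.0, min(1.0, gain * signal + offset))

# A hypothetical set whose red gain runs 10% hot:
steps = [0.1, 0.3, 0.5, 0.7, 0.9]          # gray levels, dark to light
red = [apply_gain_offset(s, 1.10, 0.0) for s in steps]
green = [apply_gain_offset(s, 1.00, 0.0) for s in steps]

# The red/green imbalance grows toward white -- a grayscale
# tracking error that gain adjustment would correct:
errors = [r - g for r, g in zip(red, green)]
```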

Our jury tests

The sets are now ready for jury testing. All content is distributed to the sets using a Radiient HDMI 2:6 HDTV Distribution Amplifier. Each jury consists of five experienced testers, who rate each video clip on four separate attributes: brightness/contrast, colors/skin tones, sharpness/detail, and overall image quality. We use a 5-point rating scale, with half points allowed. We typically test five or six sets at a time. To ensure consistency across test batches, we include a previously tested baseline HDTV of the same size as the new HDTVs in each test batch.

Content for the over-the-air tests was recorded from various antenna sources using a MyHD PCI card with a DVI-HD daughtercard on a PC test bed, and is played back on a Western Digital WDTV HD media streamer set to 1080i or 720p as appropriate. We use four clips: a 720p clip from a baseball game, a 720p clip from the Wheel of Fortune TV show, a 1080i clip from a football game, and a 1080i clip from a show about a vineyard that includes several scenic panoramas of different landscapes.

We use two scenes from the Phantom of the Opera DVD to test how well the sets handle 480p video content. Since most HDTVs now support 1080p content (so-called “full HD”), we also evaluate several 1080p Blu-ray clips: a scene from Mission: Impossible III (where a brick wall is prone to moiré and other artifacts), and two scenes from The Dark Knight, which test the set’s ability to display detail (especially in a night-time scene) and motion. In addition to the video tests, we evaluate a couple of files created to test HDTV refresh rates and show the difference between 60Hz, 120Hz, and 240Hz HDTVs in handling fast-moving videos. One is a horizontal pan of a blueprint conducted at 60Hz (interlaced); the other is a diagonal pan of a city performed at 24Hz (progressive). We also evaluate a still life image, comparing the set’s reproduction to the actual objects. Here we’re checking the set’s sharpness and color fidelity.
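The 24-frames-per-second pan is a stress test because 60 doesn’t divide evenly by 24: a 60Hz set must show alternate frames for three refreshes and two (the familiar 3:2 pulldown), which produces judder, while 120Hz and 240Hz sets can repeat every frame a uniform number of times. A quick sketch of that arithmetic (our illustration, not part of the test files themselves):

```python
def pulldown_cadence(refresh_hz, fps=24):
    """How many refreshes each source frame is shown for, over one
    second of video. Frames that get an extra refresh come first."""
    base, extra = divmod(refresh_hz, fps)
    return [base + 1] * extra + [base] * (fps - extra)

cadence_60 = pulldown_cadence(60)    # mixes 3s and 2s -> judder
cadence_120 = pulldown_cadence(120)  # uniform 5s -> smooth motion
```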

Finally, we use two synthetic tests from the popular HD HQV benchmark. A video resolution loss test reveals which sets are prone to incomplete rendering of finely detailed images, and a “jaggies” test shows how well a set can anti-alias (smooth out) moving diagonal strips.

Results of the subjective tests are averaged to produce final rankings.
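In outline, the averaging works like this (a sketch under our own assumptions — the article doesn’t specify a weighting, so this uses a plain mean across all jurors and all four attributes):

```python
# Attribute names come from the methodology above.
ATTRIBUTES = ["brightness/contrast", "colors/skin tones",
              "sharpness/detail", "overall image quality"]

def set_score(jury_ratings):
    """jury_ratings: one dict per juror mapping attribute -> rating
    on the 1-5 scale (half points allowed). Returns the plain mean."""
    scores = [juror[a] for juror in jury_ratings for a in ATTRIBUTES]
    return sum(scores) / len(scores)

def rank_sets(score_by_set):
    """Order the sets from highest to lowest average score."""
    return sorted(score_by_set, key=score_by_set.get, reverse=True)

# Hypothetical batch, including a previously tested baseline set:
ranking = rank_sets({"Set A": 3.5, "Set B": 4.0, "Baseline": 3.75})
```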

In addition to image quality, we also test power consumption using the Watts Up! Pro power meter. We first record average power consumption with the set powered down (but still connected to an outlet), then total consumption while running the Mission: Impossible III clip for five minutes. We run this test after the sets have been calibrated but before the jury testing.
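Converting those readings into energy is straightforward arithmetic (our back-of-the-envelope illustration, not the lab’s reporting formula): watt-hours are average watts times hours.

```python
def watt_hours(avg_watts, minutes):
    """Energy consumed at a given average draw over a given time."""
    return avg_watts * minutes / 60.0

# A hypothetical set averaging 150 W during the 5-minute clip:
active_wh = watt_hours(150.0, 5)
# And a 0.5 W standby draw over a full day:
standby_wh = watt_hours(0.5, 24 * 60)
```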
