Measuring Motion Blur in Liquid-Crystal Displays

Motion blur is one of the biggest problems facing LCD designers today. Here, an experimentally simple method to determine motion blur in LCDs is proposed, based on a luminance data-acquisition system that accurately measures the temporal step response of various luminance transitions on the display.

by Kees Teunissen, Xiaohua Li, and Ingrid Heynderickx

ALTHOUGH liquid-crystal displays (LCDs) are being increasingly used for television applications thanks to their reduced weight and thickness (enabling larger displays and more-appealing designs), studies have shown that the picture quality of cathode-ray-tube (CRT) TVs is still preferred over that of LCD TVs.1 One possible reason for the lesser performance of LCD TVs may be their temporal behavior. When an object moves across an LCD screen, it becomes blurred for two reasons: (1) the relatively slow LC response time (LC-RT), i.e., the time needed to change from one luminance level to another, and (2) the sample-and-hold effect, i.e., the fact that a pixel maintains a stable luminance value for an entire frame time, as opposed to a CRT, where a pixel is activated for only a fraction of a frame time.

The LC-RT used in LCD-panel specifications is often suggested to be representative of the dynamic performance of the LCD. But it is actually the sample-and-hold effect that becomes the dominant factor for motion blur when the LC-RT is sufficiently short. Indeed, it is the combination of two factors that results in perceived motion blur (see Fig. 1) at sufficiently high speeds (> 4 pixels/frame) and relatively long frame periods (> 15 msec): (1) the fact that an object assumed to move continuously across the display actually stays fixed for a whole frame time and then jumps to another location on the screen for the succeeding frame, and (2) the eye's tracking of the expected continuous motion of the object.
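A back-of-the-envelope estimate illustrates the size of the sample-and-hold contribution alone. The sketch below assumes an ideal hold-type display with an instantaneous LC response; the numbers are illustrative examples, not measurements from this work.

```latex
% Illustrative sketch: retinal smear on an ideal hold-type display
% (assumes zero LC response time; not taken from the article's model).
% While the eye tracks the object at speed v, each pixel holds its value for a
% full frame period T_f, so an edge is smeared on the retina over roughly
\[
  w_{\mathrm{hold}} \approx v \, T_f .
\]
% Example: at v = 4 pixels per frame (i.e., 4/T_f pixels per second), the smear
% is about 4 pixels; a finite LC response time broadens the edge further.
```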


Kees Teunissen is Program Manager for Displays at Philips Consumer Electronics, Innovation Laboratory, Bldg. SFJ 5.22, Glaslaan 2, 5616 LW, Eindhoven, The Netherlands; telephone +31-40-27-33114, fax -34482, e-mail: Kees.Teunissen@philips.com. Xiaohua Li is a professor at Southeast University, Nanjing, China. Ingrid Heynderickx is a Research Fellow in the Visual Experience Group at Philips Research Laboratories and is a professor at the Delft University of Technology, Delft, The Netherlands.


Fig. 1: Image blur during smooth-pursuit tracking of an image moving to the right on an LCD with a continuous backlight. The left side of the figure shows the image on the display and the eye at four time points during one frame. The four cars at the top are the stationary image on the display during one frame while the eye is in smooth pursuit. The four cars at the bottom of the figure show how the image on the retina blurs as the eye rotates during pursuit. The right side of the figure shows the beginning of the next frame as the car jumps to a new location on the screen.


Two measuring techniques have been introduced to characterize perceived motion blur in LCDs. The first directly measures the edge blur with a smooth-pursuit camera system.2 The accuracy of this method, however, depends largely on how precisely the rotating camera or rotating mirror tracks the motion and keeps the image fixed on the camera sensor. The second technique measures the temporal luminance transitions and calculates the edge blur assuming smooth-pursuit eye tracking and temporal integration.3,4 The disadvantage of this method is that a conversion from the temporal domain to the spatial domain is needed in order to determine the edge blur. However, the second system is much easier to build and can also be used to characterize other motion artifacts, such as dynamic false contours in plasma displays.3 Hence, the second method is preferred because it is simpler, less expensive to implement, and more flexible.

Measurement System

A schematic representation of the equipment used to measure the temporal luminance transitions (step responses) is shown in Fig. 2. A fast-response photodiode captures the temporal luminance variations at a fixed position on the display. Its output is connected to the data-acquisition system via additional circuitry for signal amplification and low-pass filtering. To obtain luminance values from the photodiode, it is important that the spectral sensitivity of the sensor is matched to the spectral luminous-efficiency function V(λ) for photopic vision. A specially designed programmable video-pattern generator is used to produce the required test patterns and send stable synchronized triggering signals to the data-acquisition system. The luminance meter in the system allows for converting the voltage values to luminance values during the initial calibration. A computer is used to control the entire measurement system.
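As an illustration of the kind of processing such a system performs, the sketch below converts a photodiode voltage trace to luminance with a single calibration factor and extracts the 10%-90% response time of one transition. It is a minimal example, not the authors' software; the calibration factor, sampling rate, and data are assumed, and a purely linear sensor response is presumed.

```python
import numpy as np

def voltage_to_luminance(voltage, cal_factor):
    """Convert photodiode voltages (V) to luminance (cd/m^2).

    cal_factor comes from the initial calibration against the luminance meter;
    a linear sensor response is assumed here for simplicity.
    """
    return cal_factor * np.asarray(voltage, dtype=float)

def response_time_10_90(luminance, sample_rate_hz):
    """Return the 10%-90% response time (ms) of a measured step response."""
    lum = np.asarray(luminance, dtype=float)
    y0, yf = lum[0], lum[-1]                   # initial and final levels
    lo = y0 + 0.10 * (yf - y0)                 # 10% point of the transition
    hi = y0 + 0.90 * (yf - y0)                 # 90% point of the transition
    t = np.arange(lum.size) / sample_rate_hz   # time axis (s)
    rising = yf >= y0
    i_lo = np.argmax(lum >= lo) if rising else np.argmax(lum <= lo)
    i_hi = np.argmax(lum >= hi) if rising else np.argmax(lum <= hi)
    return abs(t[i_hi] - t[i_lo]) * 1e3

# Hypothetical trace: a 0-to-200-cd/m^2 transition sampled at 10 kHz
volts = np.concatenate([np.zeros(50), np.linspace(0.0, 1.0, 80), np.full(120, 1.0)])
lum = voltage_to_luminance(volts, cal_factor=200.0)  # 1 V -> 200 cd/m^2 (assumed)
print(f"LC response time: {response_time_10_90(lum, 10_000):.1f} ms")
```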

Modeling LCD Motion Blur

The model for perceived motion artifacts essentially starts from the accurate measurement of the temporal step responses, and adds the assumptions of smooth-pursuit eye tracking and temporal integration in the visual system.5 For example, when a white block moves across a dark background at a given speed, the perceived blur of the front edge can be calculated from the light-intensity profile V0 as perceived by the human eye using the following equations:

[Eq. (1)]

[Eq. (2)]

where xi is the position projected on the retina; j is an integration index; Tf is the frame period; v is the motion speed of the edge; Y is the measured temporal luminance, with Y0 and YF the initial and final luminance, respectively; and Y1, Y2, and Y3 are the sampled luminance data, extending up to three frame times. The blurred-edge width (BEW) is then calculated from V0(xi) as the distance (in pixels) between a luminance of 10% and 90% for the rising edge (head) and between 90% and 10% for the falling edge (tail).
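A common formulation of this kind of smooth-pursuit temporal-integration model, written to be consistent with the definitions above, can be sketched as follows; it should be read as an illustration rather than the exact published expressions.

```latex
% Sketch of a smooth-pursuit temporal-integration model (assumed formulation).
% The eye follows the edge at speed v, so retinal position x_i samples the
% displayed luminance along the trajectory x = x_i + v t; averaging over one
% frame period T_f yields the perceived profile
\[
  V_0(x_i) \;=\; \frac{1}{T_f} \int_{0}^{T_f} Y\!\left(t + \frac{x_i}{v}\right) \mathrm{d}t ,
\]
% where Y(t) is the measured temporal step response (spanning Y_1, Y_2, Y_3 when
% the transition extends over several frames). The blurred-edge width follows as
\[
  \mathrm{BEW} \;=\; \left| \, x_{90\%} - x_{10\%} \, \right| ,
\]
% with x_{10%} and x_{90%} the positions where V_0 reaches 10% and 90% of the
% full swing between Y_0 and Y_F.
```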

With this measurement system, several luminance transitions (Yi) are measured, and from these, the BEW at different motion speeds is calculated. The blurred-edge width for the head and tail as a function of motion speed, with the transition level as a parameter, is shown in Fig. 3. It illustrates that the edge blur is smallest for the black-to-white transition (Y0→Y6), that it increases with increasing motion speed, and that it is different for rising and falling edges.
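Numerically, the same calculation can be sketched as below: the measured step response is box-filtered over one frame period (the temporal-integration step), mapped to the spatial domain via the motion speed, and the 10%-90% width is read off. The function and the example data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def blurred_edge_width(step_response, samples_per_frame, speed_ppf):
    """Estimate the blurred-edge width (pixels) from a temporal step response.

    step_response: uniformly sampled luminance of one transition (any units).
    samples_per_frame: number of samples within one frame period T_f.
    speed_ppf: motion speed in pixels per frame.
    Assumes smooth-pursuit tracking and temporal integration over one frame.
    """
    y = np.asarray(step_response, dtype=float)
    # Temporal integration: moving average over one frame period (box filter)
    kernel = np.ones(samples_per_frame) / samples_per_frame
    v0 = np.convolve(y, kernel, mode="valid")
    # Map the sample axis to retinal position: one frame period <-> speed_ppf pixels
    x = np.arange(v0.size) * speed_ppf / samples_per_frame
    y0, yf = v0[0], v0[-1]
    lo, hi = y0 + 0.1 * (yf - y0), y0 + 0.9 * (yf - y0)
    rising = yf >= y0
    x_lo = x[np.argmax(v0 >= lo) if rising else np.argmax(v0 <= lo)]
    x_hi = x[np.argmax(v0 >= hi) if rising else np.argmax(v0 <= hi)]
    return abs(x_hi - x_lo)

# Hypothetical check: even an instantaneous LC transition shows hold-type blur.
ideal_step = np.concatenate([np.zeros(200), np.full(400, 200.0)])  # cd/m^2
print(blurred_edge_width(ideal_step, samples_per_frame=100, speed_ppf=8.0))  # ~6.4 px
```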

Perceived LCD Motion Blur

How well the calculated edge blur corresponds to the motion blur that is actually perceived was validated in a perception experiment. In this experiment, blurred images were simulated for a limited number of transition levels and speeds. The transition levels selected were those shown in Fig. 3 and consisted of five transitions corresponding to a bright block moving on a dark background and three transitions corresponding to a dark block moving on a bright background (Y6→Y0, Y5→Y1, and Y4→Y2). Four speeds were used: 4, 8, 12, and 16 pixels/frame (the latter corresponds to 29°/sec). The simulated still image was presented on the top half of an LCD screen, while a moving, intrinsically sharp image was presented on the bottom half.


Fig. 2: Schematic representation of the system to measure the temporal step responses at a fixed location on the display under test.

 

This moving image was shown in a loop, starting from behind a vertical bar, traversing the window from left to right at a user-adjustable speed, and disappearing behind the other vertical bar (see Fig. 4).

For each of the 32 stimuli (i.e., 8 transitions × 4 speeds), 17 observers (between 22 and 48 years old) were asked to adjust the speed of the moving block such that its appearance corresponded best to that of the simulated still image. For that particular speed, the observer had to score the match in blur between the simulated block and the actual moving block for the leading and trailing edges separately on a 5-point ITU-R BT.500 quality scale. The viewing distance (50 cm) was chosen to be small enough that the observers were able to distinguish the smallest sharpness differences. The illumination level was less than 5 lx, measured on the screen.

The results showed that the selected speed for the moving block very closely matched the simulated speed (a regression coefficient of 1.008 with a correlation coefficient of 0.9999). This match was equally good for the leading and trailing edges. The average scores for the match were above 4.4 in all cases, implying that the edge blur of the simulated image was almost identical to the edge blur perceived in the moving image.

Perceptually Relevant Measure for Motion Blur

A second perception experiment was designed to determine a perceptually relevant measure for motion blur. It is well known that not only the slope of an edge, but also its luminance difference, affects the perceived sharpness of a luminance transition from one level to another. Hence, in this second experiment, we included (1) stimuli with the same BEW values, but different initial and final luminance levels, and (2) stimuli with different BEW values, but with the same initial and final luminance levels. The 13 transitions selected and the corresponding BEW values at four different speeds are summarized in Table 1.

The experimental set-up is shown in Fig. 5. Two reference images were presented at the top half of the screen of a 26-in. LCD: a very blurred one corresponding to transition No. 11 with a motion speed of 16 ppf and a very sharp one corresponding to transition No. 2 with a motion speed of 0 ppf. The test stimulus, corresponding to one of the 13 transitions at a given speed, was presented at the bottom of the display. The participants (12 male and 11 female with an average age of 30.2 years) were asked to score the sharpness of this stimulus on an 11-point scale with the blurred reference corresponding to a 1 on this scale and the sharp reference to a 9. The background luminance on the display was 97 cd/m2. The viewing distance was three times the screen height (about 1 m), and an illumination level of about 5 lx was measured on the screen.


Fig. 3: Calculated blurred-edge width for the head and tail as a function of motion speed, with the different luminance transitions as a parameter.

 


Fig. 4: Appearance of the stimuli during the experiment to adjust the speed of the moving block and to assess the correspondence of the edge blur of the still and moving blocks. The yellow arrow indicates the direction of motion.

The resulting perceived sharpness scores (Sh) were linearly fit to normalized BEW (BEWn) and contrast (Cn) values, yielding the equation:

[Eq. (3)]

Here, luminance contrast is defined as

[Eq. (4)]

This fit is shown in Fig. 6 and has a correlation coefficient of 0.979. As expected, the relationship shows that increasing contrast results in higher perceived sharpness, whereas increasing BEW results in lower perceived sharpness. The influence of BEW on sharpness is much greater than that of contrast.
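Based on the description, the fitted expression has the general shape sketched below. The coefficients are left symbolic because they are not given here, and the Michelson-type contrast is an assumed form rather than necessarily the exact definition used in the experiment.

```latex
% Assumed shape of the linear sharpness model described in the text
% (coefficients a_0, a_1, a_2 > 0 come from the fit and are not reproduced here):
\[
  S_h \;=\; a_0 \;-\; a_1\,\mathrm{BEW}_n \;+\; a_2\,C_n ,
\]
% with BEW_n and C_n the normalized blurred-edge width and luminance contrast.
% One common contrast definition (assumed here) uses the initial and final
% luminance levels Y_0 and Y_F:
\[
  C \;=\; \frac{Y_F - Y_0}{Y_F + Y_0} .
\]
```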

Conclusions

A measurement system, modeling method, and perception protocol were established to evaluate LCD motion blur. The temporal luminance variation is precisely acquired with a fast, eye-sensitivity-compensated photodiode in combination with a data-acquisition system. The temporal-response waveforms are used as input for the simulation model, which assumes smooth-pursuit eye tracking and temporal integration of the luminance data. The model was validated by a perception experiment, and a high correlation was found between the measured and perceived motion blur.


Table 1: Initial and final luminance levels and the blur levels for the stimuli used in the experiment (ppf – pixels per frame).

 

Transition   Initial luminance   Final luminance   BEW values (in pixels) for edges calculated with speeds of:
number       (cd/m2)             (cd/m2)           0 ppf     4 ppf     8 ppf     16 ppf
1            468                 1.0               0         4.7       8.2       15.2
2            1.0                 468               0         4.6       7.9       14.8
3            1.0                 295               0         5.6       10.1      19.2
4            38.6                468               0         4.6       7.9       14.6
5            406                 54.8              0         4.9       8.7       16.4
6            210                 38.6              0         4.9       8.7       16.4
7            54.8                254               0         5.6       10.1      19.2
8            346                 114               0         4.9       8.8       16.6
9            174                 468               0         4.6       7.8       14.5
10           141                 54.8              0         5.0       8.9       16.8
11           114                 174               0         5.7       10.2      19.3
12           346                 254               0         5.0       8.9       16.8
13           468                 406               0         4.9       8.7       16.3


Fig. 5: Experimental set-up with a subject assessing the perceived sharpness of the test stimulus (bottom) with respect to the two anchor stimuli (top).


Fig. 6: Calculated sharpness plotted versus the average sharpness scores.

Hence, we can conclude that the perceived edge blur during motion is very well simulated by our model and, thus, that this method is a good alternative to the smooth-pursuit camera system. Lastly, a perceptual metric that predicts the perceived sharpness of a moving edge from the combination of BEW and luminance contrast was established.

References

1. I. Heynderickx et al., "Image Quality Comparison of PDP, LCD, CRT, and LCoS Projection," SID Symposium Digest Tech. Papers 36, 1502-1505 (2005).

2. Y. Igarashi, "Summary of Moving Picture Response Time (MPRT) and Futures," SID Symposium Digest Tech. Papers 35, 1262-1265 (2004).

3. C. Teunissen et al., "Method for predicting motion artifacts in matrix displays," J. Soc. Info. Display 14/10, 957-964 (2006).

4. C. Teunissen et al., "Perceived Motion Blur in LCDs," Proc. IDW Digest '06, 1463-1466 (2006).

5. T. Kurita, "Moving Picture Quality Improvement for Hold-Type AMLCDs," SID Symposium Digest Tech. Papers 32, 986-989 (2001).