Trends in Video Electronics

Image processing, which takes place before the signal is delivered to a display's drive electronics, is a key factor influencing image quality – and it is in the image-processing circuitry that most product features are implemented.

by Pierre Sicard

THE EVOLUTION OF digital displays, as well as the ever-increasing demand for more features and improved image quality, is driving IC designers to develop increasingly sophisticated chips. It is also a confusing field: there are so many buzzwords, so many different features, and so many ways to do the same thing. It is a digital jungle out there, and for a system designer it can be a real challenge to make the best choice among all the available chipsets. In addition, the designer must pay close attention to the details of each product specification; otherwise, a simple design can quickly turn into a nightmare and run over budget.

On the bright side, video-electronics design can be an enjoyable and exciting challenge for an engineer. Unlike in many other engineering disciplines, the end result depends not only on objective performance criteria but also on a major subjective notion called image quality. And that notion is a demanding one. Put 10 potential television buyers in front of the wall of TVs at the electronics store and it is guaranteed that they will not all pick the same set as their first choice. It is equally certain that each of them will apply a different set of criteria to decide what kind of image looks best.

Image processing, which takes place before the signal is delivered to the display's drive electronics, is one of the key factors influencing image quality. And it is in the image-processing circuitry that most of the product features are implemented. Let us review some of the recent advances in this field.

The Past Five Years

When I started in this business in 1998, the first image scalers – circuits that increase or decrease the number of lines and/or columns in a display – were already available on the market. The chip vendors were starting to compete to clinch the top spot in scaling technology, and each competitor had developed its own proprietary algorithm.

During that same time, we saw the birth of the Digital Visual Interface (DVI) standard, which was devised in an attempt to establish a uniform way for the display to "talk" to the system. Among the major problems was how to come up with a brand-new digital-interface standard while simultaneously supporting the legacy systems based on the analog VGA connector. One can say that, after fierce debate and struggle, the goal was achieved. However, the lifetime of any standard is limited, and we can already see new standards coming to replace and improve upon it.

Table 1: Critical Chip Features for Typical Display Applications

Application                   Critical Features
Medical displays              Image zoom, video de-interlacing, color management
Video walls, control rooms    Image shrink, frame-rate conversion, flexible output synchronization
Head-up displays              Image shrink, image warping
LCD monitors                  Signal auto-detect, on-chip A/D, LVDS, TCON
LCD projectors                Signal auto-detect, frame-rate conversion, horizontal and vertical keystone correction
LCD TVs                       SDTV and HDTV de-interlacing, color management, PIP, on-screen display, HDTV scaling and aspect-ratio conversion
Broadcast TV                  10-bit processing, high bandwidth, HDTV de-interlacing and scaling
Home theater                  Video de-interlacing, 10-bit processing, image warping, on-screen display


A large wave of market consolidation in 2000 and 2001, triggered by the huge potential of the LCD-monitor market, left only a few players at the top. On the technical side, vendors had to integrate as much functionality as possible on a single chip – an absolute requirement in this cost-sensitive market.

As early as 2001, the need for special processing of video was recognized. This added new complexity to the algorithms because of the fundamental differences, and difficulties, in dealing with video as opposed to computer-generated images. Display processors have continued to evolve, prompted by a sequence of specific market drivers (Fig. 1). The main drivers – LCD monitors, LCD projectors, and LCD TVs – have directed the vendors' innovations, but other applications impose different requirements on the display electronics (Table 1).

Beyond the 8-bit Frontier

The need for digitizing the image data with more than 8 bits per color channel has been recognized in some applications for several years, but only recently have products satisfying this requirement entered mainstream markets. Who would want more than 8 bits per color channel, and why?

Basically, it involves capturing and reproducing the wide color gamut characteristic of motion-picture film. About 8 bits per RGB component are sufficient for the human eye to perceive linearly coded color signals, but with 10 bits per color the dynamic range of each color component becomes comparable to that of real film, at least from the standpoint of human visual perception. Other sampling techniques, such as 8-bit logarithmic coding, are also in use, but consumer-electronics products are clearly leading the way toward the adoption of 10-bit linear sampling.
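To make the numbers concrete, here is a minimal Python sketch – an illustration added for this discussion, not anything from a product datasheet – comparing the quantization error of 8-bit and 10-bit linear coding of a smooth luminance ramp:

```python
# Compare worst-case quantization error of 8-bit vs. 10-bit linear coding
# of a smooth luminance ramp. Values are normalized to [0, 1].
import numpy as np

def quantize(signal, bits):
    """Quantize a [0, 1] signal to 2**bits discrete levels."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

ramp = np.linspace(0.0, 1.0, 100_000)  # a smooth gradient

for bits in (8, 10):
    err = np.max(np.abs(quantize(ramp, bits) - ramp))
    print(f"{bits}-bit: {2 ** bits} levels, max quantization error = {err:.6f}")

# 10 bits gives four times as many levels as 8 bits, so the worst-case step
# between adjacent codes shrinks by a factor of four -- the extra headroom
# that wide-gamut, film-like material needs.
```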

The spread of this 10-bit requirement across the marketplace is due in part to the emergence of high-definition television (HDTV). Manufacturers of LCD projectors and high-end LCD TVs now demand that all video processing be done with a precision of 10 bits per color. In broadcast TV, the digitization of HDTV signals is standardized in a 10-bit format, and all image-processing hardware for this market must support 10-bit processing on each color channel.

Analog Integration?

Integration of front-end analog-to-digital (A/D) conversion into the scaler has been achieved successfully by many chip vendors. This was required for the LCD-monitor market and is a good solution for other products, such as controller boards for LCD panels. In the case of LCD monitors, cost is the dominant factor – it sometimes seems to be the only one – and the addition of an extra component for the A/D conversion is simply not an option.

In applications such as controller boards, cost is critical, but, in addition, real estate can be limited. These boards must have a small footprint, with only one major component if possible.

On the other hand, mixed-signal chip solutions will not be a platform for growth in newer applications such as LCD TVs and digital television. Fundamentally, integrating the analog conversion block creates two main problems: (1) quality degradation and (2) a slowdown of digital integration. Mixed-signal design is extremely difficult. The silicon must have separate areas for the digital gates and the analog portion, and gate switching on the digital side can inject noise into the analog block. This is why A/D converters integrated on digital chips cannot perform as well as their stand-alone equivalents.


Fig. 1: The continuous evolution of display processors has been prompted by a sequence of specific market drivers. Beginning about 2001, designers began to recognize the need for special processing for video as opposed to computer-generated images.


Another factor that weighs against analog integration is that mixed-signal technology will not be able to sustain the rapidly increasing scale of integration required to meet future demand. An analog block requires larger geometries than a digital block, so many more transistors can be squeezed into a given area of digital silicon than into an equivalent analog area. With the ever-increasing demand for new features that the controller/scaler chip must support, it is absolutely critical for chip designers to use the latest process technology available for the digital portion. When the analog portion sits on the same piece of silicon, this may not be possible, thus limiting the level of integration. Chip manufacturers will continue to make mixed-signal devices for some time, but it will soon become imperative to separate the digital portion so that all the features required by the market can be supported.

De-interlacing and Scaling

Because most digital display systems are scanned progressively, the interlaced input video source must be converted before the image is scaled to fit the display's pixel format. Many de-interlacing techniques are available on the market, ranging from simple "field merge" to much more advanced "motion-compensated" solutions. Image artifacts cannot be totally eliminated, but sophisticated de-interlacing algorithms can reduce them to a very small (and acceptable) level. These techniques use a blend of spatial and temporal filtering: at each pixel location, the motion is estimated, and a decision is then made on the proper dosage of spatial and temporal filtering to apply.
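The NumPy sketch below illustrates that blending idea in its simplest form. It is a toy model, not any vendor's proprietary algorithm: the motion estimate is a bare per-pixel field difference, and the threshold value is an arbitrary assumption.

```python
import numpy as np

def deinterlace_even_field(prev_frame, field, threshold=12.0):
    """Motion-adaptive fill-in of the missing (odd) lines.

    prev_frame: the last fully de-interlaced frame, shape (2*h, w).
    field:      the new field, holding the even lines, shape (h, w).
    (For the very first frame there is no history, so a real device
    would fall back to pure spatial interpolation.)
    """
    field = field.astype(np.float32)
    prev_frame = prev_frame.astype(np.float32)
    h, w = field.shape
    out = np.empty((2 * h, w), dtype=np.float32)
    out[0::2] = field                                # keep the new lines

    # Spatial candidate ("bob"): average the lines above and below,
    # clamping at the bottom edge.
    below = np.vstack([field[1:], field[-1:]])
    spatial = (field + below) / 2.0

    # Temporal candidate ("weave"): reuse the co-sited odd lines of the
    # previous frame.
    temporal = prev_frame[1::2]

    # Crude per-pixel motion estimate: how much did the even lines change
    # since the previous frame?
    motion = np.abs(field - prev_frame[0::2])
    alpha = np.clip(motion / threshold, 0.0, 1.0)    # 0 = still, 1 = moving

    # Still areas keep the sharp temporal value; moving areas fall back
    # to the artifact-free spatial interpolation.
    out[1::2] = alpha * spatial + (1.0 - alpha) * temporal
    return out
```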

This is a computationally demanding task, given the number of calculations that must be performed in real time – especially for high-definition video, which has roughly six times the pixels of standard-definition video. In addition, special provision must be made for angled edges, which are a source of artifacts if they are not de-interlaced properly. A further complexity with film-sourced material is the need to account for "bad edit" frames and the pull-down sequence. There are as many de-interlacing techniques as there are vendors, but what counts for a system designer is the end result, and the only way to determine which solution to use is to evaluate and compare them.
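As an illustration of cadence handling, the toy detector below – a sketch under simplifying assumptions, not a production algorithm – exploits the signature of 3:2 pull-down: in telecined film, one field in every five is a repeat, so same-parity field differences dip to near zero with a 5-field period.

```python
import numpy as np

def field_diffs(fields):
    """Mean absolute difference between each field and the previous field
    of the same parity (two positions earlier)."""
    return np.array([np.mean(np.abs(fields[i].astype(np.float32)
                                    - fields[i - 2].astype(np.float32)))
                     for i in range(2, len(fields))])

def looks_like_pulldown(fields, rel_threshold=0.1):
    """Heuristic: True if near-zero differences recur every 5 fields."""
    d = field_diffs(fields)
    if len(d) < 10:
        return False                       # not enough history to decide
    # Repeated fields produce differences far below the typical motion level.
    repeats = np.where(d < rel_threshold * np.median(d))[0]
    # A clean 3:2 cadence spaces these repeats exactly 5 fields apart.
    return len(repeats) >= 2 and bool(np.all(np.diff(repeats) % 5 == 0))
```

Once the cadence is locked, the original film frames can be reassembled exactly; a "bad edit" shows up as a break in the 5-field pattern and forces the detector to re-lock.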

High-performance scalers are now available at affordable prices, but the applications are becoming more demanding. So, even if it is a relatively "old" technology, scaling should not be taken for granted. Not all scalers produce the same level of quality. With the advent of HDTV, the scaling process takes on a new importance because of the need to downscale – reduce the number of display lines and/or columns. This is precisely where scaling artifacts will show up.

In applications such as video walls, where this reduction – or shrink ratio – can be as high as 6:1, the scaling quality is also a key factor. In this kind of application, only a few high-end scalers can efficiently eliminate "aliasing" artifacts.
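A quick way to see why: the sketch below (illustrative values only) decimates a fine stripe pattern by 6:1, once by naive point sampling and once through a simple 6 x 6 area-average pre-filter. Real scalers use polyphase FIR filters rather than a box average, but the effect is the same in kind.

```python
import numpy as np

h = w = 600
x = np.tile(np.arange(w), (h, 1))
test = 0.5 + 0.5 * np.sin(2 * np.pi * x / 7.0)   # stripes, 7-pixel period

shrink = 6                                        # a 6:1 video-wall shrink
naive = test[::shrink, ::shrink]                  # point sampling

prefiltered = test.reshape(h // shrink, shrink,
                           w // shrink, shrink).mean(axis=(1, 3))

# The 7-pixel stripes lie far above the Nyquist limit of the 6:1 output
# grid. Point sampling folds them into a bold low-frequency beat pattern
# that never existed in the source; the pre-filter strongly attenuates it.
print("naive contrast:      ", naive.std())        # ~0.35: strong alias
print("prefiltered contrast:", prefiltered.std())  # ~0.06: mostly removed
```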

Video de-interlacing and image scaling are two separate functions. They used to be found on separate components, but it now makes sense to combine them into a single device. Chip manufacturers, such as i-Chips Technology, now offer advanced integrated solutions that combine de-interlacing and scaling, even for supporting high-definition video.


Fig. 2: Color management improves image quality. Color enhancement as implemented by the i-Chips Technology processor is turned off (left) and on (right).


Color and Contrast

Color is a science in itself, and it is taking on increasing importance in modern video-electronics systems. Far removed from the "color" and "tint" controls of first-generation color TVs, the new digital TVs and high-end projectors, as well as other systems, require advanced color-management techniques. There are two main reasons for this: to achieve product consistency independent of the components used, and to adapt to market preferences. In the world of LCD TVs, it is important for manufacturers to achieve a consistent color temperature throughout a product line. This is difficult when the same product can be built with different versions of critical components such as the LCD panel, video decoder, and de-interlacer/scaler, or when these components are sourced from different manufacturers. A mechanism is therefore needed to balance out these inequalities. The ability to adjust to market preferences matters because not everyone perceives colors in the same way.

Color management can be achieved digitally in systems having various degrees of complexity. One simple form of color management is to change the gamma table on the fly depending on the type of input image or other criteria. More advanced techniques make use of programmable mapping of the color coordinates and even the adjustment of specific colors, which is called color warping (Fig. 2).
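Here is a minimal sketch of two such stages, with made-up calibration values (the matrix and the 2.2 gamma are illustrative assumptions, not any product's data): a programmable 3 x 3 matrix remaps the RGB coordinates, and a 1024-entry (10-bit) lookup table applies the gamma curve.

```python
import numpy as np

def apply_color_pipeline(rgb, matrix, gamma_lut):
    """rgb: (h, w, 3) float image in [0, 1]; gamma_lut: (1024,) table."""
    mapped = np.clip(rgb @ matrix.T, 0.0, 1.0)       # color-coordinate remap
    idx = np.round(mapped * (len(gamma_lut) - 1)).astype(np.int32)
    return gamma_lut[idx]                            # per-channel gamma LUT

# Example: slightly warm the white point (attenuate green and blue) and
# apply a 2.2 gamma through a 10-bit table.
warm = np.array([[1.00, 0.00, 0.00],
                 [0.00, 0.98, 0.00],
                 [0.00, 0.00, 0.94]])
lut = np.linspace(0.0, 1.0, 1024) ** (1.0 / 2.2)

frame = np.random.rand(4, 4, 3)                      # stand-in for an image
out = apply_color_pipeline(frame, warm, lut)
```

Swapping the gamma table on the fly amounts to replacing `lut` per frame; color warping generalizes the fixed matrix to a programmable, region-dependent mapping.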

Other functions, such as black-and-white stretch and automatic contrast, must also be available in the digital video chain. One of the key features a display processor must provide is real-time measurement of the image's intensity histogram. At every frame, the histogram is calculated to establish the dynamic range, from which the optimal contrast level is derived. A black-and-white stretch operation can then automatically darken the low-level intensities and lighten the high-level ones, giving the picture more visual impact and improving the viewing experience.

Edge and detail enhancement is another way to improve the image quality of moving pictures (Fig. 3). This processing stage is normally required after image scaling to recover the high-frequency content lost from the original image. The degree of sophistication of this stage is critical to achieving acceptable image quality, and the results vary widely from vendor to vendor.
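Here is a minimal sketch of the histogram-driven black-and-white stretch described above; the percentile clipping points are illustrative choices, not a standard.

```python
import numpy as np

def auto_contrast(luma, low_pct=1.0, high_pct=99.0):
    """luma: (h, w) array in [0, 255]. Returns the stretched frame."""
    hist, _ = np.histogram(luma, bins=256, range=(0, 255))
    cdf = np.cumsum(hist) / hist.sum()
    black = np.searchsorted(cdf, low_pct / 100.0)    # darkest useful level
    white = np.searchsorted(cdf, high_pct / 100.0)   # brightest useful level
    if white <= black:
        return luma                                  # degenerate histogram
    stretched = (luma - black) * (255.0 / (white - black))
    return np.clip(stretched, 0.0, 255.0)
```

A real implementation would also smooth the black and white points over time so that the contrast does not visibly pump from frame to frame.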

Conclusions

This is an exciting time for the video-electronics business. The coming years will be characterized by continuous product differentiation and innovation, and this will increasingly be implemented at the chip level. The markets served by video electronics are getting more segmented vertically; each application requires specific and distinct capabilities. Distinct chip features will have to be targeted at each market and product type. High-end markets, such as digital cinema, home theater, and broadcast TV, will see tremendous growth, and each will require customized digital processing.

As the main focus migrates from LCD monitors to consumer TV, chip designers will face new challenges. With the pressure for high integration – driven by the absolute necessity of low cost – now somewhat eased, the task becomes one of finding innovative solutions and algorithms that provide an enjoyable viewing experience for the end user. This calls for a great deal of expertise and creativity.

While existing algorithms will continue to improve, new techniques and features will emerge to further enhance the viewing experience. In many cases, a digital solution requires an extremely complex implementation in order to produce a relatively simple effect on the image – noise filtering, de-interlacing, and automatic contrast are all examples. Our growing understanding of human visual perception will supply ideas for higher-performance image algorithms; studying how the brain perceives images and moving pictures will lead to innovation.

The challenge will be not only to improve image quality, but also to make fundamentally better products. How does one make a more "user-friendly" product? It will require automating the adjustments – such as automatic keystone correction in projectors – which in turn means embedding more intelligence in the electronics and possibly integrating other technologies, such as sensing and pattern recognition. But ultimately, all this technology must provide the end user with a more enjoyable viewing experience. •


Fig. 3: Detail enhancement to improve image quality as implemented by the i-Chips Technology processor is turned off (left) and on (right).



Pierre Sicard is Product Manager at Jepico America, Inc., 2041 Mission College Blvd., Suite 250, Santa Clara, CA 95054; telephone 408/844-0530, ext. 101, fax 408/844-0536, e-mail: psicard@jepicoamerica.com.