Gazing at the Future of Monitors
A strange contradiction exists in the display business. While there have been major improvements in many of the displays in our lives, including mobile devices and TVs, few of these improvements have reached the displays many of us spend the most time with – desktop monitors.
by Bob Raikes
In recent years, there have been dramatic upgrades in the visual performance of displays, many of them driven by Apple’s marketing of the concept of the “Retina display.” Apple took advantage of the great technology available from display makers to create products that simply looked better. Consumers, it seemed, were happy to pay more for the compelling visual and intuitive interactive experience. TVs have also undergone sweeping improvements in terms of form factor (thin and/or curved), resolution, and price.
Yet the revolution in mobile displays – one that is bringing more than 500 ppi, a wide color gamut, wide viewing angles, and high brightness to our devices – is barely impacting the world of desktop monitors, the display we spend the most time with. As somebody said to me this week: “I spend more time with my monitor than with my wife!”
Admittedly, there has been an increase in the share of the market taken by monitors with in-plane-switching (IPS) panels for wider viewing angles, and there is a move by some vendors to boost vertical-alignment (VA) LCD panels in monitors, but the vast majority of shipping monitors use twisted-nematic (TN) technology – long abandoned in tablets and smartphones. There is a huge installed base (probably close to a billion) of monitors with only 70–100 ppi, poor viewing angles, and low color performance.
Although mobile devices are garnering an ever-increasing market share, monitors will continue to be important for most PC users. We saw, after the advent of netbooks, that good-sized displays are essential for heavy usage, and, of course, in the European Union, employers are obliged by health and safety law to provide users, even those with notebooks, with monitors that can tilt and swivel. Many netbook buyers later went back and bought monitors to use with them. The peak of netbook sales in 2010 is the only period in which monitor sales have not dropped in recent years.
The Unchanging Desktop
Another strange thing about the desktop experience is that it has not really changed for a very long time. I remember seeing a Perq workstation at a technology event around 1980 or 1981 that had a graphic screen, a keyboard, and a mouse. The interaction model with desktop PCs has not changed significantly since, although it was Apple that made it accessible and Microsoft and Intel that made it pervasive.
At Display Week 2014, Samsung showed some monitors in a new, curved form factor designed to create a more immersive desktop experience. But it seems to me that there is a great opportunity for someone to do for the desktop what Apple did for the mobile phone, by transforming the interaction experience. Over the last two or three years, I have been repeatedly asked by clients, “What about touch on the desktop?” For a couple of years, I made it pretty clear that touch on the desktop was not going to be important. If you hold a device in your hand, touch is clearly a good idea. If you have the display attached to the keyboard, as you do in a notebook, it could work, but it does not make as much sense.
However, desktop monitors are used at a greater distance from the operator than notebooks or tablets (it is a function of the human visual system that viewing is more comfortable when displays sit near the “resting point of vergence” – typically 90 cm when the user is looking slightly down). That means that to touch them, the arm has to be fully extended, and that is not a practical maneuver throughout the working day. I can see the headlines about repetitive stress injury in shoulders!
Alternatives to Touch
I am nevertheless convinced that a transformation of the user experience is possible. Some people have suggested to me that voice input might be a good match for computer input, but most office workers do not want to say out loud what they are writing (the people in the airline seats around me as I write this article would not be very happy if I were dictating it). Furthermore, intensive use of the voice is tiring and uncomfortable, as anyone who has run a full-day training course will know.
Could it be gestures? I can imagine some kind of swipe or zoom gesture, close to the keyboard, being useful, but it’s hard to see what advantage there is to swiping a screen over swiping a touchpad, for example, so I cannot back that idea for general input.
I think there is a technology that can transform the user experience, and that is gaze recognition. With this technology, a system consisting of a couple of cameras and an infrared light source can track where on the screen the user is looking. It’s not as accurate as using a mouse or trackpad, but it’s broadly as accurate as touch. That does mean you need an operating system designed to work well with touch. That’s not the problem it used to be, now that Windows 8 is here – although the touch interface has not proved popular with desktop users. (Could that be because it is not used with gaze?)
Personally, I like to have my mouse set up so that it is not particularly sensitive, which makes fine work easier. However, that also makes it painful to move the cursor a long way, and I frequently lose track of the cursor when I’m back at base with my notebook and a large desktop display. How much better it would be if the cursor moved to where I was looking!
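To make the idea concrete, here is a minimal sketch in Python of how gaze-driven cursor warping might work. Everything in it is hypothetical – real trackers deliver gaze samples through vendor SDKs, and the class name, thresholds, and sample format are my own illustration: the cursor jumps only when the gaze settles into a stable fixation, leaving fine positioning to the mouse.

```python
import math
from dataclasses import dataclass, field

@dataclass
class GazeCursor:
    """Hypothetical sketch: warp the cursor to a stable gaze fixation.

    Gaze samples arrive as plain (x, y) screen coordinates; the cursor
    warps only when the last few samples cluster tightly, so mouse
    precision is preserved for fine work.
    """
    fixation_radius: float = 40.0   # px: samples this close count as one fixation
    samples_needed: int = 6         # consecutive samples before we warp
    cursor: tuple = (0.0, 0.0)
    _window: list = field(default_factory=list)

    def feed(self, x, y):
        """Accumulate one gaze sample; warp the cursor on a stable fixation."""
        self._window.append((x, y))
        if len(self._window) > self.samples_needed:
            self._window.pop(0)
        # Centroid of the recent samples.
        cx = sum(p[0] for p in self._window) / len(self._window)
        cy = sum(p[1] for p in self._window) / len(self._window)
        stable = (len(self._window) == self.samples_needed and
                  all(math.hypot(px - cx, py - cy) <= self.fixation_radius
                      for px, py in self._window))
        # Warp only if the fixation is well away from the current cursor.
        if stable and math.hypot(cx - self.cursor[0],
                                 cy - self.cursor[1]) > self.fixation_radius:
            self.cursor = (cx, cy)
        return self.cursor
```

A real implementation would hand the warped position to the windowing system’s cursor API; the point of the sketch is simply that gaze handles the long jump while the mouse keeps the last few pixels.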
Gaze Needs Multi-Modal Input
Although I have seen demonstrations of gaze being used alone for input (as it might be by users who have accessibility issues with keyboards and mice), it seems to me that the real key to unlocking the power of gaze is to combine all the different forms of input for the PC. For example, it would be great just to look at a Google Map and press the zoom button on my ergonomic keyboard or turn the wheel on my mouse and zoom to the point I’m looking at. I would love to be able to look at a word on the screen I am using and hit a function key to check the spelling.
It occurs to me that we really do not make the most of our limbs. Combining gaze with a foot pedal would be much better for me, as a heavy-duty writer, as it would mean that I would not need to take my hands away from the keyboard. I want to drive my PC like I drive my car!
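The gaze-plus-key idea above can be sketched in a few lines. In this hypothetical example (my own illustration, not any vendor’s API), a key or foot pedal supplies the command and the gaze point supplies the target: the viewport shrinks around the point the user is looking at, so the spot under the eye stays put while everything zooms toward it.

```python
def zoom_at_gaze(viewport, gaze, factor):
    """Zoom a map viewport toward the current gaze point.

    viewport: (x0, y0, x1, y1) in world coordinates
    gaze:     (gx, gy) as fractions of the window, each in 0..1
    factor:   > 1 zooms in, < 1 zooms out
    """
    x0, y0, x1, y1 = viewport
    gx, gy = gaze
    # World coordinates of the point the user is looking at.
    wx = x0 + gx * (x1 - x0)
    wy = y0 + gy * (y1 - y0)
    # Shrink the viewport around that point so it stays under the gaze.
    new_w = (x1 - x0) / factor
    new_h = (y1 - y0) / factor
    return (wx - gx * new_w, wy - gy * new_h,
            wx + (1 - gx) * new_w, wy + (1 - gy) * new_h)
```

In use, a zoom keypress (or pedal press) would read the latest gaze sample and call this function – the hands never leave the keyboard, which is exactly the appeal.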
Of course, gaze recognition does not have to be built into a monitor, but it makes sense to do so, especially as monitors often have integrated USB connections that could send gaze data back to the host system. Some monitor makers are already looking at this – Tobii had a BenQ prototype monitor with an integrated tracker at CES (Fig. 1) and a notebook with an integrated sensor at CeBIT in Germany in March.
The company is also working with gaming peripheral manufacturer SteelSeries to modify PC games to exploit what gaze recognition can do. The first games should be in the market by the time you read this.
In summary, it seems to me that the market is ready for someone to significantly transform the user experience of desktop computing, just as Apple did for the experience of using a phone. It seems the least that could be done for the display we spend more time with than any other. •
Fig. 1: Tobii showed a BenQ monitor with eye control at CES last January.