Display Week 2015 Show Daily Highlights
Engineers, developers, investment bankers, analysts, and more all headed to the heart of Silicon Valley last June to visit Display Week 2015. According to show organizers, this year’s attendance was up more than 10% over last year’s. If you were lucky enough to be one of those attendees, one thing was for sure – there was a lot going on.
by Jenny Donelan
With a technical symposium, seminars, short courses, business and marketing conferences, and a three-day exhibition that included the always-popular Innovation Zone, Display Week offered more than one person could possibly see. Fortunately, you did not have to. Information Display magazine’s crack reporters – Tom Fiske, Steve Sechrist, Geoff Walker, and Ken Werner – were on the job, homing in on specific areas of technology and sharing their discoveries via blogs throughout the show. (They are also writing longer articles that will appear in our September/October post-show issue.)
We think the blogs are one of the best things Information Display offers all year, so we decided to share several of them in print – one from a technical seminar, one from a keynote address, one from the exhibition, and one from a market focus conference. If you want to read more, visit www.informationdisplay.org, click on “Blogs and Newsletters,” and select 2015 Display Week.
Technical Seminar: How Do You Know How Far Away Something Is? by Geoff Walker
The question of how you know how far away something is may not grab your attention in everyday life, but it is very important to engineers trying to create perceptually correct 3D displays. Dr. Kurt Akeley from Lytro, Inc., spent the first third of his Monday Seminar, “Stereo 3D, Light Fields, and Perception,” on this question.
The answer that is commonly assumed is “binocular depth cues,” including vergence (rotation of the eyes toward a fixation point), accommodation (adjustment of the focal length of the lens in the eye to match the fixation distance), retinal blur (the out-of-focus retinal images of objects closer to or further from the fixation point), and binocular parallax (the difference in the images sensed by each eye, which gives rise to stereopsis).
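As a rough illustration of why vergence, for example, is a strong cue mainly at close range, here is a quick calculation of the vergence angle at several fixation distances (the 63-mm interpupillary distance is a typical value we have assumed, not a figure from the seminar):

import math

IPD_M = 0.063  # typical adult interpupillary distance in meters (our assumption)

def vergence_deg(fixation_m):
    # Angle between the two lines of sight when both eyes fixate a point
    # straight ahead at the given distance (in meters).
    return math.degrees(2 * math.atan(IPD_M / (2 * fixation_m)))

for d in (0.3, 0.5, 1.0, 5.0):
    print(f"{d:.1f} m -> {vergence_deg(d):.1f} deg")  # about 12.0, 7.2, 3.6, 0.7 deg

The angle changes rapidly within arm’s length and hardly at all beyond a few meters, which is one reason vergence alone cannot account for depth perception at larger distances.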
Everybody automatically uses retinal blur as a depth cue without thinking about it. When you look at an object some distance away, the relative blurriness of objects closer and further away gives your vision system a “context” that helps it judge where the object is in space.
Image blur also affects perceived scale. In the left-hand photo in Fig. 1, the city looks normal. In the right-hand photo, however, the background and foreground have been blurred. Because that much blur normally occurs only when we are focused on something very close, the city suddenly looks like a miniature model.
While binocular depth cues are important, and depth perception can be achieved using binocular parallax even if all other depth cues are eliminated, there are many other depth cues. These include retinal image size, texture gradient, lighting, linear perspective, aerial perspective, motion parallax, monocular-movement parallax, and occlusion. You can read about them in the full blog entry, available at http://idmagazinedisplayweek2015.blogspot.com/2015/06/bygeoff-walker-how-do-you-know-how-far.html
Fig. 1: Image blur affects perceived scale. The left-hand photo shows a regular aerial shot of a city. In the right-hand photo, the background and foreground have been blurred, making the city look like a miniature model. Images from http://graphics.berkeley.edu/papers/Held-MBT-2008-05/VES2008slides_HeldEtal.pdf (Martin Banks et al., UC Berkeley).
Keynote: Intel Shows Its Vision of the Interactive Future by Tom Fiske
Intel Corp. CEO Brian Krzanich gave a compelling keynote address on the opening day of the exhibition. His thesis: the relentless pace of Moore’s Law will lead to richer and more engaging interactivity with our devices. RealSense, a collection of sensors and software developed by Intel, enables 3D scanning and sensing of the environment. Krzanich and his colleagues demonstrated face recognition and hand-gesture control, face-to-face interaction for online video gamers and remote meetings, and technology for more efficient warehouse management. He also demonstrated a 3D-scanning-to-3D-printing workflow, real-time collaborative remote work with virtual 3D objects, a floating “piano” interface, and augmented-reality interactive gaming overlaid on a real-world space. All of these are made possible by rich sensing of the ambient environment and of the user.
Krzanich delivered the message that we are on the cusp of something new, and I do not doubt it. New and compelling applications will certainly be found that take advantage of these and other emerging capabilities. Technical developments make these things possible, but we may also need to reveal more of ourselves to gain convenience or capability – such as when we let an application know our physical location in order to use mapping software. As with any technology enhancement, we have to decide for ourselves when the technology adds enough value to induce us to part with our dollars – and to give up a bit more of our privacy.
Exhibition: BOE Shows a 10K Display
Something special you might have seen on the exhibition floor last June was an impressive 10K display from BOE (Fig. 2). The 10240 × 4320 pixel display (in 21:9 format) is a “technical development” model with an 82-in. diagonal. Development engineer Xinxin Mu of BOE told us the panel is a one-off that demonstrates BOE’s cutting-edge high-resolution capabilities as the company looks downstream at the future of both display size and resolution. The panel uses a direct-LED backlight, which is the main reason this behemoth set consumes a whopping 1100 W of power. She also said pixel addressing is done from both top and bottom and that the panel uses a standard a-Si backplane.
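As a rough check of what those numbers imply (our arithmetic, based only on the quoted resolution and diagonal, not a BOE specification):

import math

# Pixel density of a 10240 x 4320 panel with an 82-in. diagonal, and the
# distance beyond which one pixel subtends less than the ~1-arcminute
# resolution limit of 20/20 vision.
ppi = math.hypot(10240, 4320) / 82.0      # about 136 pixels per inch
pitch_m = 0.0254 / ppi                    # about 0.19-mm pixel pitch
one_arcmin = math.radians(1 / 60)
print(f"{ppi:.0f} ppi; pixels unresolvable beyond ~{pitch_m / one_arcmin:.2f} m")

In other words, from roughly two-thirds of a meter away or more, a 20/20 eye should no longer be able to resolve individual pixels.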
Even at close viewing distance, individual pixels were beyond human visual acuity (at least for this human’s pair of eyes). Close inspection of the amazing video images (provided by an upscaled NHK source) revealed minute details such as a single bird sitting atop the Brandenburg Gate, discernible in a wide city-view shot of Berlin, and the rotating restaurant visible in a distant shot of the Berlin TV Tower. The images were simply stunning. (This display won a Best in Show award at Display Week.)
BOE PR rep Aly Langfeifei told us the display is meant to underscore just how far China-based fabs (and BOE in particular) have progressed in their technology development. We were also told that work is ongoing to modify the technology and prepare it for commercial release in the not-too-distant future.
Fig. 2: Xinxin Mu (left) and Aly Langfeifei stand next to the 10K display at Display Week.
Market Focus Conference: Android Wear and Google’s Wish List
On Thursday morning at Display Week, the “Special Wearables Address” in the Market Focus Conference was given by Sidney Chang, Head of Business Development for Android Wear, whose topic was “Android Wear Overview and Google’s Wish List.” (Chang replaced Fossil CTO Philip Thompson, whose scheduled talk was “Why Wearables with a Display Will Not Succeed with Today’s Display Companies.” I have been assured that the switch was due to a scheduling conflict and not because Thompson was planning on telling us, quite accurately, that display and computer companies cannot be trusted to design watches!)
Chang’s approach was not confrontational, but he had interesting things to say, some of them aimed directly at display makers. The first was that display makers should think very hard before “improving” traditional display parameters if those improvements come at the expense of battery life. Outdoor visibility is essential, for example, but it should be achieved in ways other than cranking up the luminance. The display must always be on, but it does not always have to be on in the same way. Chang described two modes: “interactive mode,” with full animation and full refresh rate, and “ambient mode,” with reduced color depth, brightness, and refresh rate for showing basic information, like the time, whenever the user glances at the watch. (Pixtronix and Sharp, are you listening?)
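The idea is simple enough to sketch. The following is purely our illustration; the names and numbers are placeholders, not Android Wear’s actual API or figures Chang gave:

from dataclasses import dataclass

@dataclass
class DisplayMode:
    refresh_hz: float       # how often the watch face is redrawn
    color_depth_bits: int   # per-pixel color depth
    brightness_pct: int     # drive level of the backlight or emitters

INTERACTIVE = DisplayMode(refresh_hz=60.0, color_depth_bits=24, brightness_pct=100)
AMBIENT = DisplayMode(refresh_hz=1 / 60, color_depth_bits=4, brightness_pct=20)

def choose_display_mode(wrist_raised):
    # Full fidelity only while the user is actually looking; the rest of
    # the time, basic information such as the time is shown dimly, in few
    # colors, and redrawn roughly once a minute.
    return INTERACTIVE if wrist_raised else AMBIENT

A display technology that can hold an image at low refresh rate and low color depth for very little power would obviously suit the ambient half of such a scheme.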
Chang specifically discouraged display makers from going to 300 ppi for watch displays. The extra pixel density is not needed for most watch apps, he said, and most watches cannot tolerate the hit on battery life.
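The arithmetic behind that advice is straightforward; the 240-ppi comparison point below is our own illustrative baseline, not a figure Chang cited:

# For a given screen size, pixel count (and with it much of the
# display-driving workload) scales with the square of pixel density.
ratio = (300 / 240) ** 2
print(f"A 300-ppi watch display has {ratio:.2f}x the pixels of a 240-ppi one")  # ~1.56x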
Chang showed the results of user studies done by Android Wear. Not surprisingly, users want the thinnest watch they can get. Many women feel that current watches, although arguably appropriate for men’s generally larger wrists, are too large for theirs. Average wrist circumferences are 17.5 cm for men and 15.0 cm for women; average wrist breadths are 5.8 cm for men and 5.2 cm for women. When a group of users (presumably both male and female) was asked whether they preferred a watch diameter of 1.0, 1.1, 1.2, or 1.3 in., there was a strong bimodal preference for 1.0 and 1.2 in. Participants over 40 years old preferred the smaller size, while participants under 40 preferred the larger. (Display makers, do not try to sell 1.5-in. displays to watchmakers!)
A general issue is how to meld the very different approaches of watchmakers and of the display and mobile-systems communities. Chang noted that watchmakers and watch users prefer choice and variety. In 2014, Fossil had 8000 watch SKUs under 15 different brands, with typical sales for each SKU in the thousands to tens of thousands of units. Since Android Wear released its API, the most popular apps have been different watch faces, with one app allowing the user to take a selfie of his or her clothing and then match the color of the watch face to the color of the clothes.
Forging compatibility between the watchmakers’ need for variety and the display- and system-makers’ need for volume will be an ongoing topic of conversation. •