We live in a highly tactile world. However, our daily interactions with technology, from tablets to automobiles, are becoming increasingly flat. The next wave of user interfaces will re-incorporate dynamic physical features through a novel deformable membrane technology that integrates into standard touch displays and surfaces.

by Nate Saal

DATA-ENTRY ERRORS, poor typing speeds, and lack of tactile feedback are some of the problems consumers encounter when using virtual touch-screen keyboards and touch-input devices. Interacting with a touch screen requires constant visual monitoring. While this may be merely inconvenient in an application such as a game, a touch interface that lacks physical buttons or another haptic-feedback scheme can be dangerous in an automotive environment, where touch screens are used for tasks such as changing the radio station or interacting with the navigation system.

The Tactus Tactile Layer panel was developed to provide a next-generation user interface with real physical buttons, guidelines, or shapes that rise from the surface of a touch screen on demand and can be used without visual confirmation. The Tactile Layer component is a completely flat, transparent, dynamic layer that sits on top of the touch sensor and display. The thin layer deforms, and buttons or shapes of a specific height, size, and firmness emerge from the surface when triggered by a software API, a proximity sensor, or another event. Users can feel, press, and interact with these simulated buttons just as they would with the buttons on a physical keyboard (Fig. 1). When the buttons are no longer needed, they recede into the surface and become invisible.


Fig. 1: The Tactus Tactile Layer can deform so that keyboard buttons or other shapes emerge from the surface.

This new interface from Tactus Technology was launched at Display Week 2012 and was awarded SID's Innovation-Zone trophy for Best Prototype (and the Grand Prize in the Eureka Park Challenge from CEA shortly afterward). The interface allows different pre-configured sets of buttons, such as a QWERTY keyboard, to be raised (emerging out of the touch screen) or lowered (receding back into the touch screen) based on application needs. Not limited to keyboards and on-screen buttons, the tactile technology can also be integrated off-screen, such as on the bezel or backside of a device.

How the Tactile Layer Surface Works

The tactile panel is easy to integrate; it simply replaces the glass or plastic cover layer that sits on top of a touch sensor and display. It is a thin, flat, smooth, and transparent cover layer, about 0.75–1 mm thick, with several special properties.

The top-most layer of this multi-layered stack consists of an optically clear polymer. A number of micro-holes connect the top layers of the panel to a series of micro-channels that run through the underlying substrate (Fig. 2). The micro-channels are filled with a fluid whose optical index of refraction matches that of the surrounding material, making the panel fully and evenly transparent when light from the display passes through (Fig. 3).


Fig. 2: A cross section of the Tactile Layer panel shows the microfluidic channels (light blue) and the embedded microstructure (dark gray) with micro-holes, which in this example are 200 μm wide.


Fig. 3: The micro-channels contain an index-matched fluid that hides the embedded structure.
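The optical effect of the index matching can be illustrated with the Fresnel reflectance at normal incidence, R = ((n1 - n2)/(n1 + n2))^2, which falls to zero when the fluid and channel-wall indices are equal. The short Python sketch below uses assumed example indices; Tactus has not published the actual values.

def fresnel_reflectance(n1, n2):
    # Fraction of light reflected at a boundary, at normal incidence.
    return ((n1 - n2) / (n1 + n2)) ** 2

n_polymer = 1.49                      # assumed index of the channel walls
for n_fluid in (1.49, 1.40, 1.33):    # matched, slightly off, water-like
    r = fresnel_reflectance(n_polymer, n_fluid)
    print(f"n_fluid = {n_fluid:.2f}: reflectance = {r:.2e}")

With a matched fluid the reflectance is zero, so the channel pattern disappears over the display; even the water-like mismatch above reflects roughly 0.3% of the light at each channel wall, enough to make the channels visible.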

Increasing the fluid pressure causes the fluid to push up through the holes and against the top polymer layer, making it expand in specific locations (Fig. 4). This enables an array of physical and completely transparent buttons to rise out of the surface. A small internal tactile controller, which interfaces with the processor of the touch-screen device, controls the rise and fall of the buttons.


Fig. 4: Increasing the fluid pressure in the panel causes the top layer to expand, creating physical buttons.

The tactile controller allows a proximity sensor or a software application to control the state of the buttons (Fig. 5). For example, the buttons can be triggered to rise whenever the software calls for the virtual QWERTY keyboard.
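Tactus has not published its software interface, but the control flow described above can be sketched as follows. This is a minimal Python sketch, not the actual API; the class and method names are hypothetical, chosen only to illustrate how an application might tie the physical button set to the virtual keyboard.

from enum import Enum

class ButtonSet(Enum):
    QWERTY = "qwerty"   # hypothetical pre-configured layouts
    MEDIA = "media"

class TactileController:
    """Stand-in for the embedded controller that drives fluid in and out."""
    def __init__(self):
        self.active_set = None

    def raise_buttons(self, button_set):
        # In hardware: open valves and raise the fluid pressure so the
        # button array for this layout rises (in under 1 sec).
        self.active_set = button_set

    def lower_buttons(self):
        # In hardware: reduce the fluid pressure; the buttons recede.
        self.active_set = None

controller = TactileController()

def on_virtual_keyboard_shown():    # hook called by the OS or application
    controller.raise_buttons(ButtonSet.QWERTY)

def on_virtual_keyboard_hidden():
    controller.lower_buttons()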


Fig. 5: The Tactile Layer panel at top integrates into a typical touch display stack, replacing the flat, static cover layer with a dynamic, physical surface. The Tactile Controller (bottom) drives fluid in and out of the panel.

It takes less than 1 sec for the buttons to rise or recede. Once formed, the buttons are stable, and users can rest their fingers on them or type on them just as on a regular keyboard. When the buttons are not needed, the controller triggers a reduction in fluid pressure; the buttons recede back into the Tactile Layer panel and the surface becomes smooth and flat again. The panel size and the size, shape, and firmness of the buttons are fully customizable. Buttons can take almost any shape: circles, rectangles, ovals, squares, long thin lines, even rings or donuts.

The Tactile Controller is the main fluid drive mechanism. It comprises an actuator, valving, and a fluid reservoir. A typical tablet form-factor device with a QWERTY keyboard configuration requires less than 2 ml of fluid. The fluid is a proprietary oil that is odorless, colorless, non-toxic, and, critically, non-conductive (otherwise, it would negatively impact the touch-sensor function). In the event of a catastrophic failure, the fluid will not harm any other components in the device. Additionally, since the tactile panel is independent of the touch sensor and display, even if the panel were to fail, the core device function would remain intact, similar to devices today that still operate with a shattered screen.

The buttons' height (from high to low) and feel (from soft to rigid) can be controlled, allowing consumers to choose and set their personal preference. It is possible to create almost any type of button configuration or layout on a panel. Multiple button sets can also be configured on a single panel, enabling different groups of buttons to be raised at different times, depending on the interface needs of the user.
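As a rough illustration of how such a layout might be described in software, the Python sketch below encodes the customization parameters listed above (shape, position, size, height, and firmness), with multiple button sets defined for a single panel. All field names, units, and values are assumptions for illustration only.

from dataclasses import dataclass, field

@dataclass
class Button:
    shape: str         # "circle", "rectangle", "ring", ...
    x_mm: float        # center position on the panel
    y_mm: float
    width_mm: float
    height_um: float   # how far the button rises above the surface
    firmness: float    # 0.0 (soft) to 1.0 (rigid), user-adjustable

@dataclass
class ButtonLayout:
    name: str
    buttons: list = field(default_factory=list)

# Two layouts on one panel, raised at different times as the UI changes.
qwerty = ButtonLayout("qwerty", [
    Button("rectangle", x_mm=10 + 8 * i, y_mm=40,
           width_mm=7, height_um=500, firmness=0.7)
    for i in range(10)   # one row of keys, for illustration
])
media = ButtonLayout("media", [
    Button("circle", x_mm=30, y_mm=20, width_mm=12,
           height_um=600, firmness=0.5),   # a play/pause button
])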

Input Comparisons

For typing applications, alternative input methods such as finger swiping, predictive text, auto-correction, and voice input have all been devised to improve the accuracy and efficiency of text entry. Haptic technology provides feedback that simulates the experience of a physical button. Much like the clicking sound that occurs when a button is pressed, haptics can help users recognize when an input has been made. Vibration-based haptics, for instance, uses vibratory feedback to mimic the feeling of resistance when pushing a virtual button. Current haptic technologies, however, fall short in helping users locate their fingers properly on a screen or keyboard. Touch typists need the "home row" for orientation, and thumb typists need to be able to glide across keys; without these interface capabilities, mistakes will remain frequent.

Consider how most input systems function: users locate the target, touch it, and thereby trigger it. What users need is the ability to orient their fingers by touching and feeling the screen, and only then input data with an intentional push of a finger. Many current capacitive touch-screen devices have an inherent problem: as soon as you touch the screen, input is triggered. Even if input is triggered on liftoff, users still cannot rest their hands on the screen the way they would when touch typing on a physical keyboard.

What if you could touch a touch screen without triggering input? At first glance, this seems to run counter to how touch screens function today. The answer lies in creating a new dimension of touch – literally by enabling a touch screen to deform in the Z-axis, moving the finger further away from the touch panel.

Capacitive touch screens work by measuring a change in capacitance as a finger moves closer to the touch surface. Tactus takes advantage of this mechanism so that users can rest their fingers on the buttons and input data only when the buttons are pressed down. When the Tactus buttons are flat (recessed), a finger touching the screen lies close to the underlying touch sensor, producing a strong change in local capacitance. When the buttons are raised, the distance between the top of the buttons and the touch sensor increases, so a finger resting on a raised button produces a relatively small change in capacitance. When a finger presses a Tactus button down, the capacitance changes as the finger comes closer to the touch sensor. The touch system can use this distance-dependent difference in capacitance to clearly distinguish a finger resting on the buttons from one pressing them.
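A touch controller could exploit this difference with a simple threshold test on the normalized capacitance signal. The sketch below is a minimal illustration; the threshold values are assumptions, and a real controller would calibrate them for the panel and the current button state.

REST_THRESHOLD = 0.30    # assumed: finger resting on a raised button
PRESS_THRESHOLD = 0.75   # assumed: button pressed down, finger near sensor

def classify_touch(signal):
    # Classify a normalized capacitance change (0.0 = no finger present).
    if signal >= PRESS_THRESHOLD:
        return "press"   # register input
    if signal >= REST_THRESHOLD:
        return "rest"    # finger orienting on the button; ignore
    return "none"

for s in (0.05, 0.45, 0.90):
    print(f"signal {s:.2f} -> {classify_touch(s)}")   # none, rest, press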

Power-Consumption Issues

The power consumed by the tactile controller to actuate the panel is exceedingly low. The tactile system runs off 3.3 V with a peak current of 300 mA over the activation period of about 1 sec. In a frequent-usage scenario of 100 activations per day, power consumption would be less than 1% of a typical 1500-mAh battery. Once the buttons are raised, they remain up for as long as they are needed – be it a few seconds or several hours – without any additional power consumption.

This is possible because the pressure used to raise the buttons remains constant even when the power to the system is shut off. When the buttons are pushed down, the inherent internal pressure causes them to automatically pop back up each time without additional power consumption. In contrast, haptic vibration-based solutions consume battery power with each button push, so the more a user types, the more power is consumed.
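The quoted power figures can be checked with simple arithmetic, conservatively assuming the 300-mA peak is drawn for the full 1-sec activation:

VOLTAGE_V = 3.3
PEAK_CURRENT_MA = 300.0
ACTIVATION_S = 1.0
ACTIVATIONS_PER_DAY = 100
BATTERY_MAH = 1500.0

energy_j = VOLTAGE_V * (PEAK_CURRENT_MA / 1000.0) * ACTIVATION_S   # ~1 J per activation
per_activation_mah = PEAK_CURRENT_MA * ACTIVATION_S / 3600.0       # ~0.083 mAh
daily_mah = per_activation_mah * ACTIVATIONS_PER_DAY               # ~8.3 mAh
print(f"{100 * daily_mah / BATTERY_MAH:.2f}% of battery per day")  # ~0.56%

Roughly 8.3 mAh per day, about 0.56% of a 1500-mAh battery, is consistent with the less-than-1% figure.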

Market Opportunities

Tactus predicts that its innovations will have a significant impact on the use of microfluidics in display applications and touch screens. In its first public demonstrations, Tactus showcased its technology on a 7-in. prototype Google Android tablet. The demonstration system was the result of a new partnership between Tactus and Touch Revolution (Redwood City, CA), a unit of TPK Holding Co., Ltd. – the largest-volume glass projected-capacitive multi-touch-screen manufacturer in the world.

Tactus is currently working with industry partners such as touch-panel, display, and touch-controller companies to provide complete solutions that can easily be integrated by device OEMs. Smartphones and tablets are obvious candidates for tactile interfaces. Some individuals have resisted the move from a smartphone with a physical keyboard to a touch-screen device, and many who purchase a tablet also buy a physical keyboard accessory for easier typing. In just 3 years, the nascent tablet market grew to about 60 million units in 2011. Tactus can help transform these devices, typically used for content consumption, into devices for content creation.

Another important aspect of this technology is its ability to reach segments of the population that cannot currently operate touch screens. The blind and visually impaired, the elderly, and those lacking fine motor skills because of conditions such as arthritis or Parkinson's disease either struggle with or are completely unable to use 'buttonless' touch-screen-based systems and devices. Providing a tactile interface on touch-screen devices makes them accessible to these users.

The medical and public-safety industries are other likely avenues for the technology because the buttons allow for easier typing while also providing a smooth surface that can be easily sanitized. When the buttons are retracted, the unit can be easily wiped clean of the germs that regular keyboards trap and may even spread among users. Contamination of mobile and computing devices has been documented by scientific research and is becoming a significant concern for hospitals and medical facilities. This technology can also make a wide variety of medical devices significantly more portable. For example, ultrasound systems – even "portable" ones – have a separate screen and keyboard, making them heavy and bulky.

In summary, the evolution of the haptic interface into the tactile interface is upon us. The ability to use portable devices for content creation via a "contextual" keyboard that appears only when needed, or video controls on a remote that disappear when no longer needed, is the opportunity presented by microfluidics applied to touch-screen technology. As the technology for dynamic surfaces advances, even more opportunities for user interfaces and tactile experiences will be developed that will far surpass the ideas presented in this article. It's an exciting opportunity for both OEMs and end-users. •


Nate Saal is the VP of Business Development for Tactus Technology. He is responsible for OEM relationships and partnerships. He is a serial entrepreneur and over the last 15+ years has been involved in software, middleware, and hardware start-ups. He can be reached at nate@tactustechnology.com.