Touch-Panel Market Dynamics and Trends

Despite market maturity in consumer applications, the touch industry is experiencing new, dynamic competition that will lead to enhanced user experiences in 2015.

by Calvin Hsieh

BENEFITING FROM new tablet PC applications and a technology shift from resistive to projected-capacitive (PCAP) with higher average selling prices (ASPs), the entire touch-panel market enjoyed remarkable shipment and revenue growth from 2010 to 2013.  Revenue growth was particularly notable, increasing by almost 60% Y/Y in both 2010 and 2011.  However, after 2013, fierce competition began driving a revenue decline.  In 2014, shipments were up 13% Y/Y, but revenues declined 10% Y/Y due to intense pricing pressure (Table 1).

The competition is not over yet.  The industry will see further supply-chain consolidation, and the technology evolution continues.  We expect to see emerging trends such as embedded touch from panel makers, enhancement of user touch-screen experiences, and new applications.

Table 1:  The overall touch-module market shipments and revenues forecast shows a slowing of growth through 2018 and declines or modest increases in revenues.  Source:  IHS Quarterly Touch Panel Market Analysis Report Q1’15.
Shipments (000s)
              2012         2013         2014         2015         2016         2017         2018
Grand Total   1,290,641    1,511,751    1,701,456    1,922,022    2,114,063    2,279,280    2,401,443
Y/Y Growth    17%          17%          13%          13%          10%          8%           5%

Revenues (US$000s)
              2012         2013         2014         2015         2016         2017         2018
Grand Total   $23,224,147  $28,557,077  $25,605,443  $24,555,622  $24,279,585  $24,489,829  $24,817,533
Y/Y Growth    41%          23%          -10%         -4%          -1%          1%           1%


Touch-Panel Market Review

From 2010 to 2013, two critical factors helped increase touch-panel demand: the iPad, which created a new, larger-sized application in 2010, and the Android 4.0 operating system, which enabled non-Apple brands of smartphones and tablet PCs to become more popular.  In the meantime, smartphone penetration in emerging markets was deepening, especially in China and the Asia-Pacific (APAC) region.  Mobile phones have accounted for more than a 76% share of all touch-module shipments.

Tablet PCs represent a more than 14% share of the market by unit volume.  Compared to mobile phones, the larger size of tablets gives them the potential to generate higher revenues.  However, tablet PCs suffered a shipment decline in 2014 (239 million units, down 4% Y/Y) because of product maturity and weak replacement demand.  Many experts expected Windows 8 to create a new growth dynamic for touch in 2013, bringing it to a wider range of devices including laptops, but it failed to do so; touch penetration of notebook applications reached only 11%, for example.  Consequently, mobile phones have been the major application driving shipment growth in recent months.

As for revenues, 2014 was a turning point.  After touch makers experienced high growth for years, revenues declined 10% Y/Y in 2014.  Add-on-type PCAP panels accounted for 65% of mobile-phone applications (embedded PCAP panels are not included in this figure), while resistive panels' share dropped from 15.5% in 2012 to 0.8% in 2014.  The positive influence of larger-sized applications and technology shifts disappeared in 2014.  Gross margins for most makers fell to 7–8% in 2014, down from an average of more than 10% in 2013.  Taiwan-based TPK, one of the top touch-solution providers, saw its margin decline from the 14–18% range it posted during 2012.

In the near future, we are unlikely to see a large-sized touch-enabled application with remarkable volume driving revenue growth.  Some new applications, such as automotive displays and smartwatches, may help, although demand for these will not be as huge as for smartphones.  With the possible exception of the Apple Watch (selling briskly at this writing), touch-enabled smartwatch applications remain uncertain because the long-range behavior of end users toward them is hard to predict.  Automotive applications, however, look more promising.

During the first half of 2014, Apple and Google officially announced their automotive plans – CarPlay and Android Auto, respectively.  Each is designed to bring smartphone OS platforms, apps, and services into automobiles.  Rather than replacing the existing embedded automotive telematics systems, the designers simply let automobiles leverage smartphone ecosystems, giving users access to whatever services they already have on their smartphones.  A Lightning or micro-USB port is used for connecting.  When the smartphone is docked, it can use the interior control panel for display output as well.

The availability of this functionality should encourage auto makers to design better displays for cars.  The control panel in the central console could be an ideal position (dashboard cluster displays are usually directly in front of the wheel).  Without the smartphone docked, the control-panel display still shows information about interior vehicle settings such as air conditioning, entertainment, and navigation.  After docking, the same display can present services from the smartphone.  Users can easily switch the display source back and forth.

There are more than 28 auto makers now supporting CarPlay and Android Auto.  This means auto makers understand that the smartphone industry can be a supporter instead of a competitor.  New user interfaces such as touch screens will continue to be introduced and generally adopted, replacing physical buttons.  Compared with other user interfaces such as speech recognition and gesture, touch-screen technology is currently more mature and efficient to use.

Embedded Technologies from Panel Makers

Mobile phones are still the most critical application for touch, but the component supply chain suffers from fierce competition and serious ASP challenges.  Besides touch modules, mobile-phone-display ASPs also have dramatically declined in recent years.  Panel makers are more interested in embedded varieties that will increase their ASPs.  Since Apple’s in-cell touch was successfully adopted for the iPhone 5 in 2012, panel makers have been encouraged.  In 2014, embedded technology (including in-cell TFT-LCD, on-cell TFT-LCD, and on-cell AMOLED) grabbed a more than 35% shipment share of mobile-phone applications (Table 2).

Table 2:  Touch-module shipment shares for mobile-phone applications show dramatic increases in PCAP shipments through at least 2019.  Source: IHS Quarterly Touch Panel Market Analysis Report Q1’15.
 Technology / Share of Unit Shipments (000s) by Year
2012 2013 2014 2015 2016 2017 2018 2019
 In-cell TFT-LCD 7.5% 12.5% 15.9% 16.3% 16.8% 17.4% 18.0% 18.7%
 On-cell AMOLED 12.0% 16.2% 14.8% 15.2% 15.3% 15.3% 15.4% 15.4%
 On-cell TFT-LCD 0.0% 0.4% 5.2% 6.9% 8.4% 9.4% 10.0% 10.4%
 Projected Capacitive (add-on types) 65.0% 67.2% 63.4% 61.0% 59.2% 57.8% 56.6% 55.5%
 Resistive 15.5% 3.6% 0.8% 0.5% 0.3% 0.1% 0.0% 0.0%
 Total 100.0% 100.0% 100.0% 100.0% 100.0% 100.0% 100.0% 100.0%


Embedded touch technology has been around for years, but the only major adopting brands before 2013 were Apple and Samsung.  Apple has its own in-cell patent (segmented Vcom; USPTO 8,243,027) and Samsung Display Co. has adopted on-cell AMOLED.  Although panel makers have proposed in-cell technologies and structures since around 2007, none of them reached mass production.  Almost all in-cell sensor structures now available and in production are related to Apple's patent and its core concept: segmented Vcom.

On-cell AMOLED (by RGB stripe) has recently benefited from technology that incorporates encapsulation glass without a color filter.  Consequently, the sensor-patterning process has a lower impact on panel fabrication.  Because of Samsung's capacity allocation and product positioning, on-cell AMOLED is generally used for the company's premium models.  In 2015, Samsung plans to adopt more on-cell AMOLED displays for the mid-range, due to declining handset market share and sufficient Gen 5.5 capacity.  On-cell AMOLED can probably be the differentiation point for entry- and middle-level products.  On the other hand, on-cell LCD achieved its initial success during the 2013–2014 time frame, as the middle- and entry-level smartphone markets boomed in China.

On-cell-touch TFT-LCD sensor patterning places the touch sensors (Tx, Rx electrodes, and traces) on the top side of the color-filter glass but beneath the polarizer.  Earlier, in 2009, panel makers adopted single-side ITO (SITO; both X and Y electrodes are located on the same substrate side but with a bridge or jumper for insulation) patterning for the smartphone market, without success.  Since 2013, single-layer patterning for on-cell TFT-LCD has brought new opportunities to panel makers.  The market share reached 5.2% of all touch-module shipments for mobile-phone applications in 2014, up from less than 0.5% in 2013.  During 2014, more panel makers in Taiwan and China developed on-cell TFT-LCD for entry- and mid-level smartphones.  Additionally, 7–8-in. tablet PCs and >10-in. notebook-PC applications have become the focus of these panel makers.  It is still too early to judge the outcome of this war of touch-sensor integration (embedded technology).  However, panel makers have advantages such as business scale and capital, so they pose a real threat to touch-module makers.

However, not all panel makers prefer on-cell LCD.  JDI is famous for its hybrid in-cell (Pixel Eyes) and LG Display for its "advanced in-cell touch" (AIT) based on self-sensing principles.  We can expect new competition in the embedded sector during 2015.  On-cell development and adoption came earlier and is moving faster than in-cell (Apple excluded), but it is still not possible to say that on-cell will be the final winner among panel makers.

There are no clear criteria defining the difference between in-cell and on-cell.  Conventionally, a sensor patterned on the upper glass (encapsulation or color-filter glass) was called "on-cell," and one on the TFT backplane glass, "in-cell."  But some makers embedded sensors in the black matrix (color-filter glass) and called it "in-cell."  Others placed the X and Y electrodes on the upper and backplane glass, respectively, to create "hybrid in-cell."  Now, in 2015, the new definition of "on-cell" seems to be that the sensor is patterned on the top side (facing users) of the encapsulation or color-filter glass.  A structure with at least one electrode (usually Tx) inside the open cell or display component (that is, between the bottom and upper glass) is categorized as "in-cell."  This new definition fits what most makers are now developing.

Synaptics proposes "hybrid in-cell (HIC)" and "full in-cell (FIC)" for its in-cell structure variations.  The former (HIC) has the X-Y electrodes (Tx and Rx) on the backplane and the color filter, respectively, as used for JDI's Pixel Eyes.  In FIC technology, the X-Y electrodes are only on the TFT backplane, as in LG Display's AIT.  Besides JDI's and LG's, Apple's patent is the most popular in-cell technology and structure.


Apple's in-cell patent and technology use a key principle: segmented Vcom.  Before the patent, panel makers designed specific sensor parts into their displays, taking advantage of capacitive (charge-sensing), resistive (voltage-sensing), or optical (photo-sensing) principles.  Almost none of these were ever produced (except in a notebook from Sharp in 2009).  Segmented Vcom makes use of the existing Vcom layer in the display instead of an additional sensor part.  Depending on the mode (IPS, FFS, VA, or TN), Vcom is not specifically patterned or segmented for display purposes, but it can be segmented for the touch-sensor electrode layout.

Because it serves both touch sensing and display driving, Vcom requires time sharing: while it is being used for display driving, it cannot support touch sensing concurrently.  Consequently, display and touch compete with each other for precious time resources.  A 60-Hz display has a frame period of about 16.7 msec; if display driving consumes 10 msec, only the remainder is available for touch sensing.  The situation becomes even more serious as display resolution increases: higher resolution requires more time to deliver display signals, leaving less time for touch sensing and degrading sensitivity.  A potential solution is to use oxide to replace LTPS as the TFT technology, a change that can shorten the time needed for display addressing because of its electron mobility and lower leakage current.
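The frame-time arithmetic above can be sketched directly.  This is an illustrative calculation using the figures from the text (60 Hz, 10 msec for display driving), not vendor firmware:

```python
# Illustrative time-budget calculation for a segmented-Vcom in-cell display.
# The numbers mirror those in the text; this is a sketch, not a real driver.

def touch_budget_ms(refresh_hz: float, display_drive_ms: float) -> float:
    """Milliseconds per frame left over for touch sensing."""
    frame_ms = 1000.0 / refresh_hz       # 60 Hz -> ~16.7-ms frame period
    return frame_ms - display_drive_ms   # remainder after display addressing

# The 60-Hz case from the text: display driving consumes 10 ms of ~16.7 ms.
print(round(touch_budget_ms(60, 10.0), 1))  # prints 6.7
```

The same function makes the resolution problem concrete: if higher resolution pushes display addressing from 10 to, say, 13 msec, the touch budget shrinks by nearly half.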

Pixel Eyes and AIT follow the same segmented principle as Apple’s Vcom.  The Vcom layer of IPS and FFS modes is on the TFT backplane glass, but the Vcom layer for the VA mode is on the color-filter glass (bottom side).

For Pixel Eyes' hybrid in-cell, Rx is on the top side of the color-filter glass.  When Pixel Eyes is applied to IPS and FFS modes, it uses the Vcom on the TFT backplane, so the entire structure is quite similar to the GG (two pieces of glass) type.  For the VA mode, the structure of Pixel Eyes looks like the DITO type (double-sided ITO; the X and Y electrodes are located on the top and bottom sides of the substrate).  Regarding sensor patterning, Pixel Eyes requires Rx on the top side of the color-filter glass (similar to on-cell touch), while Tx is formed by the segmented Vcom.

Consequently, the display production cost of Pixel Eyes should be higher than that of LG's AIT, which has only a single-layer sensor on the TFT backplane.  However, Apple's in-cell production cost could be higher still.  Although Apple's Tx and Rx are on the same substrate (the Vcom on the TFT backplane of the FFS mode), SITO-like patterning and extremely fine segmentation lower its yield rate.

During 2015, we will see in-cell with different structures (Apple's, Pixel Eyes, and AIT) and on-cell (AMOLED and TFT-LCD) competing with add-on types.  However, due to the issues and limitations described above, in-cell is limited to smartphone sizes.  Bigger displays usually have higher resolution and more channels, so time sharing and routing become more challenging.  Although on-cell can be easier, a single layer is not sensitive enough for 8 in. and above.  On-cell based on SITO with at least four photomasks can be applied to notebook-PC sizes.  But considering the falling cost of the add-on type and its ample supply chain, larger sizes should remain the threshold for embedded types.  Consequently, we can expect embedded-technology makers to focus on the smartphone market for the next 1–2 years.  Also, depending on in-cell-touch maturity, panel makers will probably take on-cell as a transitional technology and then adopt in-cell in the future.

Tap-Sensing and Other Pressure Technologies Enhance Touch User Experience

Despite the competition between panel makers and touch-module makers in a time of slowing market demand, the mature user interface of touch screens is not likely to bring many new surprises to end users compared with what has already occurred over the past several years.  There is, however, evolution.  In April, Apple released the Apple Watch and a new MacBook that use its Force Touch and Taptic Engine technology.

Apple’s new trackpad on the MacBook has a glass-based capacitive-touch sensor to detect the positions touched.  Force sensors (strain gauge, likely) are located at the four corners beneath to detect tapping, and the Taptic Engine (an electromagnetic mechanism) delivers haptic feedback.  The Apple Watch incorporates haptic feedback by using a linear resonant actuator; its Force Touch is a slim sensor around the display.  At press time, there were reports that Apple was developing new iPhones that use Force Touch technology.
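As an illustration of how four corner force sensors might be combined: summing the four readings gives the total press force, and weighting each corner's share by its position estimates where the press landed.  The function, names, and geometry below are assumptions for illustration, not Apple's actual design:

```python
# Hypothetical sketch: combining four corner force readings (top-left,
# top-right, bottom-left, bottom-right) into a total force and a press
# location on a width-by-height surface. Not Apple's actual algorithm.

def combine_corner_forces(f_tl, f_tr, f_bl, f_br, width, height):
    """Return (total force, (x, y) press centroid) from four corner sensors."""
    total = f_tl + f_tr + f_bl + f_br
    if total == 0:
        return 0.0, None                  # nothing pressed
    x = (f_tr + f_br) / total * width     # right-hand corners pull x rightward
    y = (f_bl + f_br) / total * height    # bottom corners pull y downward
    return total, (x, y)

# A centered press loads all four corners equally:
print(combine_corner_forces(1, 1, 1, 1, 100, 60))  # (4, (50.0, 30.0))
```

In a real product the force centroid would only supplement the capacitive sensor, which already reports touch positions with far finer resolution.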

Tap sensing is designed to enhance the existing touch user experience; it is not an additional input tool.  An extra-firm or "deep" press can issue a different type of command than a regular one, for example.  Haptic feedback simulates a "click" when users press the touch surface.  The technology also measures pressure sensitivity (for drawing and similar applications).  The idea is that combining specific tap gestures with ordinary touch makes the interaction handier and more useful without the need for other tools.  For example, while watching a movie, a user can speed up playback simply by tapping the user interface (such as a MacBook trackpad) without scrolling the GUI's controller bar.
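The distinction between a regular tap and a "deep" press reduces to a threshold test on the measured force.  The sketch below is a minimal illustration; the threshold value is a hypothetical placeholder, since Apple does not publish its force cutoffs:

```python
# Minimal sketch of the tap-vs-deep-press distinction.
# The threshold is an assumed placeholder, not Apple's specification.

FORCE_THRESHOLD_N = 2.0  # newtons (hypothetical value)

def classify_press(force_n: float) -> str:
    """Map a measured press force to a gesture class."""
    return "deep_press" if force_n >= FORCE_THRESHOLD_N else "tap"

print(classify_press(0.8))  # prints tap
print(classify_press(3.2))  # prints deep_press
```

An OS could then route each class to a different command, which is exactly how a firmer press can mean something different from a regular touch.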

Previously, Apple had pursued patents related to stylus use.  However, Apple seems to have had a different goal than pressure sensing; Steve Jobs famously despised styluses, and despite pursuing patents, Apple will not necessarily adopt them for its products.  Tap sensing has replaced pressure sensing as the new attraction.  Compared with pressure sensing, tap sensing does not emphasize sophisticated sensing levels and has limited precision for pressure detection.  But Apple never simply introduces a hardware feature; it usually considers how to deliver a better user experience.  And it apparently considers Force Touch a milestone that will enhance the existing user experience of touch screens.

Conventionally, "pressure sensing" describes a user interface able to detect the pressure applied.  PCAP touch sensors are fast and accurate at detecting touch positions but less adept at detecting pressure.  The passive styluses used with resistive and PCAP screens have no pressure sensing.

An active stylus with a specific mechanism can fulfill the requirement of pressure sensing.  For example, Wacom EMR (electromagnetic resonance) uses a tiny capacitor unit embedded in the stylus to deliver the pressure level along with the touch position.  N-trig's solution converts optical hindrance (an optical shutter inside the stylus) into a pressure level.  Both technologies can reach high sensing resolution: Wacom's solution, used for the Galaxy Note 4, can reach 2,048 levels, and N-trig's, used for tablet and notebook PCs, detects 256 to 1,024 levels.
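As a sketch of what "sensing levels" means in practice: a controller quantizes a raw analog pressure reading into one of N discrete levels (256 to 2,048 in the examples above).  The function and value ranges below are illustrative assumptions, not any vendor's implementation:

```python
# Sketch of quantizing a raw analog pressure reading into the discrete
# levels quoted in the text (256 to 2,048). Ranges are illustrative.

def quantize_pressure(raw: float, raw_max: float, levels: int = 2048) -> int:
    """Clamp raw to [0, raw_max] and map it to a level in [0, levels - 1]."""
    raw = min(max(raw, 0.0), raw_max)
    return min(int(raw / raw_max * levels), levels - 1)

print(quantize_pressure(0.5, 1.0, 256))   # prints 128
print(quantize_pressure(1.0, 1.0))        # prints 2047
```

More levels mean finer gradations of line weight for drawing; Force Touch, by contrast, needs only a few coarse levels, which is why the text calls its precision limited.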

Apple seems to have no interest in making its iPhone or iPad like the Samsung Galaxy Note series.  Table 3 compares the features of Force Touch and active styluses.

The combination of touch screen, tap sensing, and haptic feedback is a new milestone for the existing touch-based user interface.  Capacitive touch is still critical to detect position, but other technology such as FTIR can improve the tap-sensing experience.  Furthermore, the haptic module is adopted to make tap sensing more intuitive, with interactive feedback.  Force Touch with haptic feedback makes use of existing mature technologies and does not intend to offer users the ability of handwriting and drawing.  Instead, integration of the hardware and software is designed to help end users extend the user experience of touch.  It will be interesting to see how end users and the market react to these features in the face of ongoing competition and consolidation.  •

Table 3:  A comparison of Force Touch and active stylus features includes sensing level, purpose, and cost.   Source: IHS Quarterly Touch Panel Market Analysis Report Q1’15.
                     Apple's Force Touch                     Active Stylus (non-projected-capacitive based)
Sensing Level        A few levels with limited precision     Able to reach 256–2,048 levels
Sensor               Strain gauge likely (MacBook)           Active stylus or EMR
Sensing Principle    Mechanical to electronic                Optical shutter or EMR
Input Device         Finger or any object                    Specific stylus
Touch Screen         Separate part                           Separate (EMR) or combined (N-trig)
Purpose              Enhancement of touch UX                 Handwriting and drawing
Uses for Apps        Able to be defined for apps             Limited to writing, drawing, or notes
Cost Concern         < $5 (force sensor only), affordable    ~$10–15 (sensor or FPC board not included)

Calvin Hsieh is a Research Director at IHS.  He can be reached at