How to Radically Impact the Mobile Experience with Haptics

Most mobile applications are developed primarily around the visual experience. In some cases, the developer may have integrated audio as well. As end users, we have our senses of sight and hearing well attended to in an effort to bridge the gap between the real world and the virtual world. Our sense of touch, however, is notoriously left behind, even though it is perhaps the most fundamental sense and the first one we develop in the womb.
The act of recreating the experience of touch for the end user is referred to as haptic feedback, or simply haptics. For a long time, mobile phones have integrated a vibration motor to provide a “quiet” mode of operation for ringtones and notifications. However, with the advent of smartphones and the pervasiveness of touchscreens, vibration as a means of providing user feedback started to evolve. Google’s Android OS was one of the first platforms to attach haptics to key user interactions and even provide an API for third-party developers. With the introduction of the Taptic Engine in the iPhone 6s, an actuator capable of producing sharp, strong vibration effects, haptics made a remarkable entrance into iOS, and Apple has since complemented several iOS use cases with great haptics. Although it is less flexible than Android’s, the iOS development kit also enables developers to integrate haptics into their apps.
The New Waveform Vibration Effect on Android O
The Vibrator API has always been an integral part of the Android development kit. However, up until Android 8.0 Oreo (API level 26), there were only two types of commands that could be sent to the vibration motor: a single full-strength pulse for a specified duration, or a sequence of full-strength pulses with a specified duration for each.
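To make this concrete, here is a minimal Kotlin sketch of the pre-Oreo API (the function name is illustrative; both calls require the VIBRATE permission and were later deprecated in API level 26):

```kotlin
import android.content.Context
import android.os.Vibrator

// Sketch of the two pre-Oreo vibration commands.
fun legacyVibration(context: Context) {
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    // 1. A single full-strength pulse for a fixed duration, in milliseconds.
    vibrator.vibrate(400)

    // 2. A full-strength on/off pattern: wait 0 ms, buzz 200 ms, pause 100 ms, buzz 300 ms.
    //    The second argument (-1) means the pattern is played once and not repeated.
    //    (Shown back to back only for illustration; each call starts a new vibration.)
    vibrator.vibrate(longArrayOf(0, 200, 100, 300), -1)
}
```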
The release of Android Oreo marks the introduction of the more generic VibrationEffect. While remaining backward compatible with the previously supported functionality, this new capability enables the specification of waveforms. For the first time, Android developers can control both the amplitude depth and the frequency range of the haptics signal sent to the vibration motor. This is a significant improvement over what was possible before, akin to the massive leap forward when desktop computers went from producing a single loud beep to playing any rich audio track.
With smartphone manufacturers investing in better actuators with advanced haptic capabilities, as LG Electronics has done with the LG V30, developers can now create very rich haptic user experiences. Imagine feeling a very faint nudge for every item that passes by as you scroll through a list, or recreating the tactile experience of a mechanically operated camera, as Silicon Valley-based Light has done with its Light L16. Given decent hardware and complete control over the motor, the developer now has a full toolbox for enhancing the user experience with rich haptic feedback.
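As one sketch of the first idea, and assuming the standard RecyclerView and Vibrator APIs (the class name, the 10 ms duration, and the amplitude of 40 are arbitrary illustrative choices), a faint per-item tick could look roughly like this:

```kotlin
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator
import androidx.recyclerview.widget.LinearLayoutManager
import androidx.recyclerview.widget.RecyclerView

// Plays a barely perceptible "tick" whenever a new item scrolls to the top of the list.
class TickOnScrollListener(
    private val vibrator: Vibrator,
    private val layoutManager: LinearLayoutManager
) : RecyclerView.OnScrollListener() {

    private var lastFirstVisible = RecyclerView.NO_POSITION

    override fun onScrolled(recyclerView: RecyclerView, dx: Int, dy: Int) {
        val firstVisible = layoutManager.findFirstVisibleItemPosition()
        if (firstVisible != lastFirstVisible &&
            Build.VERSION.SDK_INT >= Build.VERSION_CODES.O
        ) {
            lastFirstVisible = firstVisible
            // 10 ms at amplitude 40 out of 255: a faint nudge rather than a buzz.
            vibrator.vibrate(VibrationEffect.createOneShot(10, 40))
        }
    }
}
```

It would be attached with recyclerView.addOnScrollListener(...), and on real hardware the duration and amplitude would need tuning so the effect stays subtle.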

The Challenge in Implementing Haptics for OEMs
The adoption of haptics at the platform level creates opportunities for mobile phone makers to expand the use of touch and increase consumer appeal. In this competitive industry, OEMs find themselves having to respond to market pressure to provide more capable haptics hardware. However, more capable motors also place greater responsibility on manufacturers to provide proper drive control, which in turn lets developers extract the full power of the hardware. Whereas in the past OEMs chose cheaper eccentric rotating mass (ERM) motors or linear resonant actuators (LRAs) with minimal thought put into drive control, they now find themselves investing in more expensive X-axis LRAs, piezoelectric actuators, or solenoid resonant actuators (SRAs), which require finer-grained control to get the most out of them.
Additionally, from the Android OEM’s perspective, the ability for developers to specify waveform definitions for haptics places a greater burden on their shoulders: they are ultimately the ones responsible for implementing the new API on their hardware. For haptic feedback to work optimally, the mobile hardware components (actuators and ICs) and the software must work together efficiently and effectively. If a faulty implementation is introduced anywhere in that chain, the resulting haptic experience can be dramatically undermined.
Unlike most of the parts that make up a mobile phone these days, the haptics system is the only one built around a moving piece: the mechanically driven actuator. Consider an LRA motor. Vibration is created by pushing a mass attached to a spring at its resonant frequency, that is, the rate at which the mass completes a full back-and-forth cycle. This is analogous to a parent pushing a child on a swing: the child reaches maximum amplitude when the parent pushes precisely at the moment the child changes direction on the swing. If the parent pushes just a little earlier, while the child is still moving backward, the parent is effectively applying a braking force and stopping the child’s motion. If the parent instead pushes just a little later, after the child has already started moving forward, the push is much less effective and adds far less to the child’s momentum.
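As a rough point of reference, an ideal mass-spring system with mass m and stiffness k resonates at f₀ = (1/2π)·√(k/m), and the LRAs typically found in phones resonate somewhere around 150 to 250 Hz, so each well-timed “push” from the drive signal has to land within a window of only a few milliseconds.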
Similarly, other actuator types have their own intrinsic physical properties that require know-how and experience to fully exploit. Several techniques exist in both hardware and software to create delightful haptic feedback. For example, one can use feed-forward or closed-loop systems to calibrate to the resonant frequency. For amplitude control, OEMs look for ways to minimize ramp-up time so that the maximum amplitude is reached faster, resulting in sharper vibration effects. One way to do so is to overdrive the motor by boosting the voltage for a short period of time. Similar techniques exist to stop the motor more quickly, such as phase or voltage inversion. Applying these techniques properly, the so-called “tuning process,” is essential to achieving the desired result. Tuning for great haptics lies somewhere between art and science, and few people have developed the experience to be great at it. To minimize error and provide the best haptics experience, OEMs often turn to experts in the field and make use of third-party solutions.
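To make overdrive and active braking slightly more concrete, here is a purely conceptual sketch; this is not a platform API (this level of voltage control lives in the OEM’s haptics driver or IC), and the boost factor and durations are arbitrary illustrative values:

```kotlin
// Conceptual only: an idealized LRA drive envelope with a short overdrive "kick"
// at the start and a reverse-polarity braking pulse at the end.
data class DriveSegment(val durationMs: Long, val voltage: Double)

fun driveEnvelope(
    ratedVoltage: Double,   // steady-state level the actuator is rated for
    effectMs: Long,         // how long the sustained vibration should last
    overdriveMs: Long = 5,  // brief boost to shorten ramp-up time
    brakeMs: Long = 5       // brief inverted pulse to stop the mass quickly
): List<DriveSegment> = listOf(
    DriveSegment(overdriveMs, ratedVoltage * 1.8), // overdrive: reach full amplitude sooner
    DriveSegment(effectMs, ratedVoltage),          // sustain at the rated level
    DriveSegment(brakeMs, -ratedVoltage)           // active braking via polarity inversion
)
```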

Android’s API for creating a waveform signal for vibration.
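Below is a minimal Kotlin sketch of that API; the segment timings and amplitudes are arbitrary illustrative values:

```kotlin
import android.content.Context
import android.os.Build
import android.os.VibrationEffect
import android.os.Vibrator

// Plays a custom waveform on Android O (API 26) and above: each segment pairs a
// duration in milliseconds with an amplitude from 0 (off) to 255 (maximum).
fun playCustomWaveform(context: Context) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.O) return

    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    if (!vibrator.hasAmplitudeControl()) return // this sketch assumes amplitude control is available

    val timings = longArrayOf(0, 100, 50, 200, 50, 300) // duration of each segment, in ms
    val amplitudes = intArrayOf(0, 80, 0, 160, 0, 255)  // strength of each segment
    vibrator.vibrate(VibrationEffect.createWaveform(timings, amplitudes, -1)) // -1 = do not repeat
}
```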
The Challenge in Integrating Haptics for App Developers
The first difficulty application developers face is knowing how and when to make use of haptics in the first place. One can start by referring to published guidelines on the topic until one gains more experience and becomes comfortable with what the feature can provide. The bigger challenge for app developers is actually creating sophisticated haptic effects with ease. Even though developers now have the ability to play waveform effects, defining them is another matter altogether. Imagine having to create a complex sound effect, say a long explosion, by specifying raw PCM samples! No one in their right mind would attempt to do this, yet that is what is being asked of application designers who want to create custom haptic effects. Luckily, there are third-party tools available on the market to facilitate this task. One can only hope that, as haptics becomes mainstream, additional tools and support will be offered to developers and designers.
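To see why, consider what even a simple effect looks like when expressed directly as timing and amplitude pairs. The sketch below (assuming API level 26 or higher; the 20 ms slice size and the decay constant are arbitrary choices) approximates a one-second decaying rumble, and anything more expressive quickly becomes unmanageable by hand:

```kotlin
import android.os.VibrationEffect
import kotlin.math.exp

// Approximates a one-second decaying "rumble" by slicing it into 20 ms segments
// and assigning each segment an exponentially decaying amplitude (API 26+).
fun decayingRumble(totalMs: Long = 1000, sliceMs: Long = 20): VibrationEffect {
    val slices = (totalMs / sliceMs).toInt()
    val timings = LongArray(slices) { sliceMs }
    val amplitudes = IntArray(slices) { i ->
        val t = i.toDouble() / slices
        (255 * exp(-3.0 * t)).toInt().coerceIn(1, 255)
    }
    return VibrationEffect.createWaveform(timings, amplitudes, -1) // -1 = play once
}
```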
Summary
Haptic feedback used to be nothing more than a buzz you felt on your mobile phone when getting a call. Both Android and iOS have since integrated rich haptics features into their systems and given third-party developers the ability to do the same within their own apps. Android raised the bar further with the Oreo release, allowing developers to control both the amplitude depth and the frequency range of the haptics signal. OEMs are now racing to integrate higher-resolution actuators into their phones to meet the demands of both end users and developers. Only those OEMs who invest appropriately in their hardware, and who are willing to spend the time needed to properly drive and control it, will succeed in differentiating themselves and their devices. Meanwhile, developers who want to bring haptics into their apps quickly should look for off-the-shelf solutions that make it easy to create, manage, and play back custom-designed effects. All of this matters because, at the end of the day, the impact of touch is profound. As the American poet Diane Ackerman once said: “Touch seems to be as essential as sunlight.”